[T-SQL] Help: Need to run a query against multiple SQL Server instances across SQL Server versions (2000, 2005, 2008, 2008 R2)
- Help: Need to run a query against multiple SQL Server instances across SQL Server versions (2000, 2005, 2008, 2008 R2)
- Return CSV string from a table
- Merge statement
- Need a better performance query
- Need help with a query
- Problem with SqlDataReader
- FullName
- How to group by the past seven days
- Problem with EXCEPT statement
- Need urgent help on a calculation
Posted: 13 Aug 2013 10:07 PM PDT

Hi, I have a requirement to collect a report about SQL Server configuration. The report should have columns such as SQL Server instance name, server collation, edition, location of tempdb, SQL Server version, number of databases on the instance, etc. I need to generate the report by querying 300 SQL Server instances, which run various SQL Server versions (2000, 2005, 2008, 2008 R2) on different Windows boxes. I need help with the best possible approach and a script to generate the report.
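One possible shape for the per-instance query, as a sketch: run it against each instance (for example via a Central Management Server, registered servers, or an sqlcmd loop). Note that while SERVERPROPERTY exists back to SQL 2000, the sys.* catalog views below require 2005 or later, so the 2000 instances would need the older system tables instead.

[code="SQL"]
-- Per-instance configuration report (sketch, 2005+; adapt for 2000).
SELECT
    SERVERPROPERTY('ServerName')         AS InstanceName,
    SERVERPROPERTY('Collation')          AS ServerCollation,
    SERVERPROPERTY('Edition')            AS Edition,
    SERVERPROPERTY('ProductVersion')     AS ProductVersion,
    (SELECT COUNT(*) FROM sys.databases) AS DatabaseCount;

-- tempdb file locations (2005+; on SQL 2000 query master..sysaltfiles)
SELECT name, physical_name
FROM sys.master_files
WHERE database_id = DB_ID('tempdb');
[/code]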
Posted: 13 Aug 2013 06:53 PM PDT

Hi, I am looking for a function/procedure which takes a table name as a parameter and returns each row of the table as a string, with ";" as the column separator and "\n" as the row delimiter (like a CSV file: one output value per table row). Something like:

Function getCsvFormatFromTable (@TableName)
RETURNS @Csv_Value TABLE (Value NVARCHAR(MAX))

It could also write into a #Table etc.

Thanks
Regards ;-)
Nicole
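A sketch of one approach: because the table name is only known at run time, this needs dynamic SQL, which functions cannot execute, so a stored procedure is used instead. The procedure name is an assumption, and the raw concatenation of @TableName is kept simple here (a real version should validate it or use QUOTENAME with schema handling).

[code="SQL"]
CREATE PROCEDURE dbo.GetCsvFromTable
    @TableName SYSNAME
AS
BEGIN
    DECLARE @cols NVARCHAR(MAX), @sql NVARCHAR(MAX);

    -- Build "ISNULL(CAST(col1 ...),'') + ';' + ISNULL(CAST(col2 ...),'')"
    -- from the catalog; STUFF strips the leading " + ';' + " (9 chars).
    SELECT @cols = STUFF((
        SELECT ' + '';'' + ISNULL(CAST(' + QUOTENAME(c.name)
               + ' AS NVARCHAR(MAX)), '''')'
        FROM sys.columns c
        WHERE c.[object_id] = OBJECT_ID(@TableName)
        ORDER BY c.column_id
        FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 9, '');

    SET @sql = N'SELECT ' + @cols + N' AS Value FROM ' + @TableName + N';';
    EXEC sp_executesql @sql;
END
[/code]

Each result row is then one semicolon-separated string; the "\n" row delimiter falls out naturally if the result set is written to a file one row per line.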
Posted: 13 Aug 2013 08:53 PM PDT

Have a nice lovely MERGE statement updating the target table from the source. We capture the updates with WHEN MATCHED, and we capture the inserts with WHEN NOT MATCHED BY TARGET. It's the pesky deletes. We may wish not to apply the deletes (just don't ask! However, we may change our mind again), but we would still like to capture the number of potential deletes. Is this possible within the MERGE? I don't think it is. When we planned to do the deletions, we used OUTPUT to capture the number of rows updated, deleted, inserted, etc.

Many thanks
E
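For what it's worth, MERGE's OUTPUT clause only reports rows the statement actually touches, so if the WHEN NOT MATCHED BY SOURCE clause is omitted, the would-be deletes never appear. One workaround is a separate count of target rows with no source match, run alongside the MERGE. The table and key names below are placeholders, not the poster's actual schema:

[code="SQL"]
-- Count rows the MERGE *would* delete if a WHEN NOT MATCHED BY SOURCE
-- THEN DELETE clause were present, without deleting anything.
SELECT COUNT(*) AS PotentialDeletes
FROM dbo.Target t
WHERE NOT EXISTS (SELECT 1 FROM dbo.Source s WHERE s.KeyCol = t.KeyCol);
[/code]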
Need a better performance query Posted: 13 Aug 2013 08:32 PM PDT

hi pros, hope everything goes well,

[code="SQL"]
SET STATISTICS IO ON
SET STATISTICS TIME ON

DECLARE @Tbl AS TABLE
(
    id INT IDENTITY(1, 1),
    Mainid INT,
    Subid INT,
    NAME VARCHAR(100)
)

INSERT INTO @Tbl (Mainid, Subid, NAME)
SELECT c.[object_id], c.column_id, c.name
FROM sys.[columns] c
ORDER BY c.[object_id], c.column_id

DECLARE @id INT = 10

SELECT *
FROM @Tbl t
WHERE EXISTS
(
    SELECT 1
    FROM @Tbl t2
    WHERE t.id < t2.id
      AND t.Mainid = t2.Mainid
      AND EXISTS
      (
          SELECT 1
          FROM @Tbl t3
          WHERE t3.id = @id
            AND t2.Mainid = t3.Mainid
      )
)

SELECT *
FROM @Tbl t
WHERE t.id NOT IN
(
    SELECT MAX(t2.id)
    FROM @Tbl t2
    INNER JOIN @Tbl t3 ON t3.Mainid = t2.Mainid
    WHERE t.Mainid = t2.Mainid
      AND t3.id = @id
)
[/code]

well, my requirement is: when I give an id of a record, I want to get all records of that record's Mainid group except the last record of that group. Let's say:

[code]
id  Mid  Sid  NAME
1   3    1    rsid
2   3    2    rscolid
3   3    3    hbcolid
4   3    4    rcmodified
[/code]

when I give 1 or 2 or 3 or 4, the expected result would be

[code]
id  Mid  Sid  NAME
1   3    1    rsid
2   3    2    rscolid
3   3    3    hbcolid
[/code]

which one is the best way? when I use the first query the scan count is small but the time is a little higher than the second query; when I use the second query the scan count is large but the time is small. if you have any other idea or query, please suggest the right way to me
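One further idea, as a sketch: compute each group's last id once per row with a windowed MAX (available from SQL 2005), then filter. This touches the table fewer times than either original query. It assumes the same @Tbl table variable and @id variable from the script above:

[code="SQL"]
-- All rows of @id's Mainid group except the group's last row,
-- using MAX() OVER instead of correlated subqueries.
;WITH g AS
(
    SELECT t.id, t.Mainid, t.Subid, t.NAME,
           MAX(t.id) OVER (PARTITION BY t.Mainid) AS LastId
    FROM @Tbl t
)
SELECT g.id, g.Mainid, g.Subid, g.NAME
FROM g
WHERE g.Mainid = (SELECT t3.Mainid FROM @Tbl t3 WHERE t3.id = @id)
  AND g.id < g.LastId;
[/code]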
Posted: 13 Aug 2013 04:34 AM PDT

Hi all, I need some help here. I posted in the wrong forum and I appreciated the help received, but I forgot the db was 2008 R2. So here goes the issue.

[code="SQL"]
DECLARE @Testing AS TABLE
(
    AgentID INT,
    ExcepCodeDetailName VARCHAR(10),
    Detail_Start_Time DATETIME,
    Detail_End_Time DATETIME
)

INSERT INTO @Testing (AgentID, ExcepCodeDetailName, Detail_Start_Time, Detail_End_Time)
VALUES (12345, 'Break', '2013-08-06 09:00:00', '2013-08-06 09:10:00')

INSERT INTO @Testing (AgentID, ExcepCodeDetailName, Detail_Start_Time, Detail_End_Time)
VALUES (12345, 'Lunch', '2013-08-06 13:00:00', '2013-08-06 14:00:00')

SELECT * FROM @Testing
[/code]

Results:
[code]
12345  Break  2013-08-06 09:00:00.000  2013-08-06 09:10:00.000
12345  Lunch  2013-08-06 13:00:00.000  2013-08-06 14:00:00.000
[/code]

Results needed:
[code]
12345  Break  2013-08-06 09:00:00.000  2013-08-06 09:10:00.000
12345  Open   2013-08-06 09:10:00.000  2013-08-06 13:00:00.000
12345  Lunch  2013-08-06 13:00:00.000  2013-08-06 14:00:00.000
[/code]
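A 2008 R2-compatible sketch (no LEAD/LAG, which arrived in 2012): number the rows per agent, join each row to the next one, and emit an 'Open' row wherever one end time and the next start time leave a gap. It assumes the @Testing table variable declared above:

[code="SQL"]
;WITH ordered AS
(
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY AgentID
                              ORDER BY Detail_Start_Time) AS rn
    FROM @Testing
)
SELECT AgentID, ExcepCodeDetailName, Detail_Start_Time, Detail_End_Time
FROM @Testing
UNION ALL
SELECT cur.AgentID, 'Open', cur.Detail_End_Time, nxt.Detail_Start_Time
FROM ordered cur
JOIN ordered nxt
  ON nxt.AgentID = cur.AgentID
 AND nxt.rn = cur.rn + 1
WHERE nxt.Detail_Start_Time > cur.Detail_End_Time  -- only real gaps
ORDER BY AgentID, Detail_Start_Time;
[/code]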
Posted: 13 Aug 2013 06:57 AM PDT

hi all, I have this rather generic function that executes a stored proc with a bunch of parameters:

[code="plain"]
protected ArrayList ExecSProcReturnAList(string storedProcName, SqlParameter[] parameters)
{
    ArrayList result = new ArrayList();
    using (SqlConnection connection = new SqlConnection(ConnectionString))
    {
        try
        {
            connection.Open();
            using (SqlCommand command = new SqlCommand(storedProcName, connection))
            {
                command.CommandTimeout = 200;
                command.CommandType = System.Data.CommandType.StoredProcedure;

                // add parameters
                foreach (SqlParameter sqlpar in parameters)
                {
                    command.Parameters.Add(sqlpar);
                }

                SqlDataReader sqldr;
                sqldr = command.ExecuteReader();
                while (sqldr.NextResult() == true)
                {
                    ArrayList intermediaryResult = new ArrayList();
                    while (sqldr.Read())
                    {
                        ArrayList sqldrRow = new ArrayList();
                        for (int i = 0; i < sqldr.FieldCount; i++)
                        {
                            //Console.Write(sqldr[i].ToString());
                            sqldrRow.Add(sqldr[i]);
                        }
                        intermediaryResult.Add(sqldrRow);
                    }
                    result.Add(intermediaryResult);
                }
                return result;
            }
        }
        finally
        {
            if (connection.State == System.Data.ConnectionState.Open)
            {
                connection.Dispose();
            }
        }
    }
}
[/code]

Every time I run this from top to bottom, the result array list is empty. When I put a breakpoint just at the while line, I can see the SqlDataReader (sqldr) does have the expected data in it; however, it fails at the condition. Somehow sqldr.NextResult() == true is false and it jumps out of the loop. Can anybody tell me why?

Thanks,
kowalsky
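The likely culprit: SqlDataReader.NextResult() advances past the current result set to the *next* one, so when the proc returns a single result set, the loop body never runs. A do/while that processes the first result set before advancing fixes it (a sketch of just the loop, keeping the variable names above):

[code="plain"]
// Process the current (first) result set before calling NextResult(),
// which always moves on to the next result set.
do
{
    ArrayList intermediaryResult = new ArrayList();
    while (sqldr.Read())
    {
        ArrayList sqldrRow = new ArrayList();
        for (int i = 0; i < sqldr.FieldCount; i++)
        {
            sqldrRow.Add(sqldr[i]);
        }
        intermediaryResult.Add(sqldrRow);
    }
    result.Add(intermediaryResult);
} while (sqldr.NextResult());
[/code]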
Posted: 13 Aug 2013 02:41 AM PDT

How can I identify with T-SQL whether the first word of a full name is "foundation" or "bank"?
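Assuming the goal is to test the first word of a name column, one sketch (the table and column names dbo.SomeTable / FullName are placeholders):

[code="SQL"]
-- Extract everything up to the first space (a trailing space is
-- appended so single-word names work too), then compare.
SELECT FullName,
       CASE
           WHEN LEFT(LTRIM(FullName) + ' ',
                     CHARINDEX(' ', LTRIM(FullName) + ' ') - 1)
                IN ('foundation', 'bank')
           THEN 1 ELSE 0
       END AS StartsWithFoundationOrBank
FROM dbo.SomeTable;
[/code]

Whether the comparison is case-sensitive depends on the column's collation.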
How to group by the past seven days Posted: 13 Aug 2013 01:38 AM PDT

I am familiar with how to group my query results by year, month, week, etc. But what is the best way to group my results by the past 7 days for a given timeframe? So my results would be grouped by seven days back from today (8/6/2013), then seven days back from that (7/30/2013), etc. How do I accomplish this? Please advise.
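One sketch: bucket rows into 7-day windows counted back from an anchor date, using integer division of the day difference. The table and column names (dbo.Orders, OrderDate) are placeholders:

[code="SQL"]
DECLARE @Anchor DATE = '2013-08-06';

-- WeeksBack 0 = the 7 days ending at the anchor, 1 = the 7 before that, ...
SELECT
    DATEDIFF(DAY, OrderDate, @Anchor) / 7 AS WeeksBack,
    DATEADD(DAY, -7 * (DATEDIFF(DAY, OrderDate, @Anchor) / 7),
            @Anchor)                      AS PeriodEnd,
    COUNT(*)                              AS RowsInPeriod
FROM dbo.Orders
WHERE OrderDate <= @Anchor
GROUP BY DATEDIFF(DAY, OrderDate, @Anchor) / 7
ORDER BY WeeksBack;
[/code]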
Posted: 13 Aug 2013 03:12 AM PDT

hi, I've a query that's used to build a delta set between two tables, and for some reason I'm getting what look like false positives over a particular column. Table A's column stores a varchar value, where Table B's column stores an nvarchar value. I've performed a cast on Table A's varchar value to promote it to nvarchar of the same length as in Table B, and I've checked the content length (i.e. the LEN function) on both columns in both tables to ensure the data is the same length. I'm even doing an LTRIM(RTRIM()) on both columns as well. Still getting a row back indicating the two values are different. The source data is coming from a database in the UK, and the values sometimes contain non-alphanumeric characters. I'm probably wrong, but I'm wondering if some of these characters have different Unicode values? Like, even though it looks like a duck, it doesn't necessarily mean it's a duck. Any thoughts on this? Many thanks in advance. ben
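One way to check the duck: dump the Unicode code point at each character position of a suspect pair and look for mismatches. The sketch below uses two variables standing in for the two column values; characters that look identical on screen (NO-BREAK SPACE vs SPACE, EN DASH vs HYPHEN-MINUS, curly vs straight apostrophes) will show different code points here.

[code="SQL"]
DECLARE @a NVARCHAR(100) = N'...';  -- value from Table A (after cast)
DECLARE @b NVARCHAR(100) = N'...';  -- "matching" value from Table B

-- Positions where the code points differ (first 100 characters).
SELECT n.pos,
       UNICODE(SUBSTRING(@a, n.pos, 1)) AS CodeA,
       UNICODE(SUBSTRING(@b, n.pos, 1)) AS CodeB
FROM (SELECT TOP (100)
             ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS pos
      FROM sys.all_objects) AS n
WHERE n.pos <= LEN(@a)
  AND UNICODE(SUBSTRING(@a, n.pos, 1))
      <> UNICODE(SUBSTRING(@b, n.pos, 1));
[/code]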
Need urgent help on a calculation Posted: 13 Aug 2013 04:47 AM PDT

Dear friends, I have to do a calculation for the field Duration using the formula below:

(Temporary Assignment End Date field - Target Start Date field) / 30

and I need to format the result of the formula as numeric, displayed to two decimal places. The database has the two fields in the following types and formats:

Req#
Target Start Date (DATETIME), example: 2012-08-01 00:00:00.000
Temporary Assignment End Date (NVARCHAR(255)), example: 09-01-2014

Need urgent help to solve this using SQL please, since I'm new to SQL.

Kind regards
DJ
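A sketch, assuming the NVARCHAR date is reliably in MM-DD-YYYY form (CONVERT style 110) and using placeholder table/column names (the real names will differ):

[code="SQL"]
-- Dividing by 30.0 (not 30) keeps the result decimal;
-- CAST to DECIMAL(10,2) gives the two decimal places.
SELECT
    CAST(DATEDIFF(DAY,
                  TargetStartDate,
                  CONVERT(DATETIME, TempAssignmentEndDate, 110)) / 30.0
         AS DECIMAL(10, 2)) AS Duration
FROM dbo.Requisitions;
[/code]

Storing dates in an NVARCHAR column is the root problem here; any row that deviates from MM-DD-YYYY will make the CONVERT fail.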
You are subscribed to email updates from SQLServerCentral / SQL Server 2008 / T-SQL (SS2K8). To stop receiving these emails, you may unsubscribe now. Email delivery powered by Google.