Friday, June 14, 2013

[SQL Server 2008 issues] Statistics DDL

[SQL Server 2008 issues] Statistics DDL


Statistics DDL

Posted: 13 Jun 2013 07:23 PM PDT

Hi Team,i've a table with statistics, how to view the DDL for the current statistics in Sql server 2008.Please...

SQL Server Full-Text Search: combining proximity term and thesaurus

Posted: 13 Jun 2013 07:11 PM PDT

[b]SCENARIO[/b]I am developing a full-text search over a products table.The full-text indexed field of th table contains a brief description of the product, the name of the category and the product code.I succeed in querying the table using the CONTAINS predicate with NEAR terms:[code="sql"]/* @mySearchTerm is a generic proximity term such as '"hdd*"~"sata*""' */SELECT myFieldsFROM SEARCHABLE_PRODUCTSWHERE CONTAINS(myIndexedField, @mySearchTerm)[/code]Then I try to enhance the search, leveraging on the thesaurus feature in order to enable synonyms search. E.G. 'hard disk', 'hard disk drive', 'hdd'. So I have configured the thesaurus xml file, and I succeed in querying the table using the FORMSOF term.[b]ISSUE[/b]I do not succeed in combining the NEAR and the FORMSOF terms in the CONTAINS predicate, such as:[code="sql"]/* @mySearchTerm is a generic proximity term such as '"hdd*"~"sata*""' */SELECT myFieldsFROM SEARCHABLE_PRODUCTSWHERE CONTAINS(myIndexedField, ' FORMSOF (THESAURUS, @mySearchTerm) ')[/code]A syntax error is returned. As to the [url=http://msdn.microsoft.com/en-us/library/ms187787.aspx]MSDN reference[/url], it seems that it is not possible. Is there any solution or workaround?

Queries that cause full scan table

Posted: 13 Jun 2013 05:35 PM PDT

Hi everybody , I would like to retrieve queries that cause full scan table in sql server.Someone can help me please?? :ermm:

Database mail sending fails with SP_Send_DBmail.

Posted: 03 Jun 2013 05:27 PM PDT

Hi,I am stuck in below problem.I have two servers; ServerA and ServerB. I have linked serverA to serverB bysp_addlinkedserver @server = 'serverB',@provider = 'SQLNCLI',@datasrc = 'ServerB', @srvproduct = '' on serverA. Now I am querying on ServerA and retriveing data from ServerB's Table.The Query is : EXEC msdb.dbo.sp_send_dbmail @recipients = 'myname@myDomain.com',@subject = 'MySubject',@body = 'This Is Test.',@body_format = 'Text', @profile_name = 'Profile1',@query = 'set nocount onSELECT * FROM [ServerB].[Database].dbo.myTable AS MTWHERE DATEADD(DAY,0,DATEDIFF(DAY,0,Date_Col)) = DATEADD(DAY,0,DATEDIFF(DAY,0,GETDATE()))',@execute_query_database = 'master',@query_result_separator = ' ',@attach_query_result_as_file = 1,@query_attachment_filename = 'myFile.csv'I have created an proc on serverA's master database and calling this proc into a job.This job throws an error:" Executed as user: ServerA\sqlAgent. Login failed. The login is from an untrusted domain and cannot be used with Windows authentication. [SQLSTATE 28000] (Error 18452). The step failed. "Can Anybody help in solving this??It would be a great help.Thanks and Regards,Jignesh

creating table within procedure and bulk inserting

Posted: 13 Jun 2013 09:29 AM PDT

Hi ProfessionalsI am creating a procedure in SQL to create a table which i then bulk insert the spreadsheet into the table, the problem I have is that the first 3 columns will always be the same. Software_manufacturer,Product_name and Product_version but the 4th column onwards is unknown. In other words there could be 4 columns or there could be 10 columns. It does not matter what these are called but I need to capture the rest of the columns relating to the spreadsheet that is going to be imported.Is there a way round this, I am writing this in PHP SQL and HTML with a bit of javascripthere is my code[code]USE [TestData]GO/****** Object: StoredProcedure [dbo].[importspreadsheet] Script Date: 06/13/2013 13:38:00 ******/SET ANSI_NULLS ONGOSET QUOTED_IDENTIFIER ONGOALTER procedure [dbo].[importspreadsheet] @importedfile nvarchar(50), @table_name nvarchar(100) ASBEGIN IF EXISTS ( SELECT * FROM sys.tables JOIN sys.schemas ON sys.tables.schema_id = sys.schemas.schema_id WHERE sys.schemas.name = 'dbo' AND sys.tables.name = 'newtable' ) declare @sql nvarchar(1000),@sqldrop nvarchar(1000)set @sqldrop = 'DROP TABLE dbo.' + quotename(@table_name); set @sql = 'create table dbo.' + quotename(@table_name, '[') + '(software_manufacturer nvarchar(max) null,product_name nvarchar(max) null,product_version nvarchar(max) null,col4 nvarchar(max) null);';exec (@sqldrop)exec (@sql)print (@sql)DECLARE @cmd nvarchar(max)SET @cmd = 'BULK INSERT newtableFROM ''C:\inetpub\wwwroot\uploads\'+ @importedfile +''' WITH ( FIRSTROW = 4,FIELDTERMINATOR = '','',ROWTERMINATOR = '''')'--PRINT @cmd EXEC(@cmd)END[/code]

building up dynamic column creation within procedure

Posted: 13 Jun 2013 11:32 AM PDT

Hi ProfessionalsI have a procedure that passes in 1:CSV file, 2:Table_name 3:Column_countThe column_count can vary and this is where I am trying to build the mid part of my create table statement within the procedure.I am iterating through the records and trying to build however many mid columns there needs to be in the create table statement and cannot figure it out. I am almost there with my code.any ideas professionals, any help would be greatly appreciatedhere is my code with output underneath[code]USE [TestData]GOSET ANSI_NULLS ONGOSET QUOTED_IDENTIFIER ONGOALTER procedure [dbo].[importspreadsheet] /*Passed in variables from PHP script*/ @importedfile nvarchar(50), /* The csv filename EG test.csv */ @table_name nvarchar(100), /* The tabke name to be created */ @column_count nvarchar(50) /* The total number of columns from spreadsheet */ASBEGIN declare @sql nvarchar(1000), /* The first part of the query build */ @fullquery nvarchar(1000), /* The full joined query */ @sqldrop nvarchar(1000), /* The drop table command */ @intflag int = 0, /* The loop counter for looping through total number of columns minus 1 */ @midcolumns nvarchar(50) = 'colname', /* The middle columns default name which will have intflag variable appended */ @lastcolumn nvarchar(50); /* The last column to close the Create Table query */--set @sqldrop = 'DROP TABLE dbo.' + quotename(@table_name); /*Building first columns of the create table statement here which are always going to be there*/set @sql = 'create table dbo.' + quotename(@table_name, '[') + '(software_manufacturer nvarchar(max) null,product_name nvarchar(max) null,product_version nvarchar(max) null,';/*Loop through and create the middle columns based on the column_count variable, Make sure to do LESS THAN column_count and NOT LESS THAN OR EQUAL TO so we can close final column outside of loop*/while (@intflag < @column_count) begin --print @intflag set @intflag = @intflag +1; set @midcolumns = 'colname' + quotename(@intflag + 1, '[') + ' nvarchar(max) null,'; print @midcolumns; end/*Create last column in the create table statement*/set @lastcolumn = 'lastcolumn nvarchar(max) null);';/*Join the full create statements together to create dynamic table*/SET @fullquery = (@SQL + @midcolumns + @lastcolumn)print (@fullquery)--exec (@sqldrop)--exec (@sql)END[/code]colname[2] nvarchar(max) null,colname[3] nvarchar(max) null,colname[4] nvarchar(max) null,colname[5] nvarchar(max) null,colname[6] nvarchar(max) null,colname[7] nvarchar(max) null,colname[8] nvarchar(max) null,colname[9] nvarchar(max) null,colname[10] nvarchar(max) null,colname[11] nvarchar(max) null,colname[12] nvarchar(max) null,colname[13] nvarchar(max) null,colname[14] nvarchar(max) null,colname[15] nvarchar(max) null,colname[16] nvarchar(max) null,colname[17] nvarchar(max) null,colname[18] nvarchar(max) null,colname[19] nvarchar(max) null,colname[20] nvarchar(max) null,colname[21] nvarchar(max) null,colname[22] nvarchar(max) null,colname[23] nvarchar(max) null,create table dbo.[newtable](software_manufacturer nvarchar(max) null,product_name nvarchar(max) null,product_version nvarchar(max) null,colname[23] nvarchar(max) null,lastcolumn nvarchar(max) null);

suspect linked server problem ?

Posted: 13 Jun 2013 09:41 AM PDT

Hi I am running the follwoing 2 queries:-- query 1 takes > 20 secondsUPDATE [SERVER-P02].A.dbo.B_TABLEASET Comment = Cast(Cast(237 As Varchar(10))As Ntext)WHERE UserID = '6FE30BDA-A38B-4D0D-B10F-D11C0D324E7883'-- query 2 works ok under 1 secondUPDATE [SERVER-P02].A.dbo.B_TABLEBSET LastActivityDate = '13-jun-2013'WHERE UserID = '6FE30BDA-A38B-4D0D-B10F-D11C0D324E7883'I have run both queries actually on SERVER02-P02 and they both return the same execution plan , that is an INDEX SEEK (NON CLUSTERED) AT 25% cost and a CLUSTERED INDEX UPDATE AT 75% cost.When i run the queries from SERVER4 using the linked server i get 2 totally different execution plan. Query 1 does a massive REMOTE SCAN costing 100% and scans 353008 rows whereas Query 2's execution plan just says" REMOTE Query" and ACTUAL ROWS is 0 on the execution plan.Can anybody tell me how i get query 1 to stop doing the REMOTE SCAN - is this a linked server issue ? The index structure and primary key structure is the same on both tables too.

sql statement containing loop

Posted: 13 Jun 2013 07:16 AM PDT

Trying to accomplish something like the following:Select(SELECT Begin the loop:LTRIM(RTRIM(RIGHT(coalesce(fielda,'') + coalesce(fieldb,'') + coalesce(fieldc,'')),LEN(coalesce(fielda,'') + coalesce(fieldb,'') + coalesce(fieldc,'')),-CHARINDEX('-',(coalesce(fielda,'') + coalesce(fieldb,'') + coalesce(fieldc,''))))))when result = '1a' Then 145when result = '1b' Then 146when result = '2a' Then 147when result = '2b' Then 148End the loopFROM tablename2) as Col1,Col2,Col3From tablename

Few tricky Questions about SQL Server 2008 that I came across...

Posted: 10 Jan 2012 06:31 AM PST

Few Questions about SQL Server 2008 that I came across. [b]Q: Which of the following allows for the truncation of a transaction log?[/b]A. Transaction logs are truncated when you use SQL Profiler.B. Transaction logs are truncated when you use the following syntax: truncate table AC. Transaction logs are truncated when you backup your database.D. You can not backup transaction log.[b]Q: Which of the following statements about unique Constraints are TRUE?[/b]A. You can have only one unique constraint applied to a table.B. Unique constraints can be used to establish a link between two tables.C. You can only have one column assigned to a unique constraint.D. If a primary key was already assigned to a table, the unique constraint would identify a special column used for indexing, [b]Q: Which of the following is an improved performance feature that uses bitmap filters for selecting minimal rows with queries?[/b]A. Table partitionsB. Star JoinC. Where clauseD. Outer Join[b]Q: Using a join in your select statement instead of a where clause is more efficient because the join pulls less data into your working memory.[/b]A. TrueB. False[b]Q: Full text indexes can be backed up and restored with the rest of the database due to the ...[/b]A. ...new index organization systemB. ...overall size of the index getting smaller.C. ...index becoming a part of the transaction logs.D. ...indexes no longer being stored outside of the database.[b]Q. Which of the following statements, if used for creating a new login id, will result in an error?[/b]A. CREATE LOGIN [HumanResources/JohnSmith] FROM WINDOWSB. CREATE LOGIN AndrewWalters WITH PASSWORD '!^%&7', DEFAULT DATABASE = HRC. CREATE LOGIN John WITH PASSWORD '*&!@*&'D. CREATE LOGIN [HumanResources/Jim] FROM WINDOWS, DEFAULT DATABASE=CRM[b]Q. Which of the following statements can be used to provide sorted result set based in the CustomerName column?[/b]A. SELECT ROW_NUMBER() AS RN, CustomerName from Customers order by CustomerNameB. SELECT ROW_NUMBER() OVER (ORDER BY CustomerName) AS RN, CustomerName FROM CustomersC. SELECT ROW_NUMBER() (ORDER BY CustomerName) AS RN, from CustomersD. SELECT ROW_NUMBER() AS RN, CustomerName from Customers

What Queries/SPs Are Running Right Now ?

Posted: 13 Jun 2013 12:47 AM PDT

Is there a way to identify actively running queries or stored procedures ? If I am in Activity Monitor, and click "Details" on a spid, sometimes I can see what an application or user is running.Is there a way to dynamically capture the same information in one place for all connections so I can see all processes hitting SQL ?EDIT: [b]Running SQL 2005[/b], so maybe I posted in the wrong forum.

Dynamically choose source and destination

Posted: 13 Jun 2013 06:45 AM PDT

Hey Gurus, I have just started working with SSIS and have following task at hand and I feel lost. Can any of you please help me?Currently, I have a 2 different data flow tasks that do the same thing: Accept data from a SQL Store proc, put it to flat file destination and archive it. One of them is stored procedure giving us revenue data and the other gives us accounts data. Now, I need to combine both of these different data flows as a single data flow using parameters / variables for picking up appropriate stored procedure based on whether user wants to see account data or revenue data and then, out it in the correct file and file location (Files and file locations are different for revenue and accounts data).I do understand I should use a for-each loop maybe. However, I really need assistance in getting started with how to set up variables (system / user defined) and how can I go about it. I have attached a sample data flow diagram of how I need it. Can you please pass me some tips / ideas. Please note that am a novice is SSIS and just started. Thanks in advance!

Client Statistics: Total exectuon time

Posted: 13 Jun 2013 04:52 AM PDT

Hi allSo, I've got this batch (I hope it is correct to call it a batch, this is just a few variables declarations, queries to assign values to these variables and then just a select statement that returns a data set) open i SSMS that takes 8 sec to execute (this value is taken from the yellow bar under the query window). Now if I turn on "Include Actual Execution Plan" and "Include Client Statistics" and execute the Client Statistics tab pops up and the "Total execution time" has value of 161. Now I always thought this was the time it took the server to execute the hole batch in milliseconds. I don't know if this is relevant or not but the Client processing time is 1 and Wait time on server replies is 161. The data set returned has 210 rows and and 8 columns, so it ain't big.Now, if I the execute the same batch with "Include Actual Execution Plan" turned off the "Total execution time" becomes 7844 which is a value I would have expected in the first place.Can anyone explain what is going on here?I have always used the "Total execution time" when I'm comparing different queries, is this a bad practice, should I be using something else, if then what?I'm running SQL Server 2008 R2 on a Windows Server 2008 R2 Datacenter.Thx in advance

Verify Database was restored

Posted: 13 Jun 2013 05:04 AM PDT

HelloI need to verify that a database was restored from a particular backup.I used RESTORE HEADERONLY on the BAK an can see FirstLSN, LastLSN. Is there a way to match the LSN to the database that was already restored?ThanksDave

Get Notification when a new column added to flat file

Posted: 13 Jun 2013 05:45 AM PDT

I am working on sql server 2005. We have some SSIS packages which loads data from multiple flat files. Flat files comes from other group. Sometimes they just add a new column to the file(remove a column also) and process without letting us know. I just wonder is there any script which catch the changes at source end wheteher its a flat file or a sql table and let us know the change. Appriciate any feedback.

Crosstab query for date

Posted: 13 Jun 2013 05:53 AM PDT

I am learning crosstab query and create a sample. It works fine.Since my real project needs date range is mydate >= (getdate() -15) which will make mydate are different everyday.How to modify code to meet this condition? That is, column's name for mydate are change everyday. create table #TestPivot( MyValue int, MyDate datetime)insert into #TestPivot values(20,'11/11/2012');insert into #TestPivot values(30,'11/12/2012');select * from(SELECT [MyValue] ,[MyDate] FROM #TestPivot) as sourceTablePivot( sum([MyValue]) for [MyDate] in ([2012-11-11],[2012-11-12])) as PivotTableresult2012-11-11 ----- 2012-11-1220---------------30

Get 2 column result from 1 field

Posted: 13 Jun 2013 04:50 AM PDT

Hi geniusesLet say I got a table:[code="sql"]Years | Value2000 | 22001 | 42002 | 52003 | 32004 | 2[/code]I need to get the sum of the Value for all the years in 1 column and the sum(Value) to 2002 in another column.in order to give:[code="sql"]Sumof.All | Sumto.200216 | 11[/code]Thanks in advance!

Phase of Recovery (Log Shipping)

Posted: 13 Jun 2013 03:50 AM PDT

Can anyone elaborate on the phases of recovery or point me to an article, for when you recover a database that has been log shipped? When Googling "Phases of Recovery Log Shipping" (I also tried Googling different variations of that string with no luck), I get a bunch of step-by-step articles on how to set up Log Shipping. I am interested in learning what goes on during that recovery.Thanks for any and all responses...

Import CSV File

Posted: 12 Jun 2013 09:36 PM PDT

Hi,I have a 4 GB CSV file , which i am unable to import in SQL Server 2008 R2, even I not able to open file also. I tried import export wizard also but when i am importing the file , it displaying the records with double quote("").For Ex:- "Charles" "$1.2"Is there any alternate option to import large file into the SQL SERVER or any third party tool which will import the file without double quote.Thanks in advance-------

Login History - who connected to a database?

Posted: 12 Jun 2013 09:22 PM PDT

Hello people,is there a way that we can display all the people who have connected to a certain database over a given time? For instance, we have a read-only user name for people wishing to connect to our live database, in the normal course of our day to day activity. All the users also have a read-write access, but we would like to audit the times that people are connecting with this read-write access - this is independent of any actual changes to the routines that they may make while logged in.Is the information stored anywhere in SQL server management studio, from which we would be able to make a report or similar to show all logins, with time and duration?Thanks,Mark

CLR function not returning results for GetRunningPackages

Posted: 13 Jun 2013 01:07 AM PDT

Hi there,I am working on creating a CLR that can return the list of running SSIS packages on a given server, managed to implement everything, compiled, deployed successfully, ran without errors.... but not results returned, although clearly packages are running on that server (could see then in Integration Services).Moreover when I change the servername to a nonexisting server I get the errors, and if I use the same c# code outside the CLR I get the correct results.Not sure what to try next, feels like a security/permissions issue...Used Application.GetRunningPackages method, SQL 2008R2 Developer edition, .Net 2.0Thanks in advance for your help/feedback.-A

Determine the row of data in a table through SSIS

Posted: 13 Jun 2013 12:52 AM PDT

Hi,I have a requirement based on 2 parts to determine a row of data in a table [b]through SSIS [/b] :a. Work out the LEG_SEQ_NBR of the VESSEL, VOYAGE and LEG already stored in table1.To do this join table1 and join table2 through BL_ID column.b. Work out the 2ndVESSEL, 2ndVOYAGE and 2ndLEG.Once we have identified the LEG_SEQ_NBR of the VESSEL,VOYAGE, LEG already stored in table1 we need to add '1' to this value and then find that LEQ_SEQ_NBR in table2.The DDL of table1 and table2 along with the test data are as below:------------------------------------------------------------CREATE TABLE [dbo].[table1]( [BL_ID] [decimal](10, 0) NOT NULL, [VESSEL] [nvarchar](10) NULL, [VOYAGE] [nvarchar](12) NULL, [LEG] [nchar](3) NULL, [BLNO] [nvarchar](17) NULL ) ON [PRIMARY]CREATE TABLE [dbo].[table2]( [BL_ID] [numeric](10, 0) NULL, [LEG_SEQ_NBR] [numeric](3, 0) NULL, [VESSEL_CD] [varchar](10) NULL, [VOYAGE_CD] [varchar](12) NULL, [LEG_CD] [char](1) NULL, ) ON [PRIMARY]INSERT INTO [table1]VALUES('1','CEUR','032E','E','21')INSERT INTO [table2]VALUES('1','1','CEUR','032E','E')INSERT INTO [table2]VALUES('1','2','MARB','794S','S')---------------------------------------------------------For Example:BLNO = 21 and BL_ID = 1Current Vessel/Voyage/Leg in table1 = CEUR 032E ELEG_SEQ_NBR of this Vessel/Voyage/Leg in table1 = '1'Therefore if we add '1' to this value we'd be looking for LEG_SEQ_NBR '2' as the 2ndVESSEL,2ndVOYAGE,2ndLEG. In this case that would = MARB 794S S Does somebody pls know how this can be worked out through an SSIS job ?Thanks.

My row_number() is misbehaving or I'm an not a smart man

Posted: 12 Jun 2013 10:33 PM PDT

OK I've got some sample setup and data below; basically i need a row_number() i've aliased as [BuildingID], to be a unique value for each value in [ACTV_HOME_PROPERTY_ID] to be generated; if there was two unique values in those five rows, i'd expect values of 1 or 2, but i'm not doing something right today. in the sample data below, there is only ONE unique value, so I expect only the value [1] for all five rows;I know i'm pre-coffee, but i keep getting the same values in row number, no matter what i partition by;[code]CREATE TABLE [dbo].[#MYSAMPLEDATA] ( [ACTNBR] VARCHAR(50) NULL,[ACTNAME] VARCHAR(100) NULL,[RELATEDYEAR] VARCHAR(4) NULL,[IDIS_ACTV_ID] VARCHAR(12) NULL,[ACTV_HOME_PROPERTY_FUND_ID] VARCHAR(15) NULL,[ACTV_HOME_PROPERTY_ID] VARCHAR(15) NULL,[FUND_CAT_ID] VARCHAR(15) NULL)INSERT INTO [#MYSAMPLEDATA] ([ACTNBR],[ACTNAME],[RELATEDYEAR],[IDIS_ACTV_ID],[ACTV_HOME_PROPERTY_FUND_ID],[ACTV_HOME_PROPERTY_ID],[FUND_CAT_ID])VALUES('2011-03699','RILEY & MERIAL UZEL','2011','3699','900000000315449','900000000105337','122')INSERT INTO [#MYSAMPLEDATA] ([ACTNBR],[ACTNAME],[RELATEDYEAR],[IDIS_ACTV_ID],[ACTV_HOME_PROPERTY_FUND_ID],[ACTV_HOME_PROPERTY_ID],[FUND_CAT_ID])VALUES('2011-03699','RILEY & MERIAL UZEL','2011','3699','900000000329445','900000000105337','22')INSERT INTO [#MYSAMPLEDATA] ([ACTNBR],[ACTNAME],[RELATEDYEAR],[IDIS_ACTV_ID],[ACTV_HOME_PROPERTY_FUND_ID],[ACTV_HOME_PROPERTY_ID],[FUND_CAT_ID])VALUES('2011-03699','RILEY & MERIAL UZEL','2011','3699','900000000315451','900000000105337','302')INSERT INTO [#MYSAMPLEDATA] ([ACTNBR],[ACTNAME],[RELATEDYEAR],[IDIS_ACTV_ID],[ACTV_HOME_PROPERTY_FUND_ID],[ACTV_HOME_PROPERTY_ID],[FUND_CAT_ID])VALUES('2011-03699','RILEY & MERIAL UZEL','2011','3699','900000000315452','900000000105337','303')INSERT INTO [#MYSAMPLEDATA] ([ACTNBR],[ACTNAME],[RELATEDYEAR],[IDIS_ACTV_ID],[ACTV_HOME_PROPERTY_FUND_ID],[ACTV_HOME_PROPERTY_ID],[FUND_CAT_ID])VALUES('2011-03699','RILEY & MERIAL UZEL','2011','3699','900000000315450','900000000105337','21')--edit: adding five other rowsINSERT INTO [#MYSAMPLEDATA] ([ACTNBR],[ACTNAME],[RELATEDYEAR],[IDIS_ACTV_ID],[ACTV_HOME_PROPERTY_FUND_ID],[ACTV_HOME_PROPERTY_ID],[FUND_CAT_ID])VALUES('2011-03699','RILEY & MERIAL UZEL','2011','3699','900000000315449','900000000105338','122')INSERT INTO [#MYSAMPLEDATA] ([ACTNBR],[ACTNAME],[RELATEDYEAR],[IDIS_ACTV_ID],[ACTV_HOME_PROPERTY_FUND_ID],[ACTV_HOME_PROPERTY_ID],[FUND_CAT_ID])VALUES('2011-03699','RILEY & MERIAL UZEL','2011','3699','900000000329445','900000000105338','22')INSERT INTO [#MYSAMPLEDATA] ([ACTNBR],[ACTNAME],[RELATEDYEAR],[IDIS_ACTV_ID],[ACTV_HOME_PROPERTY_FUND_ID],[ACTV_HOME_PROPERTY_ID],[FUND_CAT_ID])VALUES('2011-03699','RILEY & MERIAL UZEL','2011','3699','900000000315451','900000000105338','302')INSERT INTO [#MYSAMPLEDATA] ([ACTNBR],[ACTNAME],[RELATEDYEAR],[IDIS_ACTV_ID],[ACTV_HOME_PROPERTY_FUND_ID],[ACTV_HOME_PROPERTY_ID],[FUND_CAT_ID])VALUES('2011-03699','RILEY & MERIAL UZEL','2011','3699','900000000315452','900000000105338','303')INSERT INTO [#MYSAMPLEDATA] ([ACTNBR],[ACTNAME],[RELATEDYEAR],[IDIS_ACTV_ID],[ACTV_HOME_PROPERTY_FUND_ID],[ACTV_HOME_PROPERTY_ID],[FUND_CAT_ID])VALUES('2011-03699','RILEY & MERIAL UZEL','2011','3699','900000000315450','900000000105338','21')SELECT 1 As TheExpectedValue, IDIS_ACTV_ID AS ID, ROW_NUMBER() OVER (PARTITION BY IDIS_ACTV_ID ORDER BY IDIS_ACTV_ID,ACTV_HOME_PROPERTY_ID) AS BUILDINGID, ROW_NUMBER() OVER (PARTITION BY IDIS_ACTV_ID,ACTV_HOME_PROPERTY_ID ORDER BY IDIS_ACTV_ID) AS BUILDINGID_V2, ROW_NUMBER() OVER (PARTITION BY IDIS_ACTV_ID,ACTV_HOME_PROPERTY_ID ORDER BY IDIS_ACTV_ID,ACTV_HOME_PROPERTY_ID) AS BUILDINGID_V3, Z.ACTNBR, Z.ACTNAME, Z.RELATEDYEAR, Z.IDIS_ACTV_ID, Z.ACTV_HOME_PROPERTY_FUND_ID, Z.ACTV_HOME_PROPERTY_ID, Z.FUND_CAT_IDFROM #MYSAMPLEDATA Z [/code]

how to caluculate half day

Posted: 12 Jun 2013 10:50 PM PDT

hii want to caluclate half day my actual data is like this[size="1"]date-----------------------------eid---------------timein---------------------------timeout-----------spend-----excessshort------excess2013-03-21 00:00:00.000---26446---2013-06-13 09:13:00.000--2013-06-13 3:46:00.000----06:33:00---- 02:27:00-------Short[/size]i want this data[size="1"]date-----------------------------eid---------------timein---------------------timeout-----------spend-----excessshort-excess----Remarks2013-03-21 00:00:00.000--26446--2013-06-13 09:13:00.000--2013-06-13 3:46:00.000-06:33:00--02:27:00---Short------HALFDAY[/size]employee timing is 9am to 6pm if he leaves from factory at 4pm or before 4 pm then remarks say halfdaybecouse we have 2 hours half day mean if he go for some work and didnt come back for 2 hours our more then 2 hours then half day implementedplease implement on this query[size="1"]drop table #temp1select[date],min([Timein]) as First_Record,sum(DATEDIFF(minute, [Timein], [Timeout])) as Time_Minutesinto #temp1 from attend_LOGwhere eid = 17090group by [date]GOselectt.[date],t.eid,t.[Timein] as timein,t.[Timeout] as timeout,CONVERT(VARCHAR(8), DATEADD(minute, Time_Minutes, 0), 108) AS SpendTime,case when (540 - Time_Minutes) > 0 Then '- ' else '+ ' end+CONVERT(VARCHAR(8), DATEADD(minute, ABS(540 - Time_Minutes), 0), 108) as excesshorttimeFROM attend_LOG tleft join #temp1 t2 on t.[date] = t2.[date] and t.[Timein] = t2.First_Recordwhere eid = 17090order by t.[date], t.[Timein][/size]please help me out thanks for the help

Three months before current month

Posted: 12 Jun 2013 11:48 PM PDT

I need to set [OrderDate] three months before current month.For example, as if June, [OrederDate] should be "Between '1/1/2013' and '03/31/2013'.For July, [OrederDate] should be "Between '1/1/2013' and '04/30/2013'.I want to use code like [OrederDate] Between @datefrom and @datetoI set @datefrom = '01/01/2013' (always), how to set @dateto three months before current month?

Extended Events - Filtering based on sql_text

Posted: 04 Feb 2013 08:50 PM PST

I need to leave a monitor runing long term to capture all calls to procs beginning with 'val_'Extended events seem to be a good fit here, but when adding the sqlserver.sp_statement_starting event, I cant filter on the sql_text column:[code="sql"]CREATE EVENT SESSION CSP_ProcsON SERVER ADD EVENT sqlserver.sp_statement_starting ( ACTION(sqlserver.nt_username, sqlserver.username, sqlserver.client_hostname, sqlserver.sql_text, sqlserver.database_id) WHERE sqlserver.sp_statement_starting.sql_text LIKE '%val%'--this line is underlined in red by the parser )[/code]is it even possible to filter like this, as its pretty simple in profiler/server side tracing

If one of two values exists they take priority over other values, otherwise select random

Posted: 12 Jun 2013 11:26 PM PDT

Hello, I have the below sql statement, where im trying to return a columns value depending on what value is in side.I have a table called orderhandsets which is associated to the table Order each order can have more then 1 row linked to it, so i could have an order 1234and inside orderhandsets i could have 5 rows assigned to order 1234the two values im looking for are 1 = new3 = portwhich are inside the column oh.OrderTypeif 3 exists then i need to return this as its this value takes priority over all other values, if 3 DOES NOT exists i need to check for1 and return it, if neither of them exists i just need to pull the value thats inside the columnthis is my statement[code="sql"]select top(1) oh.OrderTypefrom OrderHandsets oh where oh.OrderId = co.OrderId and (oh.Ordertype = 3 or (oh.OrderType =1 or oh.OrderType not in(1,3)))[/code]see now inside the where clause im first checking for 3, if that doesnt exists then i look for one, otherwise just get any other value, as you can also see im using Top(1) which again is causing the problem because if i have rows like the attached pic it always returns one when i need it to return 3[img]http://http://businessphonesdirect.co.uk/img/logo/snippet.png[/img]Can someone help me achieve what im looking for?So check for 3 first, if that doesnt exists check for 1if none of the above exists then just pull the value from the column

variable with multiple values from SP

Posted: 12 Jun 2013 10:21 PM PDT

hi All,I have a stored procedure where in I pass two paramaters StartDate and EndDate and return the start of the week -------Query-----DECLARE @Sample TABLE (Week Date, TotalSUm int) ---Declare @Testvariable Date---INSERT INTO @Sample---EXEC dbo.sp_testsp '2013-05-02','2013-05-04'---set @Testvariable = (select distinct week from @Sample)-- this returns start of the week i.e--2013-04-29---select @Testvariable --returns the start of week--And then I pass on the @Testvariable in the some other select statement.The above query returs fine, but if the EXEC dbo.sp_testsp is being passed with multiple values, then the error --Msg 512, Level 16, State 1, Line 7Subquery returned more than 1 value. This is not permitted when the subquery follows =, !=, <, <= , >, >= or when the subquery is used as an expression.--I understand Set variable can accpet only a Scalar variable. if in case I pass paramaters to procudre as belowEXEC dbo.sp_testsp '2013-05-02','2013-05-07',it should return2013-04-292013-05-06and I wanted to pass the above to select statement and retirve the results for corresponding weeks.ANy help on this? thanks

SQL Agent not able to start on SQL Server 2008 Developer Edition

Posted: 25 Feb 2011 03:28 AM PST

Dear All,I have SQL Server 2008 Developer Edition (64-bit) SP1 installed on a Windows Server 2008 R2 machine. For some reason, I can't get the SQL Agent service to start. When I go into the SQL Server Configuration Manager and select the properties of the Agent service and try to change the start mode from 'Disabled' to 'Manual', it comes back with the following error message after a few seconds:"WMI Provider ErrorThe remote procedure call failed. [0x800706be]"Both SQL Server and SQL Agent service are set to log on as the same domain account, which we have on many other SQL servers and the SQL Server service itself is running under this account quite happily.I've had a look around generally at articles etc... but most of them refer to the Express Edition, which this is not. I've tried temporarily disabling the Windows Firewall, but it seems to make no difference.Does anyone have any ideas? This is a newly created server and the only SQL installation that has ever been on it is the one referred to above. There's nothing else SQL or otherwise that isn't working fine - just the Agent! Could it be a problem with a setting in the O/S?Any help would be much appreciated.Regards,Chris Stride

how to caluculate half day

Posted: 12 Jun 2013 10:46 PM PDT

hii want to caluclate half day my actual data is like this[size="1"]date-----------------------------eid---------------timein---------------------------timeout-----------spend-----excessshort------excess2013-03-21 00:00:00.000---26446---2013-06-13 09:13:00.000--2013-06-13 3:46:00.000----06:33:00---- 02:27:00-------Short[/size]i want this data[size="1"]date-----------------------------eid---------------timein---------------------timeout-----------spend-----excessshort-excess----Remarks2013-03-21 00:00:00.000--26446--2013-06-13 09:13:00.000--2013-06-13 3:46:00.000-06:33:00--02:27:00---Short------HALFDAY[/size]employee timing is 9am to 6pm if he leaves from factory at 4pm or before 4 pm then remarks say halfdaybecouse we have 2 hours half day mean if he go for some work and didnt come back for 2 hours our more then 2 hours then half day implementedplease help me out thanks for the help

Edit data in Excel

Posted: 12 Jun 2013 10:04 PM PDT

Just to let you know Excel Database Tasks (EDT) version 2.7 has been released : New features include :One click saveImmediate row validationRelational data / drop down listsData filteringFor further info :[url=http://leansoftware.net/en-us/help/exceldatabasetasks/introduction.aspx]>>EDT Introduction[/url]In production use now at some pretty major companies including Nissan Motor (UK) & CommerceHub (US)Thanks for all the interest and support from people here..

ORDER BY the total of a GROUP BY

Posted: 12 Jun 2013 08:29 PM PDT

I have a table which contains order information. I need to SUM up the quantities ordered on an account every day. For example i could have the following records in a tableAccount Number QTY Name ROWID123456 4 Example 1123456 6 Example 2Now I have the following query which gets me the total of the quantities for the accountnumber[code="sql"]SELECT AccountNo, SUM(CONVERT(INT,QTY))AS UOMFROM tfrOrderTransmissionDetailLogWHERE DateDiff(dd, [TIMESTAMP],GetDate())=0 GROUP BY AccountNoORDER BY UOM DESC[/code]What I need is all the information in that the above table ordered by which account has ordered the most. With a field at the end giving the total qty ordered. Something like this.Account Number QTY Name ROWID UOM123456 4 Example 1 10123456 6 Example 2 10654321 8 Example 1 17654321 9 Example 2 17Can anyone help me with this or tell me that it isn't possible.Many Thanks in advanceMark D

Thursday, June 13, 2013

[how to] Filtering records not present with specific conditions in another table

[how to] Filtering records not present with specific conditions in another table


Filtering records not present with specific conditions in another table

Posted: 13 Jun 2013 08:18 PM PDT

I have a table (let's call it extra_info) in MySQL that has various extra infomation/labels on user accounts:

account_id / data_key   11 / 'test'   11 / 'blah'   12 / 'coupon a'   12 / 'coupon b'   15 / 'whatever'   18 / 'test'  

And so on.

I am building reports on other tables but need to filter out records that are present in the extra_info table as 'test'.

So in the example above, records 1-10, 13-14, 16-17 are fine because they are not in extra_info, 11 needs to be filtered out (it is 'test'), 12 is fine (but obviously just need to have it once even though is has 2 labels, 15 is fine (not 'test'), and 18 needs to be removed.

I have done this with subquiry in the WHERE clause:

WHERE   -- filter out test labels   a.id NOT IN (     SELECT ei.account_id FROM extra_info AS ei     WHERE ei.data_key = 'test'  )  

This does its job, however, I have a suspicion that a regular join would have worked faster. What's the right syntax for it?

If I just join and filter out the ones with data_key='test', #11 would still be there - so this is not a solution. Grouping by account_id will probably has performance implications and basically same impact as the current code, right?

Any ideas?

How to count number of consecutive events matching status of last event?

Posted: 13 Jun 2013 03:48 PM PDT

In MS-SQL Server 2008, I have a table of events in the following format:

Controller   |  ExecutionTime        |  Result  1            |  2012-09-24 09:00:00  | 0  1            |  2012-09-24 09:01:00  | 0  2            |  2012-09-24 09:02:00  | 1  

I need to analyze this table to produce a result that:

  • Identifies each controller whose latest event was an error (result=1)
  • Identifies how many consecutive failures that controller has had (# events since the most recent successful event)

I've made some progress toward this task, I can identify the controllers whose most recent event was a failure - see this SQL Fiddle. But I could use some help finding an approach to counting the # of consecutive failures that has led to this state.

Thanks! John

Is backing a website by a SQL Server 2012 OLAP cube considered reasonable?

Posted: 13 Jun 2013 04:09 PM PDT

I have been tasked with architecting a solution for a large retail chain. They want to allow each of its 1.2 million customers to log on to a web site to see the distribution of recent purchases (current month, previous month, year-to-date) over about 50 categories. Data will be updated once every day.

I am thinking of putting up a SQL Server 2012 based OLAP cube and letting the website query this cube directly, leveraging features like proactive caching. However, being a developer at heart, I have next to no experience with the analysis services parts of SQL Server, so am quite concerned about the performance of this solution.

Does connecting a web site directly to an OLAP cube sound like a feasible solution? Do such systems react to the load from multiple users roughly like a SQL Server, making this a reasonable solution, or do they act completely differently?

I don't expect users to check their status very often and I will of course be using caching on the webserver etc.

Oracle schema migration to new database with zero downtime

Posted: 13 Jun 2013 01:17 PM PDT

I have a problem: I have two production schemas on one database each serving two different appplicaitons.

I need to migrate (move) one schema to an entirely new database that will be dedicated for that application with zero downtime. SO EXPORT/IMPORT can not be used.

I have license to even active data guard. But if I setup data guard from current database DB to new DB, should I switch over OR failover or what else so that the current primary will continue to be primary in its role and only the new DB will change its role from standby to fully operational primary. Then we simply direct the application to new and start the new db. Thanks for your help.

MySQL import LOCK doesn't prevent PRIMARY key issues

Posted: 13 Jun 2013 03:44 PM PDT

I am using Django and only want to backup mysql data (not it's structure; Django will do that, and it will otherwise mess up Django-South's migration checks).

Some info;

  • my tables are almost all intertwined with each other by Foreign Key bindings.
  • I'm currently having over 100 tables that hold all my data.

What I'm trying to achieve is some mechanism with mysqldump to dump my data, like so;

imac2011:Desktop allendar$ /Applications/MAMP/Library/bin/mysqldump -u root -p my_database --no-create-info > my_database_data.sql  

..and restore it like so;

imac2011:Desktop allendar$ /Applications/MAMP/Library/bin/mysql -u root -p my_database < my_database_data.sql  

..but then with the ability to store the following on the dump (and thus restore it that way on import);

When doing mysqldump

  1. Store the PRIMARY key counters

When importing my_database_data.sql

  1. LOCK the tables fully (even ignoring PRIMARY key counter checks)
  2. INSERT all data (incl. primary keys)
  3. Set the PRIMARY key counter back to what was saved
  4. UNLOCK the tables again

Other issue

My first thought was to just ignore the PRIMARY keys somehow and just look-increment them again on insertion. The problem is that I can't do this due to their bindings with Foreign Keys, that would then thus break.

SQL restore duration

Posted: 13 Jun 2013 12:06 PM PDT

I am trying to determine the duration of backup restore. I executed a few restore command to restore some backups located on network share drive. Here is command and summary,

RESTORE DATABASE [test]  FROM DISK = '\\network\test.bak'  WITH      MOVE 'test_data' TO 'D:\test_data.mdf',      MOVE 'test_log' TO 'E:\test_log.ldf',      CHECKSUM, STATS;  

Backup A
1st restore take about 5 min. Ok, after restore complete, I delete the database
2nd restore take about 1.8 min. Hm.. Let's delete the database and try again.
3rd restore take about 1.5 min. Hm..

Backup B (almost same size as backup A, at the same network location)
1st restore take about 1.8 min. I deleted it and try again
2nd restore take about 1.5 min. Ok..

The service account is enabled for instant file initialization.

My question is why the restore duration varied on different restore attempt on the same backup file to the same machine? Is it purely on network throughput (perhaps someone was doing something on the network share and stuff), or something else like cache or something in SQL internal?

SQL Developer: how to script output to query results grid

Posted: 13 Jun 2013 11:55 AM PDT

I am used to MS SQL Server Studio where I can DECLARE and SET variables and use them in my query, and then be presented with the results in a datagrid that I can copy and paste into Excel.

In Oracle SQL Developer, I can get simple query results returned in the 'Query Results' grid, but if I need to script something more complex (with variables), I need to use the 'Run Script' option and my results are ouput to the 'Script Output' window, where it's much harder to select the results for copying into Excel.

Here is example code, where the variable :v_yr is used three times:

var v_yr number;  exec :v_yr := 2014;    select job.employee_no,  mas.last_name,  mas.first_name,  pay.contract_salary,  pay.contract_days_actual,  CASE job.contract_year WHEN :v_yr THEN cal.contract_days_actual END,  pay.full_year_salary,  pay.full_year_contract_days,  CASE job.contract_year WHEN :v_yr THEN sal.annual_rate END  from hrs.employee_jobs job  LEFT OUTER Join hrs.employee_master mas on job.employee_no = mas.employee_no  LEFT OUTER JOIN hrs.employee_pays pay on job.employee_no = pay.employee_no AND      job.job_seq_no = pay.job_seq_no  LEFT OUTER JOIN hrs.schedule_amounts sal on pay.schedule_id = sal.schedule_id AND      pay.schedule_no = sal.schedule_no AND      pay.schedule_level = sal.schedule_level AND      pay.schedule_step = sal.schedule_step  LEFT OUTER JOIN hrs.calendar_master cal on pay.calendar = cal.calendar  where job.contract_year in (2013,:v_yr);  

In my Googling so far, it seems that in order to display these results in a grid, I will need to set up this query as some kind of stored procedure, assign the results to a cursor. I don't know enough about Oracle programming to understand this answer yet.

Insert performace with Geography column

Posted: 13 Jun 2013 09:34 AM PDT

I've been tasked with inserting data into a SQL Server table with a geography column. I've found that my times for doing inserts (same data 1.5 million rows) go up increasingly.

I started out with no geography column and it took 6 minutes, then I added a geography column and it took 20 minutes (again same data). Then I added a spatial index and it took 1 hour and 45 minutes.

I'm new at anything spatial, but this seems like really bad performance. Is there anything I can do to help speed this up or is this just the performance I'm going to see when dealing with SQL Spatial?

I can't get MySQL56 service to start after i stopped it

Posted: 13 Jun 2013 10:18 AM PDT

I am having an issue on my server. I am running MySql on Windows Server 2008 R2. I have done some changes to my.ini file so I had to restart the MySQL for the changes to take effect. and when I went to services and tried to start MySQL56 windows gives me this error

Windows could not start the MySQL56 service on Local Computer. Error 1067: tHe process terminated Unexpetedlly.

I tried rebooting my server and that did not work. I restored the changes that I made to my.ini and that did not work.

What can I do to bring mysql back up?

Thanks

count(PK) yields different result than left join ... is null

Posted: 13 Jun 2013 08:14 AM PDT

I have a really strange behaviour here. Situation is, I added rows to my table store_inventory. Before I did this, I made a backup of the table in store_inventory_old.

Now I wanted to make sure everything is okay and queried

SELECT 'old', count(product) FROM store_inventory_old  union all  SELECT 'new', count(product) FROM store_inventory  

The difference between the two count(product) values I receive from this query is about 2000, which was too few for me.

When I do

select count(new.product) from store_inventory new  left join store_inventory_old old on new.product = old.product  where old.product is null  

I get about 6000, which is okay. The column product is the primary key in both tables.

Question is, how is it possible that those two queries deliver different results?

Thanks in advance.

LDAP in SQL Server 2012

Posted: 13 Jun 2013 04:20 PM PDT

In our current system, we query the LDAP using SQL Server 2008R2 but are moving to SQL Server 2012 where we have hit a snag. According to http://msdn.microsoft.com/en-us/library/cc707782.aspx one of the discontinued features of SQL Server 2012 is the AD helper service.

Does anyone know of a way to query the AD from SQL Server 2012?

Recommended distributor architecture for transactional replication?

Posted: 13 Jun 2013 09:59 AM PDT

We are planning to create a Transactional replication on 2 nodes of SQL Server 2008 R2, each node will be a publisher and a subscriber.

What is the best practice in order to have this scenario done? Microsoft always recommends that the distributor should be independent from the publisher or the subscriber, so it is obvious that a 3rd node is to be involved in the scenario. But, as each node will be at the same time publisher and subscriber, can the same (3rd node) be the distributor for the 2 publishers? Should I create 2 distributors, or can it still be on the same architecture, using only 2 nodes, each one acting as distributor as well as publisher?

SQL Server not giving back memory - staying at Max Server Memory setting

Posted: 13 Jun 2013 10:41 AM PDT

I'm the only one working on this server and followed documentation about optimizing SQL memory usage when I saw that SQL was using all of the memory available of the server.

One of the steps I did is to fix the max memory size to 20 Gb (server for test purposes with 32 Gb memory).

After tuning, yesterday, SQL Server was only using 3-4 Gb but today, it has again reached the max memory usage I allocated.

Why is SQL Server keeping all 20 Gb? What will happen when users will begin to use the server if the max memory size is already used?

Shrink transaction log while transaction in progress

Posted: 13 Jun 2013 03:35 PM PDT

I have a several-GB import into my SQL Server 2005 (Standard Edition) database.

The import is separated into 4 parts, each part in its own transaction.

As you can guess, the transaction log files are quite big. The space on my disk is also limited, but should fit the data. While doing my import the free space on my disk is about 2 gb.

Does it work to shrink the files, while my transactions are in progress? Would it also have an effect?

How do I remove partitioning?

Posted: 13 Jun 2013 04:28 PM PDT

There is a table my_table which has not been created any partition in MySQL 5.6. And I'm trying to create partitions by procedure in my_table, when I run this code:

alter table my_table remove partitioning  

I'm getting this error:

1505 - Partition management on a not partitioned table is not possible  

But when I run above code in MySQL version 5.5, this code can be run.

If my_table has been created partitions, and when I run above code, not only MySQL 5.5 can run, but also MySQL 5.6 can run too.

I guess that if this code can be run, MySQL 5.6 need to be config some files.But I am not able to why.

Can you solve it?

Best practice for storing record metadata

Posted: 13 Jun 2013 04:26 PM PDT

What is the best practice for storing metadata of individual records in a database?

I need to store common meta data such as creation time and time of last update for many tables in my database. I found a few different solutions:

  1. Store the meta data directly in the tables.

    Pros:

    • Meta data is directly linked to records
    • No joins are required to retrieve meta data

    Cons:

    • A lot of duplicate columns are required (unless inheritance is used)
    • Meta data and business data are not separated
  2. Create a general meta data table with and use soft foreign keys to link data to the correct tables and records.

    Pros:

    • No duplication of columns
    • Meta data is separated from business data

    Cons:

    • No direct links between meta data and data (FK's can't be used)
    • Joins require an additional condition
  3. Create individual meta data tables for each table requiring meta data.

    Pros:

    • Meta data is directly linked to records
    • Meta data is separated from business data

    Cons:

    • A lot of extra tables are required
    • A lot of duplicate columns are required (unless inheritance is used)

Are there more options, pros or cons than the ones I mentioned here? And what is the best practice for storing this meta data?

how to view content of Sybase procedures?

Posted: 13 Jun 2013 09:42 AM PDT

I would like to view the code of my Sybase stored procedures.

I recognize that show procedures gives me the list of procedures.

Can anyone instruct me on how to get the actual code for the stored procedures?

How can I model this problem in a graph database?

Posted: 13 Jun 2013 11:26 AM PDT

I have a project that I'm working on where I extract data from PDFs and map/visualize the relationships between the extracted pieces.

Here's an example of my problem:

file: 11425646.pdf    author: bob    company: abc co    date: 1/1/2011    mentioned_users: [alice,sue,mike,sally]    images: [1958.jpg,535.jpg,35735.jpg]    file: 15421484.pdf    author: betty    company: ionga    date: 2/15/2011    mentioned_users: [john,alex,george]    images: [819.jpg,9841.jpg,78.jpg]    file: 11975748.pdf    author: micah    company: zoobi    date: 9/26/2011    mentioned_users: [alice,chris,joe]    images: [526.jpg,5835.jpg,355.jpg]  

How can I model this in a graph database like Neo4j?

I would like to be able to be given one piece of data (like a person's name) and find all related (images, co-mentions, authors, etc.) at up to 10 depth. Here's what I'm thinking for the structure, but I'm not sure if it's a good approach: (this isn't any kind of actual syntax)

[file: 11425646.pdf date:1/1/2011] -written_by-> bob  [file: 11425646.pdf date:1/1/2011] -from_company-> abc co  [file: 11425646.pdf date:1/1/2011] -mentions-> alice  [file: 11425646.pdf date:1/1/2011] -mentions-> sue  [file: 11425646.pdf date:1/1/2011] -mentions-> mike  [file: 11425646.pdf date:1/1/2011] -mentions-> sally  [file: 11425646.pdf date:1/1/2011] -has_image-> 1958.jpg  [file: 11425646.pdf date:1/1/2011] -has_image-> 535.jpg  [file: 11425646.pdf date:1/1/2011] -has_image-> 35735.jpg  

Is this the right way to structure this data in a graph database?

pg_dump 9.2.x command does not work with pg_dump 9.2.3

Posted: 13 Jun 2013 01:21 PM PDT

This is how i used to create all the time backups of my databases with PostgreSQL 9.2.x:

pg_dump -Fc -Z 6 -U postgres mydatabase > 2013-xx-xx_xxxxx.db  

However on a machine where PostgreSQL with version 9.2.3 is running I get all the time the error "too many command line arguments" starting with "-Z". If I remove -Z he is complaining about "-U"? What is wrong?


Okay there is definitely sth. wrong. I removed argument for argument until i just started pg_dump. Then i added argument by argument and it worked. So I opened another command line window and tried the same: First "too many arguments", after calling only "pg_dump" and adding further arguments it worked...

Primary replica set server goes secondary after secondary fails

Posted: 13 Jun 2013 03:27 PM PDT

I have a 2 servers replica set that, after the secondary fails the primary goes into secondary mode while the secondary is in STARTUP2 (recovering). The problem with this is that I can't use the collection stored in that replica set freely, I'm getting errors trying to use the collection:

pymongo.errors.OperationFailure: database error: ReplicaSetMonitor no master found for set: rs2  

Sometimes if I restart the mongod instances, the server rs2-1 is the primary for a while, but after some time (while the secondary is recovering) I see this in the logs of rs2-1 (the primary):

Tue May  7 17:43:40.677 [rsHealthPoll] replSet member XXX.XXX.XXX.XXX:27017 is now in state DOWN  Tue May  7 17:43:40.677 [rsMgr] can't see a majority of the set, relinquishing primary  Tue May  7 17:43:40.682 [rsMgr] replSet relinquishing primary state  Tue May  7 17:43:40.682 [rsMgr] replSet SECONDARY  Tue May  7 17:43:40.682 [rsMgr] replSet closing client sockets after relinquishing primary  

Is there an easy way to make the primary keep being primary after the secondary fails? Am I doing something wrong?

Thanks in advance!

speeding up a query on MySql

Posted: 13 Jun 2013 01:27 PM PDT

I have a table with more than 10 million rows and 10 fields(columns). There is an index on field_1 and I am running the following query.

create table t2   select field_1,         sum(ifnull(field_2,0)) as field_2,          sum(ifnull(field_3,0)) as field_3,         sum(ifnull(field_4,0)) as field_4   from t1   group by field1;  

The data type of all the columns is varchar(200).

This query is not able to produce the result even after running for more than 1 day. Any suggestions on getting the results quickly will be helpful.

Newly discovered SQL Server Express has all databases inaccessible/offline

Posted: 13 Jun 2013 07:27 PM PDT

We have several remotes sites at my company and today one of the employees came to me and said their SQL reporting tool stopped working. Up until this point I did not even know this server or database or reporting service existed!

I have RDP access to the server and can open SQL Server Management Studio 2008 R2 and the databases appear to be SQL Server Express. Currently I can login to the instance with my domain admin account but every database gives me the following message when I try to expand it:

enter image description here The webpage they go to for the reports gives them the following error:

•   An error has occurred during report processing. (rsProcessingAborted)      o   Cannot create a connection to data source 'DNPDataSource'. (rsErrorOpeningConnection)          ?   For more information about this error navigate to the report server on the local server machine, or enable remote errors  

When I try to view the SQL Server Error log I get this error:

enter image description here

I do not have the sa password. I'm not a DBA but need to try and figure this one out, can anyone point me in a direction to start troubleshooting this? I'm completely lost.


here is the ERROR.LOG

2013-03-27 13:14:24.34 Server      Microsoft SQL Server 2008 R2 (SP1) - 10.50.2550.0 (X64)       Jun 11 2012 16:41:53       Copyright (c) Microsoft Corporation      Express Edition with Advanced Services (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1) (Hypervisor)    2013-03-27 13:14:24.34 Server      (c) Microsoft Corporation.  2013-03-27 13:14:24.34 Server      All rights reserved.  2013-03-27 13:14:24.34 Server      Server process ID is 9040.  2013-03-27 13:14:24.34 Server      System Manufacturer: 'Intel Corporation', System Model: 'S5520UR'.  2013-03-27 13:14:24.34 Server      Authentication mode is MIXED.  2013-03-27 13:14:24.34 Server      Logging SQL Server messages in file 'c:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\Log\ERRORLOG'.  2013-03-27 13:14:24.34 Server      This instance of SQL Server last reported using a process ID of 2428 at 3/27/2013 1:14:02 PM (local) 3/27/2013 7:14:02 PM (UTC). This is an informational message only; no user action is required.  2013-03-27 13:14:24.34 Server      Registry startup parameters:        -d c:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\DATA\master.mdf       -e c:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\Log\ERRORLOG       -l c:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\DATA\mastlog.ldf  2013-03-27 13:14:24.37 Server      SQL Server is starting at normal priority base (=7). This is an informational message only. No user action is required.  2013-03-27 13:14:24.37 Server      Detected 16 CPUs. This is an informational message; no user action is required.  2013-03-27 13:14:24.51 Server      Using dynamic lock allocation.  Initial allocation of 2500 Lock blocks and 5000 Lock Owner blocks per node.  This is an informational message only.  No user action is required.  2013-03-27 13:14:24.51 Server      Lock partitioning is enabled.  This is an informational message only. No user action is required.  2013-03-27 13:14:24.56 Server      Node configuration: node 0: CPU mask: 0x00000000000000ff:0 Active CPU mask: 0x00000000000000ff:0. This message provides a description of the NUMA configuration for this computer. This is an informational message only. No user action is required.  2013-03-27 13:14:24.62 spid7s      Starting up database 'master'.  2013-03-27 13:14:24.69 spid7s      2 transactions rolled forward in database 'master' (1). This is an informational message only. No user action is required.  2013-03-27 13:14:24.69 spid7s      0 transactions rolled back in database 'master' (1). This is an informational message only. No user action is required.  2013-03-27 13:14:24.69 spid7s      Recovery is writing a checkpoint in database 'master' (1). This is an informational message only. No user action is required.  2013-03-27 13:14:24.78 spid7s      FILESTREAM: effective level = 0, configured level = 0, file system access share name = 'SQLEXPRESS'.  2013-03-27 13:14:24.83 spid7s      SQL Trace ID 1 was started by login "sa".  2013-03-27 13:14:24.85 spid7s      Starting up database 'mssqlsystemresource'.  2013-03-27 13:14:24.87 spid7s      The resource database build version is 10.50.2500. This is an informational message only. No user action is required.  2013-03-27 13:14:25.09 spid10s     Starting up database 'model'.  2013-03-27 13:14:25.09 spid7s      Server name is 'WCCKEMAPP\SQLEXPRESS'. This is an informational message only. No user action is required.  2013-03-27 13:14:25.21 spid10s     The tail of the log for database model is being rewritten to match the new sector size of 4096 bytes.  2560 bytes at offset 99840 in file c:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\DATA\modellog.ldf will be written.  2013-03-27 13:14:25.31 spid10s     Clearing tempdb database.  2013-03-27 13:14:25.32 spid13s     A new instance of the full-text filter daemon host process has been successfully started.  2013-03-27 13:14:25.37 spid7s      Starting up database 'msdb'.  2013-03-27 13:14:25.40 Server      A self-generated certificate was successfully loaded for encryption.  2013-03-27 13:14:25.40 Server      Server is listening on [ 'any' <ipv6> 54547].  2013-03-27 13:14:25.40 Server      Server is listening on [ 'any' <ipv4> 54547].  2013-03-27 13:14:25.40 Server      Server local connection provider is ready to accept connection on [ \\.\pipe\SQLLocal\SQLEXPRESS ].  2013-03-27 13:14:25.40 Server      Server named pipe provider is ready to accept connection on [ \\.\pipe\MSSQL$SQLEXPRESS\sql\query ].  2013-03-27 13:14:25.40 Server      Dedicated administrator connection support was not started because it is disabled on this edition of SQL Server. If you want to use a dedicated administrator connection, restart SQL Server using the trace flag 7806. This is an informational message only. No user action is required.  2013-03-27 13:14:25.56 spid10s     Starting up database 'tempdb'.  2013-03-27 13:14:25.60 spid13s     The Service Broker protocol transport is disabled or not configured.  2013-03-27 13:14:25.60 spid13s     The Database Mirroring protocol transport is disabled or not configured.  2013-03-27 13:14:25.61 spid13s     Service Broker manager has started.  2013-03-27 13:14:25.77 spid7s      The tail of the log for database msdb is being rewritten to match the new sector size of 4096 bytes.  2048 bytes at offset 12007424 in file c:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\DATA\MSDBLog.ldf will be written.  2013-03-27 13:14:25.84 spid7s      Recovery is complete. This is an informational message only. No user action is required.  2013-03-27 13:14:25.90 Logon       Error: 17187, Severity: 16, State: 1.  2013-03-27 13:14:25.90 Logon       SQL Server is not ready to accept new client connections. Wait a few minutes before trying again. If you have access to the error log, look for the informational message that indicates that SQL Server is ready before trying to connect again.  [CLIENT: ::1]  2013-03-27 13:14:25.90 Logon       Error: 17187, Severity: 16, State: 1.  2013-03-27 13:14:25.90 Logon       SQL Server is not ready to accept new client connections. Wait a few minutes before trying again. If you have access to the error log, look for the informational message that indicates that SQL Server is ready before trying to connect again.  [CLIENT: 172.17.0.210]  2013-03-27 13:14:26.48 Server      The SQL Server Network Interface library successfully registered the Service Principal Name (SPN) [ MSSQLSvc/WCCKemAPP.WCC.LOCAL:SQLEXPRESS ] for the SQL Server service.   2013-03-27 13:14:26.48 Server      The SQL Server Network Interface library successfully registered the Service Principal Name (SPN) [ MSSQLSvc/WCCKemAPP.WCC.LOCAL:54547 ] for the SQL Server service.   2013-03-27 13:14:26.48 Server      SQL Server is now ready for client connections. This is an informational message; no user action is required.  2013-03-27 13:14:31.04 spid51      Starting up database 'ReportServer'.  2013-03-27 13:14:31.37 spid51      Starting up database 'ReportServerTempDB'.  2013-03-27 13:14:31.76 spid51      Starting up database 'ReportServerTempDB'.  2013-03-27 13:14:32.07 spid51      Starting up database 'ReportServerTempDB'.  2013-03-27 13:14:32.24 Logon       Error: 18456, Severity: 14, State: 5.  2013-03-27 13:14:32.24 Logon       Login failed for user 'drily'. Reason: Could not find a login matching the name provided. [CLIENT: 172.17.0.210]  2013-03-27 13:15:12.28 spid55      Starting up database 'DNP'.  2013-03-27 13:15:13.75 spid55      Starting up database 'DSS'.  2013-03-27 13:19:36.62 spid57      Starting up database 'ReportServerTempDB'.  2013-03-27 13:25:31.18 spid53      Starting up database 'ReportServer$SQLExpress'.  2013-03-27 13:25:36.38 spid53      Starting up database 'DSSDL'.  2013-03-27 13:25:38.89 spid53      Starting up database 'DSSUSERDIR'.  2013-03-27 13:25:41.26 spid53      Starting up database 'permissionsAudit'.  2013-03-27 13:25:45.00 spid53      Starting up database 'PMKemmererProduction'.  2013-03-27 13:25:48.05 spid53      Starting up database 'PMKemmererProductionTEST'.  2013-03-27 13:26:01.57 spid54      Attempting to load library 'xpstar.dll' into memory. This is an informational message only. No user action is required.  2013-03-27 13:26:01.58 spid54      Using 'xpstar.dll' version '2009.100.1600' to execute extended stored procedure 'xp_instance_regread'. This is an informational message only; no user action is required.  2013-03-27 13:26:52.10 Logon       Error: 18456, Severity: 14, State: 38.  2013-03-27 13:26:52.10 Logon       Login failed for user 'WCC\baadmin'. Reason: Failed to open the explicitly specified database. [CLIENT: <local machine>]  2013-03-27 13:26:53.37 spid59      Starting up database 'DSSDL'.  2013-03-27 13:26:53.60 spid59      Starting up database 'DSSUSERDIR'.  2013-03-27 13:26:53.92 spid59      Starting up database 'QuietDose'.  2013-03-27 13:26:54.16 spid59      Starting up database 'ReportServer$SQLExpress'.  2013-03-27 13:26:54.36 spid59      Starting up database 'ReportServer$SQLEXPRESSTempDB'.  2013-03-27 13:26:54.66 spid59      Starting up database 'ReportServerTempDB'.  2013-03-27 13:26:54.89 spid59      Starting up database 'STX'.  2013-03-27 13:26:55.57 spid59      Starting up database 'Test'.  2013-03-27 13:26:55.76 spid59      Starting up database 'DSSDL'.  2013-03-27 13:26:55.91 spid59      Starting up database 'DSSUSERDIR'.  2013-03-27 13:26:56.08 spid59      Starting up database 'ReportServer$SQLExpress'.  2013-03-27 13:26:56.31 spid59      Starting up database 'ReportServer$SQLEXPRESSTempDB'.  2013-03-27 13:26:56.52 spid59      Starting up database 'ReportServerTempDB'.  2013-03-27 13:26:56.68 spid59      Starting up database 'STX'.  2013-03-27 13:26:57.24 spid59      Starting up database 'DSSDL'.  2013-03-27 13:26:57.28 spid59      Starting up database 'DSSUSERDIR'.  2013-03-27 13:26:57.45 spid59      Starting up database 'ReportServer$SQLExpress'.  2013-03-27 13:26:57.55 spid59      Starting up database 'ReportServer$SQLEXPRESSTempDB'.  2013-03-27 13:26:57.74 spid59      Starting up database 'ReportServerTempDB'.  2013-03-27 13:26:57.83 spid59      Starting up database 'STX'.  2013-03-27 13:29:36.55 spid54      Starting up database 'ReportServerTempDB'.  2013-03-27 13:39:36.57 spid56      Starting up database 'ReportServerTempDB'.  2013-03-27 13:41:59.55 Logon       Error: 18456, Severity: 14, State: 8.  2013-03-27 13:41:59.55 Logon       Login failed for user 'sa'. Reason: Password did not match that for the login provided. [CLIENT: <local machine>]  2013-03-27 13:44:07.70 Logon       Error: 18456, Severity: 14, State: 8.  2013-03-27 13:44:07.70 Logon       Login failed for user 'sa'. Reason: Password did not match that for the login provided. [CLIENT: <local machine>]  2013-03-27 13:49:36.57 spid53      Starting up database 'ReportServerTempDB'.  2013-03-27 13:59:36.57 spid54      Starting up database 'ReportServerTempDB'.  2013-03-27 14:09:36.56 spid53      Starting up database 'ReportServerTempDB'.  2013-03-27 14:15:50.50 spid54      Starting up database 'DSSDL'.  2013-03-27 14:15:50.75 spid54      Starting up database 'DSSUSERDIR'.  2013-03-27 14:15:51.92 spid54      Starting up database 'ReportServer$SQLExpress'.  2013-03-27 14:15:52.25 spid54      Starting up database 'ReportServer$SQLEXPRESSTempDB'.  2013-03-27 14:15:52.51 spid54      Starting up database 'ReportServerTempDB'.  2013-03-27 14:15:52.70 spid54      Starting up database 'STX'.  2013-03-27 14:18:02.83 spid51      Starting up database 'ReportServer$SQLExpress'.  2013-03-27 14:18:46.58 Logon       Error: 18456, Severity: 14, State: 38.  2013-03-27 14:18:46.58 Logon       Login failed for user 'WCC\baadmin'. Reason: Failed to open the explicitly specified database. [CLIENT: <local machine>]  2013-03-27 14:18:47.49 spid59      Starting up database 'DSSDL'.  2013-03-27 14:18:47.70 spid59      Starting up database 'DSSUSERDIR'.  2013-03-27 14:18:47.92 spid59      Starting up database 'ReportServer$SQLExpress'.  2013-03-27 14:18:48.04 spid59      Starting up database 'ReportServer$SQLEXPRESSTempDB'.  2013-03-27 14:18:48.33 spid59      Starting up database 'ReportServerTempDB'.  2013-03-27 14:18:48.53 spid59      Starting up database 'STX'.  2013-03-27 14:18:49.12 spid59      Starting up database 'DSSDL'.  2013-03-27 14:18:49.33 spid59      Starting up database 'DSSUSERDIR'.  2013-03-27 14:18:49.44 spid59      Starting up database 'ReportServer$SQLExpress'.  2013-03-27 14:18:49.60 spid59      Starting up database 'ReportServer$SQLEXPRESSTempDB'.  2013-03-27 14:18:49.84 spid59      Starting up database 'ReportServerTempDB'.  2013-03-27 14:18:49.98 spid59      Starting up database 'STX'.  2013-03-27 14:18:50.28 spid59      Starting up database 'DSSDL'.  2013-03-27 14:18:50.39 spid59      Starting up database 'DSSUSERDIR'.  2013-03-27 14:18:50.48 spid59      Starting up database 'ReportServer$SQLExpress'.  2013-03-27 14:18:50.53 spid59      Starting up database 'ReportServer$SQLEXPRESSTempDB'.  2013-03-27 14:18:50.66 spid59      Starting up database 'ReportServerTempDB'.  2013-03-27 14:18:50.73 spid59      Starting up database 'STX'.  2013-03-27 14:19:36.54 spid59      Starting up database 'ReportServerTempDB'.  2013-03-27 14:19:36.93 spid59      Starting up database 'ReportServerTempDB'.  2013-03-27 14:29:36.55 spid53      Starting up database 'ReportServerTempDB'.  2013-03-27 14:39:36.57 spid54      Starting up database 'ReportServerTempDB'.  2013-03-27 14:49:36.57 spid53      Starting up database 'ReportServerTempDB'.  2013-03-27 14:59:36.58 spid54      Starting up database 'ReportServerTempDB'.  2013-03-27 15:09:36.60 spid53      Starting up database 'ReportServerTempDB'.  

I can provide older logs too if it will be beneficial. I'm reading those now from here.

Results from query select name, state_desc, is_in_standby, is_cleanly_shutdown from sys.databases;

enter image description here


Image for Kin's request

enter image description here

MySQL backup InnoDB

Posted: 13 Jun 2013 12:48 PM PDT

I have a VoIP server running 24x7. At low peak hour at lease 150+ users are connected. My server has MySQL running with InnoDB engine on Windows 2008 platform. I like to take at least 2 times full database backup without shutting down my service.

As per Peter Zaitsev - the founder of percona, mysqldump –single-transaction is not always good.

read here if you are interested

As I'm not a DBA, I like to know in my scenario, which would be best solution to take a database backup?

Thanks,

SQL Server 2008 R2 replication high delivery latency

Posted: 13 Jun 2013 02:27 PM PDT

I am seeing an unusually high delivery latency between our distributor and subscribers and i do not understand why.

We have in this configuration 3 sql servers using transactional push replication to replicate data from one master server to two reporting servers.

We have 9 publications. The distribution agent for most publications are showing under 5ms but one is show as 2000+ms to both subscribers.

The suspect publication has only 4 small articles (tables) that rarely, if ever, change. Ive checked and each table has an primary key.

ive also checked the @status parameter for each article according to the MS KB: The distribution agent may experience high latency when you configure transactional replication with articles that are configured not to replicate changes as parameterized statements

Im tempted to start droping articles to find out if one particular table is the culprit.

Doe anyone have any suggestions as to what I can look at?

Strange characters in mysqlbinlog output

Posted: 13 Jun 2013 10:27 AM PDT

Has anyone experienced this? Data replicates fine but when output in mysqlbinlog there are hidden characters that break the input?

  • mysqlbinlog Ver 3.3 for Linux at x86_64
  • mysql 5.5.28 server

Thanks! Julie

Connecting to a SQL Server database from a Flash program

Posted: 13 Jun 2013 11:27 AM PDT

I currently have the ability to utilize Microsoft SQL Server 2012. I am developing a project with Adobe Flash Builder 4.7.

If I link my database with Adobe Flash Builder is there any additional steps I must take in order to make the database live, or as long as my computer is running will this database be accessible from any device that is utilizing it?

In other words is this a LAN only system or does it automatically make itself available for the programs I link to it?

Oracle Express edition on Ubuntu - control file missing

Posted: 13 Jun 2013 04:27 PM PDT

I have installed the Oracle Express edition on Ubuntu as mentioned here.

I am facing issues when I try to create a sample table.

Started oracle

$ sudo service oracle-xe start   Starting Oracle Database 11g Express Edition instance.  

Started sqlplus

$ sqlplus / as sysdba  

Executed the CREATE command

SQL> CREATE TABLE SAMPLE (ID NUMBER);  CREATE TABLE SAMPLE (ID NUMBER)  *** ERROR at line 1: ORA-01109: database not open**  

After a series of research on web, I tried to shutdown and restart oracle:

Shutdown command

SQL> shutdown  ORA-01507: database not mounted  ORACLE instance shut down.  

Started the oracle instance

SQL> startup    ORACLE instance started.  Total System Global Area  688959488 bytes Fixed Size                   2229688 bytes Variable Size             411044424 bytes Database  Buffers          272629760 bytes Redo Buffers                3055616  bytes ORA-00205: error in identifying control file, check alert log  for more info  

I realized that the control file is missing at /u01/app/oracle/oradata/XE. XE is the DB name.

So I tried to create the control file as follows:

SQL> CREATE CONTROlFILE SET DATABASE XE RESETLOGS;    Control file created.  

Tried to create the sample table again

SQL> CREATE TABLE SAMPLE(ID NUMBER);      CREATE TABLE SAMPLE(ID NUMBER)    ERROR at line 1: ORA-01109: database not open  

So I tried to issue the following command

SQL> ALTER DATABASE OPEN RESETLOGS;  ALTER DATABASE OPEN RESETLOGS    ERROR at line 1:  ORA-01194: file 1 needs more recovery to be consistent**  ORA-01110: data file 1: '/u01/app/oracle/product/11.2.0/xe/dbs/dbs1XE.dbf'  

What should be done next? I am clueless as I am not a database guy.

Note:

Output of

$ lsnrctl services    LSNRCTL for Linux: Version 11.2.0.2.0 - Production on 04-JAN-2013 09:15:37    Copyright (c) 1991, 2011, Oracle.  All rights reserved.    Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=IPC)(KEY=EXTPROC_FOR_XE)))    Services Summary...    Service "PLSExtProc" has 1 instance(s).   Instance "PLSExtProc", status UNKNOWN, has 1 handler(s) for this service...      Handler(s):        "DEDICATED" established:0 refused:0           LOCAL SERVER  Service "XE" has 1 instance(s).    Instance "XE", status READY, has 1 handler(s) for this service...      Handler(s):        "DEDICATED" established:0 refused:0 state:ready           LOCAL SERVER  The command completed successfully  

Are regular VACUUM ANALYZE stil recommended under 9.1?

Posted: 13 Jun 2013 01:01 PM PDT

I'm using PostgreSQL 9.1 on Ubuntu. Are scheduled VACUUM ANALYZE still recommended, or is autovacuum enough to take care of all needs?

If the answer is "it depends", then:

  • I have a largish database (30 GiB compressed dump size, 200 GiB data directory)
  • I do ETL into the database, importing close to 3 million rows per week
  • The tables with the most frequent changes are all inherited from a master table, with no data in the master table (data is partitioned by week)
  • I create hourly rollups, and from there, daily, weekly and monthly reports

I'm asking because the scheduled VACUUM ANALYZE is impacting my reporting. It runs for more than 5 hours, and I've had to kill it twice this week, because it was impacting regular database imports. check_postgres doesn't report any significant bloat on the database, so that's not really an issue.

From the docs, autovacuum should take care of transaction ID wrap around as well. The question stands: do I still need a VACUUM ANALYZE?

[SQL Server] UNION with multiple CTEs and summing data

[SQL Server] UNION with multiple CTEs and summing data


UNION with multiple CTEs and summing data

Posted: 13 Jun 2013 12:48 PM PDT

I've been trying to get this to work for days. I have two queries with the results I want. I thought I could UNION the data and combine the two but I'm struggling. Here's code for essentially what I'm doing. It's actually in Oracle but I tested the code below in SQL Server and get the same result.[code="sql"]CREATE TABLE STG.GasStmt(PLANT_NO varchar(100),ALLOC_WHDV_VOL numeric(29, 5),KW_CTR_REDELIVERED_HV numeric(29, 5),MTR_NO varchar(100),MTR_SFX varchar(100),TRNX_ID bigint,REC_STATUS_CD varchar(100),ACCT_DT DateTime)[/code][code="sql"]insert into STG.GasStmtselect '043','0','50','36563','','83062200','OR','12/1/2011' union allselect '002','0','100','36563','','83062222','OR','12/1/2011' union allselect '002','0','-.99','36563','','-83062299','RR','12/1/2011' union allselect '002','0','-.99','36563','','-83062299','RR','2/1/2013' union allselect '002','0','-.99','36563','','-83062299','RR','4/1/2013' union allselect '002','0','-.99','36563','','83062299','OR','2/1/2011' union allselect '002','0','-.99','36563','','-86768195','RR','12/1/2011' union allselect '002','0','-.99','36563','','-86768195','RR','2/1/2013' union allselect '002','0','-.99','36563','','-86768195','RR','4/1/2013' union allselect '002','0','-.99','36563','','86768195','OR','3/1/2011' union allselect '002','0','-.99','36563','','-90467786','RR','1/1/2012' union allselect '002','0','-.99','36563','','-90467786','RR','2/1/2013' union allselect '002','0','-.99','36563','','-90467786','RR','4/1/2013' union allselect '002','0','-.99','36563','','90467786','OR','4/1/2011' union allselect '002','0','-.99','36563','','-77671301','RR','2/1/2013' union allselect '002','0','-.99','36563','','-77671301','RR','4/1/2013' union allselect '002','0','-.99','36563','','77671301','OR','1/1/2011' union allselect '002','0','-.99','36563','','-68420423','RR','2/1/2013' union allselect '002','0','-.99','36563','','68420423','OR','4/1/2013' union allselect '002','0','-.99','36563','','-188808446','RR','3/1/2013' union allselect '002','0','-.99','36563','','188808446','OR','1/1/2013' union allselect '002','1205.15','0','36563','A','138365544','OR','2/1/2012' [/code][code="sql"]WITH RemoveData AS ( SELECT a.PLANT_NO,a.ALLOC_WHDV_VOL,a.KW_CTR_REDELIVERED_HV, a.MTR_NO, a.MTR_SFX, a.TRNX_ID, a.REC_STATUS_CD, MAX(a.ACCT_DT) ACCT_DT FROM STG.GasStmt a WHERE a.REC_STATUS_CD = 'RR' GROUP BY a.PLANT_NO,a.ALLOC_WHDV_VOL,a.KW_CTR_REDELIVERED_HV, a.MTR_NO, a.MTR_SFX, a.TRNX_ID, a.REC_STATUS_CD HAVING COUNT(a.REC_STATUS_CD) > 2 ), RemoveData2 AS ( SELECT plant_no "PlantNumber" ,SUM(-a.ALLOC_WHDV_VOL) "PlantStandardGrossWellheadMcf" ,SUM(KW_CTR_REDELIVERED_HV) "KeepWholeResidueMMBtu" FROM RemoveData a GROUP BY plant_no ), OriginalData AS ( SELECT a.PLANT_NO "PlantNumber" ,SUM(a.ALLOC_WHDV_VOL) "PlantStandardGrossWellheadMcf" ,SUM(CASE WHEN a.REC_STATUS_CD = 'RR' THEN -a.KW_CTR_REDELIVERED_HV ELSE a.KW_CTR_REDELIVERED_HV END) "KeepWholeResidueMMBtu" FROM STG.GasStmt a LEFT OUTER JOIN (SELECT MTR_NO, MTR_SFX, TRNX_ID, REC_STATUS_CD, MAX(ACCT_DT) ACCT_DT FROM STG.GasStmt WHERE REC_STATUS_CD = 'RR' GROUP BY MTR_NO, MTR_SFX, TRNX_ID, REC_STATUS_CD HAVING COUNT(TRNX_ID) > 1) b ON a.MTR_NO = b.MTR_NO AND a.TRNX_ID = b.TRNX_ID AND a.Rec_Status_Cd = b.REC_STATUS_CD AND a.Acct_Dt = b.ACCT_DT WHERE a.ACCT_DT > '1/1/2010' AND b.MTR_NO IS NULL GROUP BY a.PLANT_NO ) SELECT *FROM RemoveData2UNION SELECT *FROM OriginalData[/code]Sorry, I went overboard with the inserts. I wanted to make sure it was like my data. The result I'm hoping for with the above query is PlantNumber 002 combined.I'm getting:PlantNumber | PlantStandardGrossWellheadMcf | KeepWholeResidueMMBtu002 | 0.00000 |-2.97000002 | 1205.15000 |102.97000043 |0.00000 |50.00000My intended result:PlantNumber | PlantStandardGrossWellheadMcf | KeepWholeResidueMMBtu002 | 1205.15000 |100043 |0.00000 |50.00000Is this possible? Can I combine and add the rows by plantnumber? Sorry for the very long post, I'm desperate!

Database Diagrams Help

Posted: 13 Jun 2013 09:29 AM PDT

Had a question about database diagrams. We want to set up a diagram to mainly get the benefit of visualizing our data, and establishing the relationships. The tables we have are old and do not always have a simple key. The diagram seems like a good fit, but does anyone know if we can set it up so that the relationships set up in the diagram do not enforce any rules or effect the tables the reference in any way? We don't want to effect the tables, only get a layout of the data we have. Is there a better tool to use outside of Management Studio 2008 R2 to set up these relationships for visualization only?Any help would be appreciated!

Search for relationship

Posted: 13 Jun 2013 06:05 AM PDT

I have a table having company names and another table having Revenueid and amount of Revenue but dont seem t get the relationship where it says which company pays what amount oof revenue.How can I see that. No common columns in these two tables.There are lot of other tables in the DB but I dont seem to find one which can show the relationship and get me the data.Pls suugest.

inserting a zip file into a varbinary column in a database

Posted: 13 Jun 2013 04:46 AM PDT

anyone know how to write an insert statement that will allow you to write a zip file into a table column designated as type varbinary? I would like to write and insert state and a retrieve statement.Here is my attempt at writing a zip file:Table: mps _datacolumns: mps_id longmps_session_id varchar(128)mps_request varbinarylocal variablesll_mps_id long value 1ls_session string value 'userid' + datetimels_zipfile string location of zipfile (ie c:\temp\zipfile.zip)insert into mps_data (mps_id, mps_session_id, mps_request) values ( :ll_mps_id, :ls_session, CONVERT(varbinary(max), :ls_zipfile));writes a binary value that is illegible. When I try to retrieve it creates a blank zip file that cannot be opened. Has anyone else been able to write a simple insert and retrieval process for zip files?

convert into second

Posted: 13 Jun 2013 02:58 AM PDT

Can you please help me converting datediff into seconds. I'm using SQL 2008 , the following command error out with pass valid argument. round((ENDDTTM - BEGINDTTM) * 24 * 60 * 60) || ' seconds' as PROCSTIME . Thank you

Move Primary Key Nonclustered Constraint and Clustered Index to New Filegroup?

Posted: 13 Jun 2013 01:05 AM PDT

New to SQL and want to try my hand at moving tables/indexes to different filegroups. Can someone tell me the correct process for migrating the nonclustered primary key index and clustered index? I included the table script below. CREATE TABLE [dbo].[system_xwalk_user_roles]( [RecID] [int] IDENTITY(1,1) NOT NULL, [UserRecID] [int] NOT NULL, [RoleRecID] [int] NOT NULL, [IsSuspended] [bit] NOT NULL, CONSTRAINT [PK_system_xwalk_user_roles] PRIMARY KEY NONCLUSTERED ( [RecID] ASC)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 90) ON [PRIMARY]) ON [PRIMARY]GOCREATE CLUSTERED INDEX [IX_system_xwalk_user_roles] ON [dbo].[system_xwalk_user_roles] ( [UserRecID] ASC, [RoleRecID] ASC)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 90) ON [PRIMARY]GO/****** Object: Default [DF_system_xwalk_user_roles_IsSuspended] Script Date: 06/13/2013 09:32:06 ******/ALTER TABLE [dbo].[system_xwalk_user_roles] ADD CONSTRAINT [DF_system_xwalk_user_roles_IsSuspended] DEFAULT (0) FOR [IsSuspended]GO

Search This Blog