Saturday, June 1, 2013

[SQL Server 2008 issues] calculate Excess/Short


calculate Excess/Short

Posted: 31 May 2013 05:56 PM PDT

I have a problem, kindly help me out. This is the type of data I want:

Date       | Time in | Time out | Spent time | Excess/Short
-----------|---------|----------|------------|-------------
2013-01-01 | 09:14   | 19:06    | 09:52      | 00:52
2013-01-02 | 09:52   | 18:36    | 08:44      | 00:16
2013-01-03 | 09:15   | 18:56    | 09:41      | 00:41

Excess/short comes from spent time: if an employee spent 9 hours 52 minutes, the excess is 00:52; if an employee spent 8 hours 44 minutes, the shortfall is 00:16. Our shift timing is 9 to 6 (nine hours).
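For anyone attempting this, a minimal T-SQL sketch could look like the following (dbo.Attendance, WorkDate, TimeIn and TimeOut are assumed names, with TimeIn/TimeOut as time(0) columns and a fixed 9-hour shift of 540 minutes):

```sql
-- Sketch only: table and column names are assumptions, not from the post.
-- Spent time and excess/short are computed in minutes, then formatted hh:mm.
SELECT  WorkDate,
        TimeIn,
        TimeOut,
        CONVERT(varchar(5), DATEADD(MINUTE, DATEDIFF(MINUTE, TimeIn, TimeOut), 0), 108) AS SpentTime,
        CASE WHEN DATEDIFF(MINUTE, TimeIn, TimeOut) >= 540
             THEN 'Excess' ELSE 'Short' END AS ExcessOrShort,
        CONVERT(varchar(5), DATEADD(MINUTE, ABS(DATEDIFF(MINUTE, TimeIn, TimeOut) - 540), 0), 108) AS Difference
FROM    dbo.Attendance;
```

Style 108 of CONVERT gives hh:mm:ss; truncating to varchar(5) keeps only hh:mm.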

How can we export Database Diagram into image/picture format?

Posted: 22 Apr 2010 12:07 AM PDT

I have created a database diagram successfully. Now I want to export the diagram to a JPEG file or any suitable format that allows users to view it correctly. Can anyone tell me how to do this? Thanks in advance, Vinayak.

Selecting a Value in a column based on value in another column.

Posted: 31 May 2013 04:33 PM PDT

Hi experts, I have 2 columns as below:

taxid   | tax_Combination
--------|----------------
Bed+Vat | Vat
Bed+Vat | Cess
Bed+Vat | BED
VAT     | Vat
CST     | CST

From a query I will get combinations like "Vat,Cess,BED". Now I need to write a query where I pass these combinations and get Taxid = 'Bed+Vat' alone. But when I use "IN" it returns 'Bed+Vat' and 'VAT' also. I need the taxid whose combination is exactly what I have given, so the result should be 'Bed+Vat' alone. Please help me. Thank you.
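One common approach to this kind of exact-match requirement is relational division: match the passed values, then keep only taxids whose total combination count equals the number of values passed. A sketch, assuming the two columns live in a hypothetical dbo.TaxCombinations table:

```sql
DECLARE @wanted TABLE (tax_Combination varchar(20));
INSERT INTO @wanted VALUES ('Vat'), ('Cess'), ('BED');

SELECT  t.taxid
FROM    dbo.TaxCombinations t          -- assumed table name
JOIN    @wanted w ON w.tax_Combination = t.tax_Combination
GROUP BY t.taxid
HAVING  COUNT(*) = (SELECT COUNT(*) FROM @wanted)   -- every wanted value matched
   AND  COUNT(*) = (SELECT COUNT(*)                 -- and the taxid has no extra values
                    FROM dbo.TaxCombinations x
                    WHERE x.taxid = t.taxid);
```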

literature recommendations please

Posted: 31 May 2013 09:27 AM PDT

Can anyone recommend a good, complete tutorial for a newbie on how to install SQL Server 2008 Express with Advanced Services, including the SQL Full-text Filter Daemon Launcher? If it will help, please see my thread "SQL Server can't connect to database" at http://www.sqlservercentral.com/Forums/Topic1455724-391-1.aspx

And while we're on the subject, can anyone recommend a good, complete textbook on SQL Server 2008? Thanks for any help anyone can give.

Patching runs DBCC checkdb

Posted: 31 May 2013 08:07 AM PDT

Hi Team, I have a cluster environment. I am going to apply the patches on the passive node first, reboot the passive node, and then fail over the resources from the active node to the passive node. When I do the failover, does SQL Server run DBCC CHECKDB on the passive node before bringing the databases online? Thanks, Karthik R

Case Statement in Where Clause

Posted: 31 May 2013 08:15 AM PDT

I have a table with column A. Column A has values 1 or 0. The values represent 2 different outcomes: 1 is cats and 0 is dogs. I want to be able to use the column in a WHERE clause as part of a stored procedure. The parameters are @cats and @dogs. The user can select either @cats or @dogs, or they can select both. If they don't make a selection, I want to return both:

If (@cats is 1 and @dogs is 0) then 'all cats'
If (@cats is 0 and @dogs is 1) then 'all dogs'
Else 'all cats and all dogs'
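A sketch of one way to express that catch-all filter without a CASE in the WHERE clause, assuming @cats/@dogs are bit parameters defaulting to 0 (meaning "no selection") and dbo.MyTable is a stand-in for the real table name:

```sql
CREATE PROCEDURE dbo.GetAnimals
    @cats bit = 0,
    @dogs bit = 0
AS
SELECT  *
FROM    dbo.MyTable
WHERE   (@cats = 1 AND A = 1)          -- all cats
     OR (@dogs = 1 AND A = 0)          -- all dogs
     OR (@cats = 0 AND @dogs = 0);     -- no selection: return both
```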

SSIS - What is the best practice for creating a fixed length flat file.

Posted: 29 May 2013 01:36 AM PDT

I am running into the problem of SSIS putting the CRLF in random places at the end of the row. If we assume I am going to start from scratch, can I fix this issue in the stored procedure I use in my OLE DB source, or can I fix it on the SSIS side? I did see the following on MSDN, but it did not make any sense to me since I don't see where to do this in the connection manager:

[i][b]To make this work so that each row is exactly the same length, you may need to add a zero length column as the last one in the connection manager, and set the row delimiter on that one.[/b][/i]

I did read about possibly using the advanced editor on the OLE DB adapter and adding a column there, but how do I create a zero-length column? If I send the header in the first row it is going to be at least 1 character. Any help would be greatly appreciated. FYI, I am using ragged right.

need help on sql script

Posted: 31 May 2013 06:03 AM PDT

Hi, I have written the script below, but when I execute it on SQL Server 2008 an error comes up. I'm having a hard time figuring it out. Any help would be very appreciated. Thank you. Here is the script:

[code="sql"]use master
go
create database SouthCommunityCollege
go
use SouthCommunityCollege
go
create table Semester
(semesterID int primary key, semesterName varchar (30))

create table Section
(sectionID int primary key, sectionName varchar (30))

create table Course
(courseID int primary key, sectionID int, semesterID int,
courseTitle varchar (30), courseCredits varchar (30),
foreign key (semesterID) references Semester (semesterID),
foreign key (sectionID) references Section (sectionID))

create table Student
(studentID int primary key, studentName varchar (30),
studentAddress varchar (50), studentPhone int,
studentEmail varchar (20))

create table StudentAccount
(studentID int, sectionID int,
courseID int, instructorID int, grades varchar (30),
constraint pk_StudentAccount primary key (studentID, sectionID),
foreign key (courseID) references Course (courseID),
foreign key (instructorID) references Instructor (instructorID))

create table Instructor
(instructorID int primary key, instructorName varchar (30),
instructorPhone int, instructorOffice varchar (30), instructorEmail varchar (20))

create table InstructorCourse
(instructorID int primary key, courseID int)[/code]

Here is the error:

Msg 1767, Level 16, State 0, Line 19
Foreign key 'FK__StudentAc__instr__117F9D94' references invalid table 'Instructor'.
Msg 1750, Level 16, State 0, Line 19
Could not create constraint. See previous errors.
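For reference, Msg 1767 means a foreign key references a table that does not exist yet: the script creates StudentAccount before Instructor. Reordering so that the referenced table is created first clears the error:

```sql
-- Create the referenced table first...
create table Instructor
(instructorID int primary key, instructorName varchar (30),
 instructorPhone int, instructorOffice varchar (30), instructorEmail varchar (20))

-- ...then the table whose foreign key references it.
create table StudentAccount
(studentID int, sectionID int,
 courseID int, instructorID int, grades varchar (30),
 constraint pk_StudentAccount primary key (studentID, sectionID),
 foreign key (courseID) references Course (courseID),
 foreign key (instructorID) references Instructor (instructorID))
```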

Error passing parameter to SSIS package from Stored Proc

Posted: 31 May 2013 02:33 AM PDT

I am getting the error "Argument ""\Package\DataFlowTask.Variables[User::SvcDate].Properties[Value];"" for option "set" is not valid." when I attempt to pass a parameter to a variable within an SSIS package. The variable SvcDate is declared as datatype string within the package, and the scope of the variable is defined at the Data Flow Task level. When I execute the package without the /SET parameter and set a default value for the variable within the package, it runs fine. I am new to SSIS and any help is much appreciated! Thanks, Brian

Here is my code:

[code="sql"]DECLARE @SQLQuery AS VARCHAR(2000), @SvcDate as varchar(10)
DECLARE @ServerName Varchar(200)

SET @SvcDate = '2120430'
SET @SQLQuery = 'DTExec "/DECRYPT TBRIAN2" /FILE ^"C:\MCG\SSIS\Medmas Import.dtsx^" /SET \Package\DataFlowTask.Variables[User::SvcDate].Properties[Value]; '''+@SvcDate+''' ';
EXEC master..xp_cmdshell @SQLQuery[/code]

Do you know of any Enterprise Database Inventory Management Systems?

Posted: 29 May 2013 05:07 AM PDT

Hello all, for the past half decade, I and several former and current colleagues have been custom-developing an enterprise solution for inventory management as it relates to the SQL Server world, pretty much as a side task. This includes, but is not limited to: servers, instances, databases, applications, Windows configurations, SQL configurations, database sizes, extended properties, SQL job history, and quite a bit more. It's pretty awesome and serves its purpose; however, we never seem to have enough time to dedicate to ongoing enhancements and reporting. As we all know, daily life and operational tasks always take priority over the bigger, fun stuff! :(

With the above said, I'm curious if anyone knows of, or can recommend, any vendor-based software out there that could possibly replace or enhance what we have. We have a massive SQL Server infrastructure, and there is such a wealth of data out there that we truly need a robust and scalable application with trending/predictive reporting. Would appreciate any feedback. Thanks

How can we insert a flag in a record based on a condition?

Posted: 31 May 2013 03:31 AM PDT

I want to insert a flag or change the color of a particular record based on the result of a condition. For example, if the condition is age more than 45, then all the records in the table where age is more than 45 should get some kind of flag, or their color should change. Please help. Thanks.
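Coloring rows is a presentation-layer concern (the report or application), but the flag itself can be computed in the query. A sketch, with dbo.People and Age as assumed names:

```sql
SELECT  *,
        CASE WHEN Age > 45 THEN 'FLAG' ELSE '' END AS AgeFlag  -- mark rows over 45
FROM    dbo.People;
```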

List of queries using MAXDOP

Posted: 29 May 2013 09:47 AM PDT

Gurus, currently in our environment we have configured our SQL Servers with a MAXDOP of 4. Here is our config: 4 CPUs, each 8 cores, 16 logical processors. Can you please help me determine whether it is possible to find which queries are utilizing parallelism (MAXDOP) for query processing? Thanks, Jagan K
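One commonly used heuristic against the plan-cache DMVs: a query whose accumulated CPU (worker) time exceeds its elapsed time must have run on multiple schedulers, i.e. in parallel. A sketch (not definitive, since it only sees what is still in the plan cache):

```sql
SELECT TOP (20)
       qs.total_worker_time,
       qs.total_elapsed_time,
       st.text
FROM   sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
WHERE  qs.total_worker_time > qs.total_elapsed_time   -- more CPU than wall clock implies parallelism
ORDER BY qs.total_worker_time DESC;
```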

Different actual execution plan for same query

Posted: 31 May 2013 01:35 AM PDT

I have a query that I execute via sp_executesql with a RECOMPILE hint. This query performs rather OK. Then I comment out the recompile hint and add some text making the query text unique, thus forcing recompilation of the plan. The actual execution plan then looks completely different, and the query is much slower. The number of estimated rows is very low, which might be a symptom of the root cause. The I/O statistics are also completely different.

The query runs under transaction isolation level snapshot. We have seen some high read/write stalls on the tempdb file, but I cannot imagine problems on tempdb causing different execution plans. Also, all the statistics on the database were updated 8 hours ago.

We see this phenomenon on and off: sometimes the query is fast and sometimes slow. Recompiling sometimes gives a boost, but not always. Parameter sniffing was a first candidate explanation, but with the latest test it seems unlikely. The query has a lot of parameters, which is due to the way our software handles queries. What influences the execution plan other than statistics? I have attached the fast and slow execution plans for completeness, but my first goal is to have some lead for follow-up research as we are

How to compare dates in 2 rows based on value in other column

Posted: 31 May 2013 12:37 AM PDT

I have a table with studentId, decision and FinishDate, where studentId is the key. For a student there could be a 'None' decision earlier and the accept/deny later, or vice versa. So the decision could be accept/deny, or 'None' when there is no decision yet. I have to select the row where there is a decision (that is, where decision is other than 'None') and the finish date is the most recent for a studentId. How do I do that? Thanks, blyzzard
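The usual pattern for "latest row per group" is ROW_NUMBER(). A sketch, assuming a hypothetical dbo.StudentDecision table with the columns described:

```sql
WITH Ranked AS (
    SELECT  studentId, decision, FinishDate,
            ROW_NUMBER() OVER (PARTITION BY studentId
                               ORDER BY FinishDate DESC) AS rn
    FROM    dbo.StudentDecision
    WHERE   decision <> 'None'          -- only rows with an actual decision
)
SELECT studentId, decision, FinishDate
FROM   Ranked
WHERE  rn = 1;                          -- most recent decision per student
```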

locked out of local SQL Server

Posted: 31 May 2013 01:02 AM PDT

Somehow my Active Directory account disappeared over the weekend. After getting the account back (same name), I could not log in to my local SQL Server (on my PC). It's 2008 R2, and I'm using XP. No one knows the sa password, but I normally log in using Windows Authentication. Do I need to reinstall, or is there another way to get into it?

How to allocate the received amount to different expenses

Posted: 27 May 2013 05:06 PM PDT

[code="sql"]Create Table #Temp
(
 Number Int,
 Totalcoll Numeric(12,2),
 Maintainance_Due Numeric(12,2),
 Maintainance_Coll Numeric(12,2),
 Expense_Due Numeric(12,2),
 Expense_Coll Numeric(12,2)
)

Insert Into #Temp
Values (1,0,500,0,400,0),
       (1,900,0,0,0,0),
       (1,0,200,0,800,0),
       (1,1000,0,0,0,0)

Select * From #Temp[/code]

With reference to the above data, my requirement is: in the 1st row there is Maintainance_Due = 500 and Expense_Due = 400, so in the second row I want Maintainance_Coll to be 500 and Expense_Coll to be 400 (the TotalColl of 900 is allocated to Maintainance_Coll since Maintainance_Due = 500 and Expense_Due = 400). Likewise, in the 3rd row there is Maintainance_Due = 200 and Expense_Due = 800, so in the fourth row I want Maintainance_Coll to be 200 and Expense_Coll to be 800 (the TotalColl of 1000 is allocated since Maintainance_Due = 200 and Expense_Due = 800). Please help with this requirement, as I need it urgently and I am stuck. Thanks in advance!
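Since this is SQL Server 2008 (no LAG/LEAD), one sketch is to number the rows and join each collection row to the preceding dues row. Note the sample data has no column that fixes row order, so a real solution needs an explicit ordering column (an identity, a date, etc.); ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) below is only a placeholder for that:

```sql
WITH Numbered AS (
    SELECT *, ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS rn   -- replace with a real ordering column
    FROM   #Temp
)
SELECT  c.Number,
        c.Totalcoll,
        d.Maintainance_Due AS Maintainance_Coll,   -- allocate dues from the preceding row
        d.Expense_Due      AS Expense_Coll
FROM    Numbered c
JOIN    Numbered d ON d.rn = c.rn - 1
WHERE   c.Totalcoll > 0;
```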

Quick Replication question

Posted: 31 May 2013 01:03 AM PDT

Hey guys, quick question: will running the distribution job for snapshot replication lock the Publisher tables?

Details: I have 2 SQL Server 2008 servers using snapshot replication. I set this up 2 days ago and it has failed the past 2 nights. Last night when it failed, it completed the snapshot, but the distribution job failed due to a sorted timestamp column in one of the tables. I did some research and found a workaround for that error that involves adding –UseInprocLoader at the end of the bcp command. So my question is: I know that I can't run the snapshot agent during peak hours because it puts a table lock on all the tables, but since the snapshot job completed, can I add that command and re-run the distribution agent during peak hours? I'm not worried about it locking at the Subscriber level, but I absolutely cannot run it if it will take locks at the Publisher level! Thank you in advance!

STUFF (concatenate rows into column)

Posted: 30 May 2013 09:07 PM PDT

Hey, I've used STUFF successfully before, but I can't get it quite right this time as I'm using 9 columns. Take a resultset of:

Col1 | Col2 | Col3 | Col4 | Col5 | Col6 | Col7 | Col8 | Col9
ABC  | DEF  | GHI  | JKL  | MNO  | PQR  | STU  | VWX  | YZ
ABC  | DEF  | GHI  | JKL  | 123  | PQR  | STU  | VWX  | YZ

So both rows are the same except Col5. What I want is this:

Col1 | Col2 | Col3 | Col4 | Col5     | Col6 | Col7 | Col8 | Col9
ABC  | DEF  | GHI  | JKL  | MNO; 123 | PQR  | STU  | VWX  | YZ

Obviously the STUFF needs to be ignored if there aren't multiple OWNERS. Col5 relates to the OWNER line (the N_O columns) here:

[code="sql"]SELECT DISTINCT
    C.IRN,
    CONVERT(DATE, DD.EVENTDUEDATE),
    ISNULL(DD_EC.EVENTDESCRIPTION, DD_EV.EVENTDESCRIPTION)
        + CASE WHEN DD.EVENTTEXT IS NOT NULL THEN ' (' + DD.EVENTTEXT + ')' ELSE '' END,
    ISNULL(dbo.fn_FormatName(N_I.NAME, N_I.FIRSTNAME, N_I.TITLE, NULL), 'UNKNOWN INSTRUCTOR'),
    ISNULL(dbo.fn_FormatName(N_O.NAME, N_O.FIRSTNAME, N_O.TITLE, NULL), 'UNKNOWN OWNER'),
    ISNULL(C.CURRENTOFFICIALNO, 'UNKNOWN'),
    DDI.IMPORTANCEDESC,
    ISNULL(N_E.SEARCHKEY2, '0'),
    ISNULL(N_S.SEARCHKEY2, '0')
FROM CASES C WITH (NOLOCK)
JOIN CASEEVENT DD WITH (NOLOCK) ON (DD.CASEID = C.CASEID)
JOIN OPENACTION DD_OA WITH (NOLOCK) ON (DD_OA.CASEID = C.CASEID)
JOIN EVENTCONTROL DD_EC WITH (NOLOCK) ON (DD_EC.CRITERIANO = DD_OA.CRITERIANO AND DD_EC.EVENTNO = DD.EVENTNO)
JOIN EVENTS DD_EV WITH (NOLOCK) ON (DD_EV.EVENTNO = DD.EVENTNO)
JOIN CASENAME CN_I WITH (NOLOCK) ON (CN_I.CASEID = C.CASEID AND CN_I.NAMETYPE = 'I')
JOIN NAME N_I WITH (NOLOCK) ON (N_I.NAMENO = CN_I.NAMENO)
LEFT JOIN CASENAME CN_O WITH (NOLOCK) ON (CN_O.CASEID = C.CASEID AND CN_O.NAMETYPE = 'O')
LEFT JOIN NAME N_O WITH (NOLOCK) ON (N_O.NAMENO = CN_O.NAMENO)
LEFT JOIN CASENAME CN_E WITH (NOLOCK) ON (CN_E.CASEID = C.CASEID AND CN_E.NAMETYPE = 'EMP')
LEFT JOIN NAME N_E WITH (NOLOCK) ON (N_E.NAMENO = CN_E.NAMENO)
LEFT JOIN CASENAME CN_S WITH (NOLOCK) ON (CN_S.CASEID = C.CASEID AND CN_S.NAMETYPE = 'SIG')
LEFT JOIN NAME N_S WITH (NOLOCK) ON (N_S.NAMENO = CN_S.NAMENO)
JOIN IMPORTANCE DDI ON DDI.IMPORTANCELEVEL = COALESCE(DD_EC.IMPORTANCELEVEL, DD_EV.IMPORTANCELEVEL, 9)
WHERE (DD_OA.ACTION = DD_EV.CONTROLLINGACTION
       OR (DD_EV.CONTROLLINGACTION IS NULL AND DD_EC.CRITERIANO = ISNULL(DD.CREATEDBYCRITERIA, DD_OA.CRITERIANO))
       OR DD_OA.CASEID IS NULL)
  AND ISNULL(DD.OCCURREDFLAG, 0) = 0
  AND DD.EVENTDUEDATE IS NOT NULL
  AND C.IRN = 'P42951EP'
  AND EXISTS (SELECT 1
              FROM OPENACTION OAX WITH (NOLOCK)
              JOIN EVENTCONTROL ECX WITH (NOLOCK) ON (ECX.CRITERIANO = OAX.CRITERIANO AND ECX.EVENTNO = DD.EVENTNO)
              JOIN ACTIONS AX WITH (NOLOCK) ON (AX.ACTION = OAX.ACTION)
              WHERE OAX.CASEID = C.CASEID
                AND OAX.ACTION = CASE WHEN (DD.EVENTNO = -11) THEN 'RN' ELSE ISNULL(DD_EV.CONTROLLINGACTION, OAX.ACTION) END
                AND OAX.POLICEEVENTS = 1
                AND OAX.ACTION <> '~2'
                AND OAX.CYCLE = CASE WHEN (AX.NUMCYCLESALLOWED > 1) THEN DD.CYCLE ELSE 1 END
                AND ISNULL(AX.ACTIONTYPEFLAG, 0) <> 1)
  AND (DD.EVENTDUEDATE < DATEADD(MONTH, 1, GETDATE()));[/code]
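For reference, the generic FOR XML PATH pattern for concatenating one column while grouping on the others looks like the sketch below (simplified to three columns against a hypothetical MyTable; in the real query all eight non-concatenated columns would go into the GROUP BY and the correlation):

```sql
SELECT  t1.Col1, t1.Col2,
        STUFF((SELECT '; ' + t2.Col5
               FROM   MyTable t2
               WHERE  t2.Col1 = t1.Col1
                 AND  t2.Col2 = t1.Col2
               FOR XML PATH('')), 1, 2, '') AS Col5   -- strip the leading '; '
FROM    MyTable t1
GROUP BY t1.Col1, t1.Col2;
```

When a group has only one row, the subquery returns a single value and no separator survives, so single-owner rows come through unchanged.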

Migrating An SSRS Database from one server to another, trying to remove jobs from old server

Posted: 31 May 2013 12:49 AM PDT

So our reporting database has been on a test server for way too long. I backed it up and restored it to a production server. I also backed up and restored the ReportServer and ReportServerTempDB databases, as I was trying to get all of my SSRS reports and subscriptions moved over as well.

Under SQL Server Agent, I have many jobs that are just named with a GUID, and then I have my simply named jobs (SSIS jobs), like Daily, Weekly, etc. It appears that the ones named with a GUID are mainly from the test server SSRS installation, and they appear to be running. I decided that I wanted to remove them all and add my subscriptions locally. In Management Studio, I right-clicked on the jobs and deleted them. This morning, all of the jobs are back! The GUIDs match the ones that exist on the test server.

Are there some simple steps I can follow to either remove these jobs, or is there an easy way to initialize the report server so that I can re-create the jobs and subscriptions I need? Please keep in mind that I'm actually a developer, not a DBA, even though I'm the closest thing to a DBA that we have! :-D

issues with setting up Publisher in SQL2008

Posted: 30 May 2013 11:51 PM PDT

I have a two-server configuration where the distribution DB is on one server and the publisher is on a second server. I have set up the distribution DB just fine (SQL 2008 R2 SP2), and when I configure the Publisher on SQL 2008 SP3 it works, but when the replication starts it gets a failed login. The login is right and the password is correct; I don't understand why it thinks the password is incorrect. In the publication I chose the option to run under the SQL Server Agent service account, so I didn't have to supply the password.

Message:
Replication-Replication Snapshot Subsystem: agent failed. The replication agent had encountered an exception.
Source: Replication
Exception Type: Microsoft.SqlServer.Replication.ConnectionFailureException
Exception Message: Login failed for user
Message Code: 18456
Replication-Replication Snapshot Subsystem: agent failed. Unable to start execution of step 2 (reason: Error authenticating proxy system error: Logon failure: unknown user name or bad password.). The step failed.

Can this be sorted without temporary tables

Posted: 30 May 2013 08:51 PM PDT

Two tables - one containing (team) Leaders and one containing (team) Members:

[code="sql"]CREATE TABLE #tblLeaders
(
 LeaderID int,
 Leader varchar(50)
);

INSERT INTO #tblLeaders (LeaderID, Leader)
SELECT 1, 'Pete' UNION ALL
SELECT 2, 'Jim' UNION ALL
SELECT 3, 'Arthur'

CREATE TABLE #tblMembers
(
 MemberID int,
 LeaderID int,
 Member varchar(50)
);

INSERT INTO #tblMembers (MemberID, LeaderID, Member)
SELECT 1, 1, 'Frank' UNION ALL
SELECT 2, 1, 'Martha' UNION ALL
SELECT 3, 2, 'Betty' UNION ALL
SELECT 4, 2, 'Mary' UNION ALL
SELECT 5, 3, 'Jerry' UNION ALL
SELECT 6, 3, 'Marcia'

SELECT * FROM #tblLeaders
SELECT * FROM #tblMembers

SELECT *
FROM #tblLeaders
INNER JOIN #tblMembers on #tblLeaders.LeaderID = #tblMembers.LeaderID
ORDER BY Leader, Member

DROP TABLE #tblLeaders
DROP TABLE #tblMembers[/code]

The code above gives me:

Leader | Member
Arthur - Jerry
Arthur - Marcia
Jim - Betty
Jim - Mary
Pete - Frank
Pete - Martha

...which is the Leaders in alphabetical order with their Members in alphabetical order. But what I need is data that comes back like this:

Arthur
Jerry
Marcia
Jim
Betty
Mary
Pete
Frank
Martha

...which is a row for each team leader, then their team members, still with the Team Leaders appearing alphabetically and with their Team Members appearing below them, sorted alphabetically. I can do this by putting the team leaders in a cursor, looping through it, putting them in a temp table, finding their team members and putting them in the temp table, etc. Is there a cursor-free way of doing this? Thanks for any help.
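A cursor-free sketch: UNION the leaders and members into one set with hidden sort keys, so each leader sorts ahead of their own members:

```sql
SELECT  Name
FROM (
    SELECT  l.Leader AS SortLeader, 0 AS SortType, '' AS SortName, l.Leader AS Name
    FROM    #tblLeaders l
    UNION ALL
    SELECT  l.Leader, 1, m.Member, m.Member
    FROM    #tblLeaders l
    JOIN    #tblMembers m ON m.LeaderID = l.LeaderID
) x
ORDER BY SortLeader, SortType, SortName;   -- leader row first, then members A-Z
```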

Linked Server error on Mirror DB

Posted: 30 May 2013 10:44 PM PDT

Hi all - we are executing the code below to check whether the instance is the principal, and then run some code if it is:

[code="sql"]IF exists(SELECT TOP 1 *
          FROM [BENTLEY].MASTER.sys.DATABASE_MIRRORING
          WHERE mirroring_role_desc = 'PRINCIPAL')
BEGIN
    insert into [sql1].[web].[dbo].test (id,object,compid,type,createdtime,updatedtime,requeststatus)
    SELECT * FROM [BENTLEY].[zane].[dbo].[test] WHERE requestStatus = 1
END
ELSE
begin
    insert into [sql1].[web].[dbo].test (id,object,compid,type,createdtime,updatedtime,requeststatus)
    SELECT * FROM [FERRARI].[zane].[dbo].[test] WHERE requestStatus = 1
end[/code]

However, we always get the error below even though BENTLEY is the principal. The code above looks to be trying to validate the FERRARI linked server before the code even runs. Any ideas how to get this to work? Thanks

OLE DB provider "SQLNCLI10" for linked server "FERRARI" returned message "Unspecified error".
Msg 954, Level 14, State 1, Line 1
The database "Zane" cannot be opened. It is acting as a mirror database.

Sql 2008 to 2008 R2 in-place upgrade failed and now have an "inactive" instance?

Posted: 31 May 2013 12:20 AM PDT

I attempted an upgrade yesterday which failed due to a Reporting Services error. I uninstalled Reporting Services, since it was not being used, and ran the R2 setup again, but it says no features are available to upgrade. The product discovery output is showing inactive instances. I am not sure how to uninstall this orphaned instance, as I don't want to mistakenly remove the working instance. Has anyone experienced this and knows how to properly get the upgrade running again?

Product | Instance | Instance ID | Feature
Sql Server 2008 | MSSQLSERVER | MSSQL10.MSSQLSERVER | Database Engine Services
Sql Server 2008 | MSSQLSERVER | MSSQL10.MSSQLSERVER | SQL Server Replication
Sql Server 2008 | MSSQLSERVER | MSSQL10.MSSQLSERVER | Full-Text Search
Sql Server 2008 | MSSQLSERVER | MSAS10.MSSQLSERVER | Analysis Services
Sql Server 2008 | MSSQLSERVER | MSRS10.MSSQLSERVER | Reporting Services
Sql Server 2008 R2 | MSSQLSERVER.INACTIVE | MSSQL10_50.MSSQLSERVER | Database Engine Services
Sql Server 2008 R2 | MSSQLSERVER.INACTIVE | MSSQL10_50.MSSQLSERVER | SQL Server Replication
Sql Server 2008 R2 | MSSQLSERVER.INACTIVE | MSSQL10_50.MSSQLSERVER | Full-Text Search
Sql Server 2008 R2 | MSSQLSERVER.INACTIVE | MSAS10_50.MSSQLSERVER | Analysis Services
Sql Server 2008 R2 | | | Management Tools - Basic
Sql Server 2008 R2 | | | Management Tools - Complete
Sql Server 2008 R2 | | | Client Tools Connectivity
Sql Server 2008 R2 | | | Client Tools Backwards Compatibility
Sql Server 2008 R2 | | | Client Tools SDK
Sql Server 2008 R2 | | | Integration Services

Programs installed:
Microsoft SQL Server 2005 Backward compatibility
Microsoft SQL Server 2008 (64-bit)
Microsoft SQL Server 2008 R2 (64-bit)
Microsoft SQL Server 2008 R2 Books Online
Microsoft SQL Server 2008 R2 Native Client
Microsoft SQL Server 2008 R2 Policies
Microsoft SQL Server 2008 R2 Setup (English)
Microsoft SQL Server 2008 Setup Support Files

SA account question

Posted: 30 May 2013 09:47 PM PDT

Hi, I have been tasked with 'the removal of the sa account in any environment'. I ran sp_Blitz and it came back with some helpful info. The results show a number of devs who currently have sysadmin-level privileges, and other logins such as 'replicationmaster', 'sqlserveragent' and 'mssqlserver' with sysadmin. If I disable the sa account, this will not affect those logins, will it? And what is the difference between the actual sa account and logins that are members of the sysadmin role, which grants the same rights as sa?

Refresh Test Environment

Posted: 30 May 2013 09:47 PM PDT

We have to refresh (restore) a number of databases at a time from our production system to our test environment, and I do not want to use either SSIS or a maintenance plan. I looked on the web for suitable scripts but could not find any. Any ideas? I don't want to write the restore script for every database, or have to change the backup file names manually. Regards, Lian
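One possible starting point is to generate the RESTORE commands from backup history. The sketch below runs against the production instance's msdb and emits one command per database from its most recent full backup; MOVE clauses and test-environment file paths would still need to be added:

```sql
SELECT 'RESTORE DATABASE [' + b.database_name + '] FROM DISK = N''' +
       mf.physical_device_name + ''' WITH REPLACE;' AS RestoreCmd
FROM   msdb.dbo.backupset b
JOIN   msdb.dbo.backupmediafamily mf ON mf.media_set_id = b.media_set_id
WHERE  b.type = 'D'                                   -- full backups only
  AND  b.backup_finish_date = (SELECT MAX(b2.backup_finish_date)
                               FROM   msdb.dbo.backupset b2
                               WHERE  b2.database_name = b.database_name
                                 AND  b2.type = 'D');
```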

Greater/Less than on text column

Posted: 29 May 2013 01:48 AM PDT

Hey, I have a column (INVOICENO) which is nvarchar(12). It's this datatype because an invoice could be a credit note (xxCN) or a debit note (xxDN). All other values are just integers, starting at 1. How would I go about showing only, say, invoices 10-20, including anything CN, DN, etc.?

10
11DN
12
13
14
15
16
17CN
18DN
19
20

It's not my application (3rd party), so I cannot change the datatype. Not hopeful, but thanks!
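A sketch of one way to do the range filter without changing the datatype: strip any trailing letters, then compare numerically. It assumes every INVOICENO starts with digits (a value with no leading digits would break the CONVERT), with dbo.Invoices as a stand-in table name:

```sql
SELECT INVOICENO
FROM   dbo.Invoices
WHERE  CONVERT(int,
           CASE WHEN PATINDEX('%[^0-9]%', INVOICENO) > 0
                THEN LEFT(INVOICENO, PATINDEX('%[^0-9]%', INVOICENO) - 1)  -- digits before the suffix
                ELSE INVOICENO
           END) BETWEEN 10 AND 20;
```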

Linked server on SQL server 2008R2 connecting to SQL 2005 stopped working

Posted: 29 May 2013 10:42 PM PDT

Hi, I have a linked server on SQL Server 2008 R2 connecting to a SQL Server 2005 instance that was working fine but has now stopped working. Both instances are clustered. A test of the linked server gives this reply:

TITLE: Microsoft SQL Server Management Studio
------------------------------
The test connection to the linked server failed.
------------------------------
ADDITIONAL INFORMATION:
An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo)
------------------------------
TCP Provider: An existing connection was forcibly closed by the remote host.
Client unable to establish connection because an error was encountered during handshakes before login. Common causes include client attempting to connect to an unsupported version of SQL Server, server too busy to accept new connections or a resource limitation (memory or maximum allowed connections) on the server.
OLE DB provider "SQLNCLI10" for linked server "Sxxxxx18\Sxxxxx18" returned message "Client unable to establish connection due to prelogin failure".
OLE DB provider "SQLNCLI10" for linked server "Sxxxxx18\Sxxxxx18" returned message "Client unable to establish connection".
(Microsoft SQL Server, Error: 10054)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=10.50.2500&EvtSrc=MSSQLServer&EvtID=10054&LinkId=20476

The linked server type is "SQL Server", and the SQL login for the linked server is verified OK (connected via SSMS). We have tried different scenarios to locate the problem but have not managed to, other than that the problem seems to be located at Sxxxxx18\Sxxxxx18.

These tests work fine:
- Connecting by SSMS from physical box Syyyyy32 (Windows 2008), (Sxxxxx32\Sxxxxx32, 10.50.2500) to Sxxxxx18\Sxxxxx18 works fine.
- Creating a linked server on Sxxxxx32\Sxxxxx32 to Sxxxxx17\Sxxxxx17 (9.0.5000) (physical node Syyyyy21 (Windows 2003), the same physical node as Sxxxxx18\Sxxxxx18) works fine.
- Creating a linked server on Sxxxxx18\Sxxxxx18 (9.0.5000) to Sxxxxx32\Sxxxxx32 (10.50.2500) works fine (the failing linked server, reversed).
- A linked server on Sxxxxx90\Sxxxxx90 (9.0.5057) to Sxxxxx18\Sxxxxx18 (9.0.5000) works fine.

These tests throw the same error:
- Trying to create a new linked server on Sxxxxx32\Sxxxxx32 (10.50.2500) to Sxxxxx18\Sxxxxx18 throws the same error message as the existing linked server when tested.
- Trying to create a linked server from another server with version 10.50.4000 (Sxxxxx23) to Sxxxxx18\Sxxxxx18 (9.0.5000) throws the same error.

The physical node Syyyyy21 (Sxxxxx18\Sxxxxx18) has about 18 GB of free memory; the Sxxxxx18\Sxxxxx18 sqlservr.exe is using approx 300 MB, and the memory setting allows up to 3 GB. CPU is at approx 20-30%. No newly applied patches that I can find. Anyone got a clue what it could be?

Friday, May 31, 2013

[how to] How can row estimates be improved in order to reduce chances of spills to tempdb


How can row estimates be improved in order to reduce chances of spills to tempdb

Posted: 31 May 2013 06:01 PM PDT

I notice that when there are spill-to-tempdb events (causing slow queries), the row estimates are often way off for a particular join. I've seen spill events occur with merge and hash joins, and they often increase the runtime 3x to 10x. This question concerns how to improve row estimates, under the assumption that doing so will reduce the chance of spill events.

The actual number of rows is 40k.

For this query, the plan shows a bad row estimate (11.3 rows):

    select Value
    from Oav.ValueArray
    where ObjectId = (select convert(bigint, Value) NodeId
                      from Oav.ValueArray
                      where PropertyId = 3331
                        and ObjectId = 3540233
                        and Sequence = 2)
      and PropertyId = 3330
    option (recompile);

For this query, the plan shows a good row estimate (56k rows):

    declare @a bigint = (select convert(bigint, Value) NodeId
                         from Oav.ValueArray
                         where PropertyId = 3331
                           and ObjectId = 3540233
                           and Sequence = 2);

    select Value
    from Oav.ValueArray
    where ObjectId = @a
      and PropertyId = 3330    -- Workpiece Side Active Objects
    option (recompile);

Can statistics or hints be added to improve the row estimates for the first case? I tried adding statistics with particular filter values (PropertyId = 3330), but I either could not get the combination correct, or perhaps it is being ignored because the ObjectId is unknown at compile time and the optimizer might be choosing an average over all ObjectIds.
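For anyone trying the same thing, a filtered statistic matching that shape would look like the sketch below (whether the optimizer can use it here is exactly the open question, since the filter predicate must be visible at compile time):

```sql
-- Sketch: statistics on ObjectId restricted to the busy property.
CREATE STATISTICS st_ValueArray_Prop3330
ON Oav.ValueArray (ObjectId)
WHERE PropertyId = 3330;
```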

Is there any mode where it would run the probe query first and then use the result to determine the row estimates, or must it fly blindly?

This particular property has many values (40k) on a few objects and zero on the vast majority. I would be happy with a hint where the maximum expected number of rows for a given join could be specified. This is a generally haunting problem, because some parameters may be determined dynamically as part of the join or would be better placed within a view (no support for variables).

Are there any parameters that can be adjusted to minimize the chance of spills to tempdb (e.g. min memory per query)? The ROBUST PLAN hint had no effect on the estimate.

How do I model the database for an application containing canvases which are divided into building blocks?

Posted: 31 May 2013 04:40 PM PDT

How should I model my database if I wanted to have this functionality:

I want to be able to create a canvas which has many building blocks (they are something like its sections). There are many types of canvases, and each of them has a different set of blocks in a different order. There are not too many types of canvases, and all of them are predefined by the application (users cannot edit the types; they are added to the database as seed data since they are predefined templates; see the example in the link given later). Users can add and remove canvases.

Each building block has its own title and description which are predefined, but they also have a specific value which user can input for each of the canvases that he created.

The use case is very similar to https://bmfiddle.com/f/#/ , except that there can only be one canvas of each type there, whereas in my case the user should be able to add more canvases of the same type. (But if you could show me how to model it for the use case at bmfiddle.com, that could be enough, as it's not too different.)

Thanks a lot for even reading this question, and it would be really helpful to see an ER diagram or something similar of the possible solution for this problem.
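A possible relational sketch (all names are assumptions, not taken from bmfiddle): canvas types and their block definitions are seed data; user-created canvases reference a type, and per-canvas block values reference both the canvas and the block definition:

```sql
CREATE TABLE CanvasType (
    CanvasTypeId int PRIMARY KEY,
    Name         varchar(100) NOT NULL
);

CREATE TABLE BlockDefinition (                  -- seed data: predefined blocks per type
    BlockDefId   int PRIMARY KEY,
    CanvasTypeId int NOT NULL REFERENCES CanvasType,
    Title        varchar(100) NOT NULL,
    Description  varchar(500) NOT NULL,
    SortOrder    int NOT NULL                   -- block order within the canvas type
);

CREATE TABLE Canvas (                           -- user-created; many per type allowed
    CanvasId     int PRIMARY KEY,
    CanvasTypeId int NOT NULL REFERENCES CanvasType,
    UserId       int NOT NULL
);

CREATE TABLE BlockValue (                       -- the user's input per block per canvas
    CanvasId     int NOT NULL REFERENCES Canvas,
    BlockDefId   int NOT NULL REFERENCES BlockDefinition,
    Value        varchar(max) NULL,
    PRIMARY KEY (CanvasId, BlockDefId)
);
```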

Help my database isn't performing fast enough! 100M Merge with 6M need < 1 hour!

Posted: 31 May 2013 04:47 PM PDT

I have a server right now receiving more raw data files in 1 hour than I can upsert (insert -> merge) in an hour.

I have a table with 100M (rounded up) rows. The table is currently MyISAM. It has 1000 columns, mostly boolean and a few varchar.

Currently, the fastest way I've found to get the information into my DB is:

1. Process raw data into CSV files.
2. LOAD DATA INFILE into the rawData table.
3. Insert the rawData table into Table1 (on duplicate key, run my function).
4. Truncate rawData.
5. Repeat.

This worked fine until I was merging 6M+ rows into 100M rows and expecting it to take under an hour.

I have 16 GB of RAM, so I set my key buffer to 6 GB, my query cache pool to 16 MB, and my query cache limit to 10 MB. I would just replace the information; however, it has to be an upsert: update the fields that are true if the row exists, and insert it if it does not.

Things im looking into atm; - Possibly switching server table to InnoDB? |-> Not sure about the performance, as the insert into an empty table is fine, its the merge that's slow.

Maybe allowing more table cache? Or even Query Cache? mysql sql mysqli innodb myisam

Merge Code:

b.3_InMarket = (b.3_InMarket OR r.3_InMarket),

To compare my 2 bool columns.
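The merge step described above can be sketched with MySQL's INSERT ... ON DUPLICATE KEY UPDATE (a hedged sketch; the table and column names here are simplified stand-ins, and columns whose names start with a digit, like 3_InMarket, would need backtick quoting):

```sql
-- VALUES(col) is the value that would have been inserted from rawData.
INSERT INTO Table1 (id, InMarket, Notes)
SELECT id, InMarket, Notes
FROM rawData
ON DUPLICATE KEY UPDATE
    InMarket = (Table1.InMarket OR VALUES(InMarket)),  -- keep the flag if either side is true
    Notes    = VALUES(Notes);
```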

Update

  • Set up RAID 0.
  • Changed my query to lock the tables for WRITE when inserting.
  • When importing the CSV, I disable keys and re-enable them before the upsert.
  • Changed concurrent_insert to 2.

Should numeric identifiers like serial numbers be stored as integers?

Posted: 31 May 2013 04:37 PM PDT

Many of our databases store numeric tracking codes (like serial numbers) as integers. There is little chance of these codes ever containing alphabetic characters, but an integer still seems like an incorrect data type, since you would never perform mathematical operations on them.

What is the appropriate data type for these kinds of identifiers?
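One common alternative, sketched here with hypothetical names, is a fixed-width character type with a digits-only CHECK constraint: it preserves leading zeros (which int silently drops) and prevents accidental arithmetic:

```sql
CREATE TABLE Device (
    DeviceID     int IDENTITY(1,1) PRIMARY KEY,
    SerialNumber char(10) NOT NULL UNIQUE,
    -- digits only; rejects alphabetic characters while keeping leading zeros
    CONSTRAINT CK_SerialNumber_Digits
        CHECK (SerialNumber NOT LIKE '%[^0-9]%')
);
```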

Security Concerns of SQL Server Express for Web e-Voting web application

Posted: 31 May 2013 03:47 PM PDT

I am building a database-driven web application. The web application is basically an e-voting website, where users can register and vote on things that concern their community, etc.

The database engine in use is SQL Server Express 2008. I know that my domain model and application code do not allow any silly security breaches, e.g. a user viewing the votes of other users.

But that aside, are there (free) things I can do to make my database secure, so that even if a hacker gets access to my DB, he can't do anything with it, or at least finds it very hard?

I realise this question is a little open-ended, but some simple tips would be greatly appreciated.

financial transactions with and without commissions

Posted: 31 May 2013 12:38 PM PDT

I'm building a gambling website. All financial transactions from and to external sources (credit cards, money transfer systems) are subject to commission. The amount of the commission depends on the source selected, and moreover it changes periodically. These external commissions go to the payment systems.

Also, some internal transactions are subject to commission. Internal commissions depend on the type of game the user is playing. These internal commissions go to the gambling website; they are its income.

I'm looking for the best way to store the history of transactions and commissions. Should they be stored in one TRANSACTIONS table (with a link between payment and commission), or should I have a separate COMMISSIONS table?

The purpose of storing them is to present the transaction history to the user and, of course, for internal bookkeeping (so that we always know how much we have paid to the payment systems and how much we have earned ourselves).

Any advice greatly appreciated.
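A hedged sketch of the separate-table option (all names hypothetical): commissions live in their own table and reference the transaction they were charged on, so a transaction can carry zero, one, or several commission rows, and a beneficiary column distinguishes external (payment system) from internal (site income) commissions:

```sql
CREATE TABLE transactions (
    transaction_id bigint        PRIMARY KEY,
    user_id        bigint        NOT NULL,
    amount         decimal(12,2) NOT NULL,
    direction      char(3)       NOT NULL,  -- 'IN' or 'OUT'
    source         varchar(30)   NOT NULL,  -- payment system, or game type for inner transfers
    created_at     datetime      NOT NULL
);

CREATE TABLE commissions (
    commission_id  bigint        PRIMARY KEY,
    transaction_id bigint        NOT NULL REFERENCES transactions(transaction_id),
    amount         decimal(12,2) NOT NULL,
    -- who receives it: the payment system or the gambling site
    beneficiary    varchar(10)   NOT NULL CHECK (beneficiary IN ('external', 'internal'))
);
```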

Does a SQL Server Job Run on the Server or the Local Machine?

Posted: 31 May 2013 01:05 PM PDT

Does a SQL Server Job always run on the server or will it run in the context of the local machine, similar to a DTS package run from Enterprise Manager on the user's machine?
The job in question calls a DTS package. The package succeeds when run locally; the job fails whether run manually or scheduled.

MSSQL compound query [migrated]

Posted: 31 May 2013 12:28 PM PDT

Hello and thanks for looking.

I wrote a simple application for my softball team, and I am having some issues with a query I need to rewrite.

SELECT DISTINCT _StartDate, _RescheduleDate, _GameTime, _FieldNumber, _Notes
FROM _Schedule
WHERE (_StartDate >= DATEADD(wk, 0, { fn NOW() }))
  AND (_StartDate < DATEADD(wk, 1, { fn NOW() }))

This query simply gets the upcoming week's game schedule. What I need to do, and had forgotten, is to also check _RescheduleDate; we currently have 1 or 2 games that are rescheduled. So somehow I need to modify this to check whether the reschedule date holds a valid date and not "TBD" or NULL.

The second problem is that our games are on Fridays, and on the day of the game this changes the data on the website to next week's games. I need it not to change until the day after the game, which is Saturday. I tried adding 1 day:

(wk, 1 +1d {fn NOW() }))   

But obviously this did not work. So I would surely appreciate some help with that.
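A hedged sketch of one way to handle both problems (column names as in the question; it assumes _RescheduleDate is a varchar whose valid values convert cleanly to datetime): shift the week window back one day so it rolls over on Saturday, and use the reschedule date only when it parses as a real date:

```sql
DECLARE @anchor datetime = DATEADD(dd, -1, GETDATE());  -- window rolls over Saturday, not Friday

SELECT DISTINCT _StartDate, _RescheduleDate, _GameTime, _FieldNumber, _Notes
FROM _Schedule
WHERE COALESCE(CASE WHEN ISDATE(_RescheduleDate) = 1
                    THEN CONVERT(datetime, _RescheduleDate) END,
               _StartDate) >= DATEADD(wk, 0, @anchor)
  AND COALESCE(CASE WHEN ISDATE(_RescheduleDate) = 1
                    THEN CONVERT(datetime, _RescheduleDate) END,
               _StartDate) <  DATEADD(wk, 1, @anchor);
```

ISDATE returns 0 for both NULL and 'TBD', so those rows fall back to _StartDate.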

Are there design patterns/best practices in writing dynamic SQL in stored procedures?

Posted: 31 May 2013 12:15 PM PDT

I'm an experienced programmer, but I'm new to SQL/databases, so please bear with me. :)

One of my current tasks involves editing (and refactoring, if needed) a dynamically generated SQL statement in a stored procedure. Are there any "best practices" or "design patterns" I can follow for writing stored procedures in general, or better yet, stored procedures that generate long (300-line) dynamic SQL statements?
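One widely recommended pattern, sketched here with hypothetical procedure and column names: build the statement conditionally, but pass all values through sp_executesql parameters instead of concatenating them, which avoids SQL injection and lets plans be reused:

```sql
CREATE PROCEDURE dbo.SearchOrders
    @CustomerID int = NULL,
    @Status     varchar(20) = NULL
AS
BEGIN
    DECLARE @sql nvarchar(max) = N'SELECT OrderID, Status FROM dbo.Orders WHERE 1 = 1';

    -- append predicates conditionally; the values themselves stay as parameters
    IF @CustomerID IS NOT NULL SET @sql += N' AND CustomerID = @CustomerID';
    IF @Status     IS NOT NULL SET @sql += N' AND Status = @Status';

    EXEC sp_executesql @sql,
         N'@CustomerID int, @Status varchar(20)',
         @CustomerID = @CustomerID, @Status = @Status;
END;
```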

Does Change Data Capture (CDC) work in a SQL Server 2012 AlwaysOn Failover Cluster Instances setup?

Posted: 31 May 2013 04:00 PM PDT

Does Change Data Capture (CDC) work in a SQL Server 2012 AlwaysOn Failover Cluster Instances setup?

There would be two nodes in the AlwaysOn Failover Cluster Instances setup and two Domain Controllers.

We are not using AlwaysOn Availability Groups.

Will CDC work? And will it fail over?

Creating Superset and Subset table with relationship mapping

Posted: 31 May 2013 09:52 AM PDT

I have a requirement in one of my projects where different entities (Teacher, School, Student) store addresses in a single table and maintain their relationships.

Example:

- A school can add a teacher and store the teacher's address information.

- A teacher stores their other address information.

- A teacher can add a student's address information.

- A student can add their other address information.

- A teacher can mark their address information as belonging to the school entity.

Addresses created by one entity for another are visible to the creator and the owner himself. For example, a teacher's address entered by a school belongs to that school; other schools cannot view it.

The teacher can view it but cannot edit it. If a teacher creates an address that is added for a school entity, it is a private copy of the teacher's; the school can view it but not edit it.

The problem is how to map these tables.

Students: ID

Teachers: ID

Schools: ID

Addresses: ID | StartDate | EndDate

AnotherTable: ID

Other fields that hold the relationships between the entities (e.g. who added it and for whom) (private and public relationships).
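A hedged sketch of one common mapping (all names hypothetical): keep Addresses entity-agnostic, and record who created each address and whose address it is in a link table, with a flag controlling edit rights:

```sql
CREATE TABLE EntityAddress (
    EntityAddressID int IDENTITY(1,1) PRIMARY KEY,
    AddressID       int NOT NULL REFERENCES Addresses(ID),
    OwnerType       varchar(10) NOT NULL,  -- 'School' | 'Teacher' | 'Student': who created it
    OwnerID         int NOT NULL,
    SubjectType     varchar(10) NOT NULL,  -- whose address it is
    SubjectID       int NOT NULL,
    IsPrivate       bit NOT NULL DEFAULT 1 -- creator may edit; the subject may only view
);
```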

Rackspace Cloud Databases: incremental backup

Posted: 31 May 2013 12:00 PM PDT

Rackspace Cloud Databases are a little special in that we do not have access to the binlog file; they are not servers in the traditional sense. I do have a couple of Cloud Servers with them, but I want to do incremental backups on my own, and am currently looking into ways to do this specifically for Cloud Databases, so far without success.

What options do we have if we want to do incremental backups on Cloud Databases?

Note: I do not want to start a discussion about which method is best, I just want to know which methods are available for this scenario.

software to automatically create document specific glossary? [closed]

Posted: 31 May 2013 09:27 AM PDT

I'm not sure if this is the right place to ask, but I couldn't find a more appropriate one. My team needs to create documentation for the software we produce, and the application area has a LOT of acronyms/terms. We'd like to maintain a single master glossary of terms/acronyms and their definitions, and we'd like a software tool that searches for these terms in new Microsoft Word docs (PDFs too would be a bonus) and, for the terms/acronyms it finds, copies them out of the master glossary for inclusion as a document-specific glossary. Do any of you know of a software tool that can do this?

Thanks for your time.

USER_ID field in alert logs (also in V$DIAG_ALERT_EXT view)

Posted: 31 May 2013 10:02 AM PDT

Does anyone know what triggers the USER_ID field in log.xml to be populated? The value also exists in the V$DIAG_ALERT_EXT view.

I've found by observing the logs that if a temp tablespace fills up, Oracle logs the USER_ID of the problematic SQL statement causing the issue. Other than that, the value appears to always be NULL.

Parent-Child Tree Hierarchical ORDER

Posted: 31 May 2013 10:00 AM PDT

I have the following data in SQL Server 2008 R2 (SQLFiddle):

Schema:

CREATE TABLE [dbo].[ICFilters](
    [ICFilterID] [int] IDENTITY(1,1) NOT NULL,
    [ParentID] [int] NOT NULL DEFAULT 0,
    [FilterDesc] [varchar](50) NOT NULL,
    [Active] [tinyint] NOT NULL DEFAULT 1,
    CONSTRAINT [PK_ICFilters] PRIMARY KEY CLUSTERED ([ICFilterID] ASC)
    WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
          ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

INSERT INTO [dbo].[ICFilters]
SELECT 0,'Product Type',1 UNION ALL
SELECT 1,'ProdSubType_1',1 UNION ALL
SELECT 1,'ProdSubType_2',1 UNION ALL
SELECT 1,'ProdSubType_3',1 UNION ALL
SELECT 1,'ProdSubType_4',1 UNION ALL
SELECT 2,'PST_1.1',1 UNION ALL
SELECT 2,'PST_1.2',1 UNION ALL
SELECT 2,'PST_1.3',1 UNION ALL
SELECT 2,'PST_1.4',1 UNION ALL
SELECT 2,'PST_1.5',1 UNION ALL
SELECT 2,'PST_1.6',1 UNION ALL
SELECT 2,'PST_1.7',0 UNION ALL
SELECT 3,'PST_2.1',1 UNION ALL
SELECT 3,'PST_2.2',0 UNION ALL
SELECT 3,'PST_2.3',1 UNION ALL
SELECT 3,'PST_2.4',1 UNION ALL
SELECT 14,'PST_2.2.1',1 UNION ALL
SELECT 14,'PST_2.2.2',1 UNION ALL
SELECT 14,'PST_2.2.3',1 UNION ALL
SELECT 3,'PST_2.8',1

Table:

| ICFILTERID | PARENTID |    FILTERDESC | ACTIVE |
--------------------------------------------------
|          1 |        0 |  Product Type |      1 |
|          2 |        1 | ProdSubType_1 |      1 |
|          3 |        1 | ProdSubType_2 |      1 |
|          4 |        1 | ProdSubType_3 |      1 |
|          5 |        1 | ProdSubType_4 |      1 |
|          6 |        2 |       PST_1.1 |      1 |
|          7 |        2 |       PST_1.2 |      1 |
|          8 |        2 |       PST_1.3 |      1 |
|          9 |        2 |       PST_1.4 |      1 |
|         10 |        2 |       PST_1.5 |      1 |
|         11 |        2 |       PST_1.6 |      1 |
|         12 |        2 |       PST_1.7 |      0 |
|         13 |        3 |       PST_2.1 |      1 |
|         14 |        3 |       PST_2.2 |      0 |
|         15 |        3 |       PST_2.3 |      1 |
|         16 |        3 |       PST_2.4 |      1 |
|         17 |       14 |     PST_2.2.1 |      1 |
|         18 |       14 |     PST_2.2.2 |      1 |
|         19 |       14 |     PST_2.2.3 |      1 |
|         20 |        3 |       PST_2.8 |      1 |

Every row has the ID of its parent, and the root's ParentID = 0. The FilterDescs are just sample descriptions, so I can't parse those for ordering.

The Question

Is it possible to select all the rows in a tree-like manner? If so, how? By "tree-like", I mean recursively select the parent followed by all of its children, then all the children of each of those, and so on. My friends and I have tried, but we have fallen short of a working solution and will keep trying. I am fairly new to SQL, so maybe this can be done easily and I'm just making things harder than necessary.

Example(desired) output:

| ICFILTERID | PARENTID |    FILTERDESC | ACTIVE |
--------------------------------------------------
|          1 |        0 |  Product Type |      1 |
|          2 |        1 | ProdSubType_1 |      1 |
|          6 |        2 |       PST_1.1 |      1 |
|          7 |        2 |       PST_1.2 |      1 |
|          8 |        2 |       PST_1.3 |      1 |
|          9 |        2 |       PST_1.4 |      1 |
|         10 |        2 |       PST_1.5 |      1 |
|         11 |        2 |       PST_1.6 |      1 |
|         12 |        2 |       PST_1.7 |      0 |
|          3 |        1 | ProdSubType_2 |      1 |
|         13 |        3 |       PST_2.1 |      1 |
|         14 |        3 |       PST_2.2 |      0 |
|         17 |       14 |     PST_2.2.1 |      1 |
|         18 |       14 |     PST_2.2.2 |      1 |
|         19 |       14 |     PST_2.2.3 |      1 |
|         15 |        3 |       PST_2.3 |      1 |
|         16 |        3 |       PST_2.4 |      1 |
|         20 |        3 |       PST_2.8 |      1 |
|          4 |        1 | ProdSubType_3 |      1 |
|          5 |        1 | ProdSubType_4 |      1 |
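This ordering can be produced with a recursive CTE that builds a sortable path of zero-padded IDs (a hedged sketch; it assumes the tree depth and ID range fit in the chosen varchar width):

```sql
;WITH Tree AS (
    SELECT ICFilterID, ParentID, FilterDesc, Active,
           CAST(RIGHT('0000' + CAST(ICFilterID AS varchar(4)), 4) AS varchar(100)) AS SortPath
    FROM dbo.ICFilters
    WHERE ParentID = 0
    UNION ALL
    SELECT c.ICFilterID, c.ParentID, c.FilterDesc, c.Active,
           CAST(t.SortPath + '.'
                + RIGHT('0000' + CAST(c.ICFilterID AS varchar(4)), 4) AS varchar(100))
    FROM dbo.ICFilters AS c
    JOIN Tree AS t ON c.ParentID = t.ICFilterID
)
SELECT ICFilterID, ParentID, FilterDesc, Active
FROM Tree
ORDER BY SortPath;  -- depth-first: each parent immediately followed by its subtree
```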

How to see what is cached in memory in SQL server 2008?

Posted: 31 May 2013 09:59 AM PDT

Is there a way to find out what is cached in SQL Server 2008 R2? I have found the following nice article: http://blog.sqlauthority.com/2010/06/17/sql-server-data-pages-in-buffer-pool-data-stored-in-memory-cache . However, I would like to know how much data (e.g. as a percentage and in KB) of each table and index is cached. Is there a simple way to obtain such data?
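The per-object breakdown usually comes from sys.dm_os_buffer_descriptors; a hedged sketch for the current database (each buffer page is 8 KB, so cached KB is the page count times 8):

```sql
SELECT OBJECT_NAME(p.object_id) AS table_name,
       i.name                   AS index_name,
       COUNT(*) * 8             AS cached_kb
FROM sys.dm_os_buffer_descriptors AS bd
JOIN sys.allocation_units AS au
     ON bd.allocation_unit_id = au.allocation_unit_id
JOIN sys.partitions AS p
     ON au.container_id = CASE WHEN au.type IN (1, 3)   -- in-row / row-overflow data
                               THEN p.hobt_id
                               ELSE p.partition_id END  -- LOB data
LEFT JOIN sys.indexes AS i
     ON p.object_id = i.object_id AND p.index_id = i.index_id
WHERE bd.database_id = DB_ID()
GROUP BY p.object_id, i.name
ORDER BY cached_kb DESC;
```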

How to solve "The table ... is full" with "innodb_file_per_table"?

Posted: 31 May 2013 10:20 AM PDT

I have a MySQL database that holds a large amount of data (100-200 GB; a bunch of scientific measurements). The vast majority of the data is stored in one table, Sample. Now I'm creating a slave replica of the database, and I wanted to take advantage of innodb_file_per_table during the process. So I set innodb_file_per_table in my slave configuration and imported the dump of the database. To my surprise, it failed with:

ERROR 1114 (HY000) at line 5602: The table 'Sample' is full

The file Sample.ibd is currently about 93 GB, with more than 600 GB of free space available on the partition, so it's not a disk free-space issue. Nor does it seem to be hitting any kind of file-system limit (I'm using ext4).

I'd be grateful for any ideas what could be the cause, or what to investigate.


Update: I'm using mysql Ver 14.14 Distrib 5.1.66, for debian-linux-gnu (x86_64).

My configuration is:

# This will be passed to all mysql clients
# It has been reported that passwords should be enclosed with ticks/quotes
# especially if they contain "#" chars...
# Remember to edit /etc/mysql/debian.cnf when changing the socket location.
[client]
port            = 3306
socket          = /var/run/mysqld/mysqld.sock

# Here is entries for some specific programs
# The following values assume you have at least 32M ram

# This was formally known as [safe_mysqld]. Both versions are currently parsed.
[mysqld_safe]
socket          = /var/run/mysqld/mysqld.sock
nice            = 0

[mysqld]
#
# * Basic Settings
#
user            = mysql
pid-file        = /var/run/mysqld/mysqld.pid
socket          = /var/run/mysqld/mysqld.sock
port            = 3306
basedir         = /usr
datadir         = /home/var/lib/mysql
tmpdir          = /tmp
language        = /usr/share/mysql/english
skip-external-locking
#
# Instead of skip-networking the default is now to listen only on
# localhost which is more compatible and is not less secure.
bind-address            = 127.0.0.1
#
# * Fine Tuning
#
key_buffer              = 16M
max_allowed_packet      = 16M
thread_stack            = 192K
thread_cache_size       = 8
# This replaces the startup script and checks MyISAM tables if needed
# the first time they are touched
myisam-recover         = BACKUP
#max_connections        = 100
#table_cache            = 64
#thread_concurrency     = 10
#
# * Query Cache Configuration
#
query_cache_limit       = 1M
query_cache_size        = 16M
#
# * Logging and Replication
#
# Both location gets rotated by the cronjob.
# Be aware that this log type is a performance killer.
# As of 5.1 you can enable the log at runtime!
#general_log_file        = /var/log/mysql/mysql.log
#general_log             = 1

innodb_file_per_table

Update: Calling SELECT @@datadir; returns /home/var/lib/mysql/. Then df -h /home/var/lib/mysql/ gives 768G 31G 699G 5% /home.

Calling SHOW VARIABLES LIKE '%innodb_data_file_path%' prints | innodb_data_file_path | ibdata1:10M:autoextend |.

How can I dynamically back up all SSAS databases on a given instance?

Posted: 31 May 2013 01:11 PM PDT

I want to dynamically back up all the databases on a given SSAS instance using a SQL Agent job (which would most likely involve executing an SSIS package). It is imperative that this is a dynamic process - if users add databases or cubes, I want to set up a job one time that can automatically detect all existing SSAS metadata.

Unfortunately, I don't see anything out there that tells me how I can automatically and dynamically back up all of the databases on an SSAS instance in a clean way. By "clean", I mean:

Connection pools being reset with Error: 18056, Severity: 20, State: 46. & Perfmon Counters not showing

Posted: 31 May 2013 06:25 PM PDT

We are using SQL authentication and .NET 4.0 connection strings to connect to an Enterprise Edition 2012 SP1 SQL Server on a Windows 2008 R2 Enterprise server. We use about 50 servers split into 8 different groups serving different parts of a website.

Our website is using this SQL Server to log visit tracking data, and over the last few days it has spat out the following messages about resetting connection pools.

The client was unable to reuse a session with SPID 1327, which had been reset for connection pooling. The failure ID is 46. This error may have been caused by an earlier operation failing. Check the error logs for failed operations immediately before this error message.

Errorlog reads

Error: 18056, Severity: 20, State: 46.

The client was unable to reuse a session with SPID 959, which had been reset for connection pooling. The failure ID is 46. This error may have been caused by an earlier operation failing. Check the error logs for failed operations immediately before this error message.

Login failed for user 'xxxx'. Reason: Failed to open the database 'xxxxxxxx' configured in the login object while revalidating the login on the connection. [CLIENT: 10.xx.xx.xxx]

After some digging I found this document from the CSS blog:

http://blogs.msdn.com/b/psssql/archive/2010/08/03/how-it-works-error-18056-the-client-was-unable-to-reuse-a-session-with-spid-which-had-been-reset-for-connection-pooling.aspx

and this one by Aaron Bertrand (I know the error number is different, but the failure ID is the same, and a number of the messages are identical):

http://sqlblog.com/blogs/aaron_bertrand/archive/2011/01/14/sql-server-v-next-denali-additional-states-for-error-18456.aspx

Failure ID 46 suggests that the login did not have permissions. Our logins default to the master database, and the DB name is specified in the connection string.

I wanted to check the number of connection pools, etc., so I checked all the counters in Perfmon for ".NET Data Provider for SqlServer". It only gave me the option of defaultdomain9675 for the instance, so I selected that, assuming it is a system-generated ID name for our datacentre network. Unfortunately, all of the counters read zero. On one of our other main servers the connection pools hover around 10, which is what I expected to see on a healthy server with that kind of load.

My question is threefold:

1. Can anyone suggest why the Windows 2008 R2 server is not showing ".NET Data Provider for SqlServer" counters?

2. Has anyone experienced this? I believe that the login not having permissions is a red herring.

3. If different groups of web servers have the same connection string syntax but with slightly different whitespace, would this cause the server to use another connection pool?

Bitmask Flags with Lookup Tables Clarification

Posted: 31 May 2013 07:17 PM PDT

I've received a dataset from an outside source which contains several bitmask fields as varchars. They range in length from 3 to 21 characters. I need to be able to run SELECT queries against these fields using AND or OR logic.

Using a calculated field where I convert the bits into an integer value, I can easily find rows that match an AND query with a simple WHERE rowvalue = requestvalue, but the OR logic would require using bitwise & to find matching records.

Given that I would need to work with several of these columns and select from hundreds of millions of records, I feel there would be a huge performance hit from doing bitwise & operations to filter my SELECT results.

I came across this answer from searching and it looked like it may fit my needs, but I need some clarification on how it is implemented.

Is this as simple as creating a lookup table that has all possible search conditions?

Example for 3 bits using (a & b) (Edit: Wrong bitwise op)

001,001
001,011
001,101
001,111
010,010
010,011
010,110
011,011
011,111
etc

The author mentions that it's counter-intuitive initially, but I can't help but feel I'm interpreting the solution incorrectly, as this would give me a single lookup table with likely billions of rows.

Any clarifications on the answer I linked above or other suggestions that would preserve the existing database are appreciated.

Edit: A more concrete example using small data.

Four flags: HasHouse, HasCar, HasCat, HasDog; 0000 is has none, 1111 is has all.

Any number of flags, from none to all, can be set, and results must be filtered where the selection matches all (using exact value comparison) or at least one (using bitwise &).

Adding a single calculated column for each bitmask is OK, but adding a column for each bit for more than 100 bits, coupled with working out how to insert/update the data, is why I'm trying to find alternative solutions.
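For reference, the two filters on a single computed integer column look like this (a hedged sketch with hypothetical flag positions: HasHouse = 8, HasCar = 4, HasCat = 2, HasDog = 1). The caveat behind the performance concern is that neither predicate is sargable, so an index on FlagsValue cannot be seeked:

```sql
DECLARE @mask int = 10;  -- HasHouse + HasCat (binary 1010)

-- matches ALL selected flags: every bit of @mask is set on the row
SELECT * FROM People WHERE FlagsValue & @mask = @mask;

-- matches AT LEAST ONE selected flag
SELECT * FROM People WHERE FlagsValue & @mask > 0;
```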

SQL Server 2012 catalog.executions to sysjobhistory - any way to join them?

Posted: 31 May 2013 01:16 PM PDT

I have exhausted my resources and can't find a foolproof way to join the ssisdb.catalog tables to the jobs that run them. I'm trying to write some custom sprocs to monitor my execution times and rows written from the catalog tables, and it would be greatly beneficial to be able to tie them to the calling job.

BIT columns all "1" after a phpMyAdmin export/import

Posted: 31 May 2013 03:16 PM PDT

I have to import data from a MySQL database using phpMyAdmin because that's the only interface my hosting provider supports.

I have exported my database from my local machine using phpMyAdmin. After that I imported the script file on my host. All of the data in columns of type BIT changed to '1'. The hosting database version is 5.5.29.

Is it a phpMyAdmin problem or a MySQL version problem? How can I fix it?

SQLite writing a query where you select only rows nearest to the hour

Posted: 31 May 2013 03:52 PM PDT

I've got a set of data where readings have been taken approximately every minute for about three months, and the time has been stored as a Unix timestamp. There is no regularity to the timestamps (i.e. the zero minute of the hour may not contain a reading; one measurement could be at 00:59:55 and the next at 01:01:01), and days may be missing.

What I need is the row nearest to each hour, with the timestamp rounded to the hour, as long as the nearest value is not more than 30 minutes away from the hour.

Where a matching hour could not be found it would be helpful if the query could include a time but no value.

I realise I'm asking a lot, but this would be incredibly helpful. Thanks for taking the time to read this. James

BTW, the table is just PK (autoincrement), timestamp, value, sensor id (FK). I've tried this to get the data out:

SELECT strftime('%S', time, 'unixepoch'),
       strftime('%M', time, 'unixepoch'),
       strftime('%H', time, 'unixepoch'),
       strftime('%d', time, 'unixepoch'),
       strftime('%m', time, 'unixepoch'),
       strftime('%Y', time, 'unixepoch'),
       value
FROM Timestream
WHERE idSensor = 359;
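A hedged sketch of the nearest-to-the-hour selection (table and column names as in the question; it relies on SQLite 3.7.11+ behaviour where, with a single MIN() aggregate, the bare columns are taken from the same row as the minimum). Hours with no reading within 30 minutes will simply be absent; emitting a time with no value would additionally require generating the full series of hours:

```sql
SELECT ((time + 1800) / 3600) * 3600                  AS hour_ts,  -- nearest hour, unix seconds
       time,
       value,
       MIN(ABS(time - ((time + 1800) / 3600) * 3600)) AS dist
FROM Timestream
WHERE idSensor = 359
  AND ABS(time - ((time + 1800) / 3600) * 3600) <= 1800  -- within 30 minutes of the hour
GROUP BY hour_ts;
```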

Breaking Semisynchronous Replication in MySQL 5.5

Posted: 31 May 2013 07:12 PM PDT

I've set up Semisynchronous Replication between two MySQL 5.5 servers running on Windows 7.

My application is running and updating the database on the master server, and the same updates are applied to the slave database server.

But for some unknown reason, replication sometimes breaks.

On running the command:

SHOW STATUS LIKE 'Rpl_semi_sync%';  

It gives this status:

'Rpl_semi_sync_master_no_times', '0'
'Rpl_semi_sync_master_no_tx', '0'
'Rpl_semi_sync_master_status', 'ON'     <<-------------
'Rpl_semi_sync_master_timefunc_failures', '0'
'Rpl_semi_sync_master_tx_avg_wait_time', '338846'
'Rpl_semi_sync_master_tx_wait_time', '29479685'
'Rpl_semi_sync_master_tx_waits', '87'
'Rpl_semi_sync_master_wait_pos_backtraverse', '0'
'Rpl_semi_sync_master_wait_sessions', '0'
'Rpl_semi_sync_master_yes_tx', '3106'

Ideally, in semi-synchronous replication, when the sync breaks the status should change to OFF, since the master is not able to receive any acknowledgement from the slave. Please help us in this regard.

Unable to start Oracle Database Server : Get an error as I try to start it

Posted: 31 May 2013 09:16 AM PDT

I just installed Oracle Database Express Edition 11g Release 2 for Windows. It created a shortcut icon on the desktop:

[screenshot: desktop shortcut icon]

But as I click this icon I see this dialog box:

[screenshot: error dialog]

What is it? How do I start my Oracle Database server?

Postgres: Can the archive command for master and standby point to the same directory?

Posted: 31 May 2013 02:15 PM PDT

I use streaming replication and PITR (WAL archiving) in my cluster. Currently I have different versions of postgresql.conf for the master and the standby server.

The only difference between the files is the archive command, which points to a different directory.

for example, on master I have:

archive_command = 'cp %p /dbckp/server1master/%f'  

And on the standby:

archive_command = 'cp %p /dbckp/server2master/%f'  

I figure that if the standby is not generating any archives while in standby mode, I could use the same directory and have a single postgresql.conf?
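If sharing one postgresql.conf is the goal, a hedged alternative is to embed the hostname in the path (PostgreSQL runs archive_command through a shell, so command substitution works); each server then archives to its own directory with an identical config line:

```
archive_command = 'cp %p /dbckp/$(hostname)/%f'
```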

SQL Server replication subscriptions marked as inactive

Posted: 31 May 2013 10:16 AM PDT

Is there any way to force SQL Server NOT to mark subscriptions as inactive, ever?

It happens sporadically when there are connection issues, and I don't want to have to reinitialize the subscription every time.

Note: I'm not talking about subscriptions being marked as expired, just as inactive.

Thank you.

How do I configure multi master replication on WAMP?

Posted: 31 May 2013 09:37 AM PDT

I want to configure MySQL multi-master replication, but I've not seen any comprehensive article or tutorial on how to do that on WAMP. I would be grateful if someone could point me to a tutorial, or better still, show me how to achieve it.

Initializing Transactional Replication From Backup

Posted: 31 May 2013 08:17 PM PDT

There is an option when setting up a publication for replication: Allow Initialization From Backup. We have been creating replicated databases for several years now and have always initialized from a backup, but have never set this flag (we just noticed it for the first time a couple of days ago). Replication has certainly worked without issue all this time.

I've found a lot of hits explaining that this option needs to be used, but none that explain why.

Does anyone understand what this actually does? From my vantage point it doesn't seem to be necessary, but I think I must be missing something.

[SQL Server] importing an access database



importing an access database

Posted: 31 May 2013 01:47 AM PDT

I imported a database from Access to SQL Server and am trying to work with the tables. The primary key fields in my Access DB were of type AutoNumber; the imported fields were set to int. I found this out when performing an insert and got a message that the primary key field cannot be blank. I researched this and found out that I need to change the PK field to an identity column. I tried making the change and got an error message telling me that the change could not be saved because the table has to be dropped and recreated. How do I get around this? I'd rather not have to increment the PK manually with every insert.

Secondly, I had a bunch of yes/no fields in my Access tables. When they were imported, they were set to not allow blanks and had to be set manually; I eventually figured out that they needed a default value.

Is there anything else I need to watch out for that will make running my SQL statements a nightmare?

Mike
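For the identity problem: SSMS refuses the change because an existing int column cannot be altered into an identity in place. Either allow table re-creation in the designer (Tools > Options > Designers > uncheck "Prevent saving changes that require table re-creation") or rebuild the table yourself. A hedged T-SQL sketch with hypothetical table and column names:

```sql
-- Rebuild with an identity PK, preserving existing key values
CREATE TABLE dbo.Customers_New (
    CustomerID int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    Name       varchar(100) NOT NULL
);

SET IDENTITY_INSERT dbo.Customers_New ON;   -- allow explicit values into the identity column
INSERT INTO dbo.Customers_New (CustomerID, Name)
SELECT CustomerID, Name FROM dbo.Customers;
SET IDENTITY_INSERT dbo.Customers_New OFF;

DROP TABLE dbo.Customers;
EXEC sp_rename 'dbo.Customers_New', 'Customers';
```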

msbi

Posted: 31 May 2013 01:23 AM PDT

I am currently taking an MSBI course. Will it add value to my resume, and in which direction should I look for a job? Thank you; please guide me.

[SQL Server] Advanced SQL Server 2008 Extended Events with Examples



[SQL Server] Testing with Profiler Custom Events and Database Snapshots


