Friday, July 12, 2013

[SQL Server 2008 issues] Can't Reduce VLFs

Can't Reduce VLFs

Posted: 11 Jul 2013 07:19 PM PDT

Hi guys, I have a job that tries to make sure every DB has fewer than 50 VLFs. It shrinks the log, then regrows it to its original size in one step. I have one DB that I can't get below 300 VLFs. I think it's because I can't shrink the log smaller than its original size, but I'm not sure how I can get to my 50 VLFs. Any ideas, guys? Cheers
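A rough sketch of the usual approach (the database name, log file name, and 10 GB target size below are illustrative assumptions): shrink the log as far as the active VLF allows, then regrow it in a small number of large steps, since each growth event adds only a handful of VLFs:

```sql
USE MyDB;

-- Nudge the active portion of the log toward the front, then release
-- the unused tail. A log backup first may help in FULL recovery.
CHECKPOINT;
DBCC SHRINKFILE (MyDB_log, TRUNCATEONLY);

-- Regrow in one large step; a single growth event creates only a few
-- VLFs, keeping the total count low.
ALTER DATABASE MyDB
    MODIFY FILE (NAME = MyDB_log, SIZE = 10GB);
```

If the shrink refuses to go below the current size, the active VLF is likely sitting at the end of the file; taking a log backup (or waiting for log reuse) and repeating the shrink usually frees it.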

What is happening at END of an insert/delete?

Posted: 11 Jul 2013 08:00 AM PDT

I have been inserting large numbers of records from a "live" table into an archive table in a query window in SSMS. I'm doing this in batches and know in advance how many records will be inserted. Once the records are in the archive table, I'm deleting them from the live table. I'm dealing with roughly 50,000,000 records at a time. In another query window, I'm monitoring the live and archive tables to get a sense of where the process is, based on the number of rows in the tables. I'm using: sp_spaceused 'MySchema.MyTable'

What I noticed is that sp_spaceused will return a "rows" value indicating my insert or delete is done, yet the query window where the insert or delete is occurring continues to show "Executing…" for several more minutes. In one example, when performing the insert, my "rows" value indicated all the rows in question were inserted in 25 minutes, but the query window continued to chug along and the final time on the insert was 58 minutes. If all the rows were in the table at the 25-minute mark, what exactly was going on under the hood from minute 26 to minute 58? I'm not a DBA and can only think of a few things:

- Is it completing writing to the transaction log? I would have thought it would be doing that all along.
- Is it cleaning up from the internal "inserted" table?
- Is it cleaning up tempdb?
- Is it writing to disk? I would think it would have been doing that all along, and at the point I saw all the rows in the table, I would have expected the disk space to have been allocated.
- Is it updating statistics? But when I run DBCC SHOW_STATISTICS ('MySchema.MyTable', PK_MyPKName) WITH STAT_HEADER, I see that the statistics are stale.

Regardless of the methodology (this was just a quick, get-the-old-data-archived kind of thing), I would love to understand what's actually happening here!

How can I find the Total free memory in the windows machine where a sqlserver instances are installed?

Posted: 10 Jul 2013 11:33 PM PDT

Hello, I need to find the free memory on the machine where SQL Server 2008 R2 is installed, and I need to do it via T-SQL. Are the views sys.dm_os_sys_info or sys.dm_os_sys_memory about this or not? I've found different opinions on the internet, and I've been told that with this query I can only find the available RAM:

select available_physical_memory_kb from master.sys.dm_os_sys_memory

Is this true? And how can I find the total space available on the C:\ drive with a query? Thanks a lot.
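For what it's worth, sys.dm_os_sys_memory does report OS-level memory, and free disk space can be read with the long-standing (though undocumented) xp_fixeddrives procedure. A sketch:

```sql
-- Physical memory as the OS reports it to SQL Server (values in KB).
SELECT total_physical_memory_kb,
       available_physical_memory_kb,
       system_memory_state_desc
FROM sys.dm_os_sys_memory;

-- Free megabytes per logical drive (C:, D:, ...).
EXEC master..xp_fixeddrives;
```

Note that available_physical_memory_kb is memory free on the whole machine, not memory free inside SQL Server's own buffer pool.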

SQLSERVER Agent is not running- SQLSERVER 2008R2

Posted: 16 Jul 2012 08:00 PM PDT

Hello: We have Windows 7 with SQL Server 2008 R2 Express Edition 10.5 installed. The Database Engine and SQL Browser services are running perfectly. The problem is that SQL Server Agent is not running. We have provided all the permissions to the logged-in users. Can anyone help us resolve the problem? Thanks in advance! Regards, Ram

XML Formatting Using SQL

Posted: 11 Jul 2013 10:04 AM PDT

Can someone recommend how to create the following XML file with data that I already have in my database? There will be many transaction elements, e.g. <transaction id-type="VENDOR-CUSTOMER" id="76" seq="2"></transaction>. Here's the format:

[code="xml"]
<?xml version="1.0" encoding="ISO-8859-1"?>
<vip-customer-load created-date-time="2013-06-26 14:55" origin="ISP">
  <data>
    <transaction id-type="VENDOR-CUSTOMER" id="4823798" seq="4">
      <customer service="ISP" country="US" action="ADD">
        <vendor-account-id>SOMEDATA</vendor-account-id>
        <customer-status>ACTIVE</customer-status>
        <credit-decision></credit-decision>
        <teamid>708</teamid>
        <name>
          <first>JOE </first>
          <middle></middle>
          <last>BLOW</last>
          <suffix></suffix>
        </name>
        <language>en</language>
        <currency>USD</currency>
        <ssn></ssn>
        <startdate>2013-03-31</startdate>
        <address>
          <street>123 Main St</street>
          <apt></apt>
          <double-locality></double-locality>
          <city>HANALEI</city>
          <state>HI</state>
          <country>US</country>
          <postal-code>84098</postal-code>
        </address>
        <phone>8009183278</phone>
        <alternate-phone></alternate-phone>
        <residential>Y</residential>
        <entered-by>ONLINE</entered-by>
        <loa-date>2013-03-31</loa-date>
        <rate-plan>09</rate-plan>
        <enrollment-method></enrollment-method>
        <physical-address></physical-address>
        <rep-verified></rep-verified>
        <account-number></account-number>
        <business-name></business-name>
        <reason-code></reason-code>
        <is-rep></is-rep>
      </customer>
    </transaction>
  </data>
  <control>
    <customer service="ISP" country="US">
      <transaction-count>4</transaction-count>
      <add-count>1</add-count>
      <change-count>3</change-count>
      <delete-count>0</delete-count>
      <vendor-system-total-count>9082</vendor-system-total-count>
      <vendor-system-active-count>150</vendor-system-active-count>
      <vendor-system-incomplete-count>0</vendor-system-incomplete-count>
    </customer>
  </control>
</vip-customer-load>
[/code]
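One starting point: SQL Server's FOR XML PATH can shape attribute and element names directly via aliases. The table and column names below are illustrative assumptions, not the poster's actual schema; @-prefixed aliases become attributes and slash-separated aliases become nested elements:

```sql
-- Hypothetical dbo.Customers table; adjust names to your schema.
SELECT
    'VENDOR-CUSTOMER'   AS [@id-type],
    c.CustomerID        AS [@id],
    c.SeqNo             AS [@seq],
    'ISP'               AS [customer/@service],
    'US'                AS [customer/@country],
    'ADD'               AS [customer/@action],
    c.VendorAccountID   AS [customer/vendor-account-id],
    c.Status            AS [customer/customer-status],
    c.FirstName         AS [customer/name/first],
    c.LastName          AS [customer/name/last]
FROM dbo.Customers AS c
FOR XML PATH('transaction'), ROOT('data');
```

The outer vip-customer-load element and the control block can be wrapped around this result with a second FOR XML query or by string concatenation in the calling code.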

conditional table selection

Posted: 11 Jul 2013 04:35 PM PDT

I have a problem while selecting from a table:

A | B | C
2 | ab | ab
2 | bc | bc
3 | ac | ac
4 | as | as

This is my table, and I want to execute select * from table_name only when it has more than 2 rows. If count(*) of table_name is 2, I don't want to print table_name. For this I tried:

SELECT CASE (SELECT COUNT(*) FROM table_name) WHEN 2 THEN "Min Row Cond not valid" ELSE (SELECT * FROM table_name AS abc) END

but I am getting:

Error Code: 1241 Operand should contain 1 column(s)

How do I solve this problem? Thank you
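The error comes from wrapping a multi-column SELECT * in CASE, which is a scalar expression and can only return a single value. Control-of-flow logic sidesteps that; a minimal sketch (table name taken from the post):

```sql
IF (SELECT COUNT(*) FROM table_name) > 2
    SELECT * FROM table_name;
ELSE
    SELECT 'Min Row Cond not valid' AS message;
```

The two branches return different shapes (a full row set vs. a one-column message), which is exactly why a single CASE expression cannot express this.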

Distribution database cleanup

Posted: 11 Jul 2013 03:11 PM PDT

Hi all, I'm going to configure a distribution database with a retention period of 24 hours. Does the cleanup job remove transactions that are older than 24 hours but not yet replicated, or does the cleanup job only remove transactions that have been replicated? Also, please let me know how to monitor the performance of a distribution database. Thanks for your help.

A little help needed with simple Custom Function

Posted: 11 Jul 2013 02:22 AM PDT

This should be a simple task, but I think I'm tripping over the syntax. I have data from a database that is fed out onto a web page (aspx). For reasons out of my control, the data comes to me "dirty": 99% but not 100% of it arrives with "junk" characters appended to the end of the string. So a full name might look like "Doe, John, H *^" where it should look like "Doe, John, H". I could do this on the front end and rewrite some of the web page's logic, but I'm trying to handle it on the back end. It would seem I could write a simple function to strip out whatever junk chars I see fit to remove. So, in my query, I could do something like:

[i]SELECT dbo.[b]MyCustomFunction[/b](FieldName), Field2, Field3 FROM ... etc ...[/i]

Then, if I've written the function correctly, it'll just strip out the bad chars. I've done this many times before with VB. The custom function on SQL Server is where I'm having trouble. Here's what I have so far:

CREATE FUNCTION fncStripBadChars (@strInputString varchar(150))
RETURNS varchar(150)
AS
BEGIN
    [b]@strInputString = Replace((@strInputString), ", * *", "")[/b]
RETURN @strInputString
END
GO

It's the bolded line above where I'm having a little trouble; I'm getting an incorrect syntax error. Any tips or pointers? See what I'm doing wrong? I should also mention that ", * *" is one of several strings I'm trying to replace with an empty string.
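Two T-SQL details trip up the posted function: variable assignment inside a function body needs SET (or SELECT), and string literals use single quotes, since double quotes denote identifiers by default. A corrected sketch:

```sql
CREATE FUNCTION dbo.fncStripBadChars (@strInputString varchar(150))
RETURNS varchar(150)
AS
BEGIN
    -- SET is required for assignment; single quotes for the literals.
    SET @strInputString = REPLACE(@strInputString, ', * *', '');
    RETURN @strInputString;
END
GO
```

Usage: SELECT dbo.fncStripBadChars(FieldName), Field2, Field3 FROM ... — further REPLACE calls can be chained inside the function for the other junk strings.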

Generate dynamic script

Posted: 19 Aug 2011 04:40 AM PDT

Hello, I want to perform the action below on 150 databases in the same SQL instance:

ALTER DATABASE [DB_NAME] SET RECOVERY SIMPLE

How can I create a dynamic script to change the recovery model to SIMPLE for all 150 databases? Thanks
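One common pattern, sketched here, is to generate the statements from sys.databases, inspect the output, and then execute it:

```sql
-- Build one ALTER DATABASE per user database not already in SIMPLE.
SELECT 'ALTER DATABASE ' + QUOTENAME(name) + ' SET RECOVERY SIMPLE;'
FROM sys.databases
WHERE database_id > 4                   -- skip master/tempdb/model/msdb
  AND recovery_model_desc <> 'SIMPLE';
```

Copy the result set into a query window and run it, or accumulate the strings into an nvarchar variable and EXEC it directly.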

Indexing small tables: what is small enough to avoid indexes altogether?

Posted: 11 Jul 2013 06:45 AM PDT

From Books Online:

"Indexing small tables may not be optimal because it can take the query optimizer longer to traverse the index searching for data than to perform a simple table scan. Therefore, indexes on small tables might never be used, but must still be maintained as data in the table changes."

I was having a debate with a co-worker the other day about a small table (heap) that had absolutely no indexes. There are 35k rows in just over 1,000 pages. My co-worker was of the opinion that this was a "small" table. This is a 90+% read lookup table (heap), and I was able to show that after creating a clustered index and a single non-clustered composite index of three key columns and a few included columns, I could cover the query and reduce the 1,048 logical reads of a table scan to a 2 logical read index seek.

So, this specific example aside, and given that mileage will vary, my question is: are there general rules of thumb for the threshold of a "small" table?

I started thinking in terms of storage internals: if a table fits in under 8 pages (a single extent), SQL Server would effectively read the entire table during a normal disk read, so there would be no benefit from indexes. It seems like once we cross the 8-page threshold, adding at least one non-clustered index should prove beneficial. I know we're venturing into the theoretical realm here, but I'm curious what the technical boundaries are for when we can expect to see benefits.

For the purposes of this (theoretical) discussion, let's leave out the "tipping point" and assume any non-clustered indexes we create are covering, and that queries are not predicated on the clustering key. There may be other assumptions necessary for a debate, so please feel free to add.

Thanks,
Jared

Incorrect Syntax?

Posted: 11 Jul 2013 03:55 AM PDT

I'm trying to write an SP to query an Oracle DB through OPENQUERY. The Oracle portion of the query I've written works when run through Oracle SQL Developer, but the overall procedure returns an error when I try to run it through SSMS.

[code="other"]
declare @rec_id int
set @rec_id = 263703 -- for testing purposes
declare @query nvarchar(4000)
declare @sql nvarchar(4000)
declare @linked_server nvarchar(4000)
set @linked_server = 'VTRACK'

SET @query = 'SELECT * FROM OPENQUERY(' + @query + ','
SET @sql = '''select i.id, sys.utl_raw.cast_to_varchar2(dbms_lob.substr( i.raw_data, dbms_lob.getlength(i.raw_data), 1)) AS message from fmc_in i inner join fmc_hail_messages_gui_v m on i.id = substr(m.creator_name,instr(m.creator_name,'':'')+1,length(m.creator_name)-9)where m.id = '' + @rec_id + '')'''

exec (@query+@sql)
[/code]

The error I get is:

Incorrect syntax near 'select i.id, sys.utl_raw.cast_to_varchar2(dbms_lob.substr( i.raw_data, dbms_lob.getlength(i.raw_data), 1)) AS message from fmc_'

but I don't see what's wrong. Is there another way to approach this?
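One likely culprit, offered as a guess from the posted code: the OPENQUERY line concatenates @query (still NULL at that point) instead of @linked_server, and @rec_id is an int, which cannot be concatenated into an nvarchar without a cast. A corrected sketch, with the Oracle query trimmed down for brevity:

```sql
DECLARE @rec_id int = 263703;
DECLARE @linked_server sysname = N'VTRACK';
DECLARE @query nvarchar(4000);

-- Use @linked_server (not the still-NULL @query) and CAST the int
-- before splicing it into the pass-through string.
SET @query = N'SELECT * FROM OPENQUERY(' + @linked_server + N', '
           + N'''select i.id from fmc_in i where i.id = '
           + CAST(@rec_id AS nvarchar(20)) + N''')';

EXEC (@query);
```

Also note that OPENQUERY itself does not accept variables as arguments, which is why the whole statement has to be built as a string and executed dynamically, as here.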

Adding new column to the table with 16 million rows

Posted: 11 Jul 2013 08:05 AM PDT

We have a table with 16 million records, and this table is replicated. We want to add a new column to this table. What harm can that do?

reporting issue

Posted: 11 Jul 2013 08:11 AM PDT

Hey everyone. I've just been given a report to do at work. I have studied SQL at uni and also done the first part of the Oracle training, but this one is taxing my abilities a bit. Essentially, the problem is that we have an annual income figure in the database, and the requester wants to see it broken down into a monthly income stream. Some payments are made monthly; others are made every 4 weeks. Any ideas to get me started on a report that splits such annual figures across months?
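As one possible starting point (the table and column names below are invented for illustration), an even twelfth per calendar month can be produced by cross-joining the annual figure to a month list; the every-4-weeks payments would instead divide by 13 periods and need mapping onto the months they fall in:

```sql
-- Spread each annual figure evenly across the 12 calendar months.
SELECT a.AccountID,
       m.MonthNo,
       a.AnnualIncome / 12.0 AS MonthlyIncome
FROM dbo.AnnualIncome AS a
CROSS JOIN (VALUES (1),(2),(3),(4),(5),(6),
                   (7),(8),(9),(10),(11),(12)) AS m(MonthNo);
```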

Do i have to use cursor processing?

Posted: 11 Jul 2013 04:24 AM PDT

I have to grab this [b]FilemakerID[/b] based on trainer ID:

table1
TrainerID | [b]FilemakerID[/b]
1078 | 5342
1138 | 8739
1810 | 4609

…and associate the FilemakerID with a course:

table2
EntryID | TrainerID | CourseNums
138 | 1138 | 37
139 | 1138 | 120
140 | 1138 | 119
654 | 1078 | 150
655 | 1078 | 151
656 | 1078 | 38
657 | 1078 | 36
658 | 1810 | 37
659 | 1810 | 159
660 | 1810 | 111

…and combine the course numbers (separated by commas) associated with the same FilemakerID and insert the data into a table3:

FilemakerID | CourseNums
5342 | 36, 37, 38, 150, 151
8739 | 37, 119, 120
4609 | 37, 159, 111

Can this be done without putting data into temp tables or processing with a cursor to combine and separate CourseNums by commas?
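A set-based sketch with no cursor or temp tables: the classic STUFF + FOR XML PATH trick builds the comma-separated course list per trainer (table names taken from the post; table3 is assumed to have the two columns shown):

```sql
INSERT INTO table3 (FilemakerID, CourseNums)
SELECT t1.FilemakerID,
       -- Correlated subquery emits ", 37, 119, ..."; STUFF strips the
       -- leading ", " to leave a clean comma-separated list.
       STUFF((SELECT ', ' + CAST(t2.CourseNums AS varchar(10))
              FROM table2 AS t2
              WHERE t2.TrainerID = t1.TrainerID
              ORDER BY t2.CourseNums
              FOR XML PATH('')), 1, 2, '') AS CourseNums
FROM table1 AS t1;
```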

I want to insert the count() into new table i created

Posted: 11 Jul 2013 06:09 AM PDT

I have this query running perfectly fine, and the result set looks like this:

[quote]NumberRows | Time_stamp
940 | 2013-07-11 18:00:00.357[/quote]

Now I want to insert these two column values into a new table, and I can't figure out how to do this.

DECLARE @dCurrentTime DATETIME
DECLARE @dCurrentTimeMinus5 DATETIME
--Declare @counttotal int
SET @dCurrentTime = GETDATE()
SET @dCurrentTimeMinus5 = DATEADD(minute, -5, @dCurrentTime)
--insert into UsersLoggedIn (NumberRows,@dCurrentTime);
SELECT Count(*) As NumberRows, @dCurrentTime as Time_stamp
FROM (
    SELECT lID
    FROM SessionState_Variables
    WHERE dLastAccessed BETWEEN @dCurrentTimeMinus5 AND @dCurrentTime
    GROUP BY lID
) As SessionsTable

Regards
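A sketch of the fix: prefix the existing SELECT with INSERT ... SELECT, naming the target columns rather than passing the variable in the column list (table and column names are taken from the commented-out line in the post):

```sql
INSERT INTO UsersLoggedIn (NumberRows, Time_stamp)
SELECT COUNT(*), @dCurrentTime
FROM (
    SELECT lID
    FROM SessionState_Variables
    WHERE dLastAccessed BETWEEN @dCurrentTimeMinus5 AND @dCurrentTime
    GROUP BY lID
) AS SessionsTable;
```

The commented-out attempt failed because the column list of an INSERT must name columns of the target table; values (including variables) belong in the SELECT or VALUES clause.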

SQL Server 2008R2 linked from a SQL2012 server problem

Posted: 11 Jul 2013 05:51 AM PDT

I have a 2012 server with a link to a 2008 R2 server. When I run a simple select from the 2012 query window and specify the fully qualified linked server (2008 R2), the query runs fine. When I log into the 2008 R2 server and launch SSMS (2008 R2) but connect to the 2012 server and run the same query, I get the 18456 'NT Authority\Anonymous Logon' error. Further, when I connect to the 2012 server from a client PC using SSMS (2012), I get the same error when I run the query against the linked 2008 R2 server, using the fully qualified name. Any thoughts? Thanks. Joe

StarQuest Data Replicator (SQDR) - anyone use it? anyone even heard of it?

Posted: 11 Jul 2013 06:23 AM PDT

I'm setting up a proof of concept where I need to replicate data between DB2 on z/OS, DB2 LUW on Windows, and MS SQL Server 2008. We actually already have IBM WebSphere MQ and DB2 Replication (SQL/Q Replication) in place to do this, but given that it's in need of upgrading, I'm investigating other options in case there's something simpler to maintain. StarQuest Data Replicator bubbled up to the top of my Google search, and I've found a few articles and questions about it sprinkled around various DB2 and MS SQL forums. The problem is that the product description is exactly what we're looking for, yet given the company has been around for over a decade, I'm shocked I haven't heard of it and can't find much about them on any tech forum. Brian Knight did [url=http://www.sqlservercentral.com/articles/Distributed+Queries/starsql/167/]an article[/url] about it back in 2001, but that's all I can find on here. Any thoughts on this? Thanks for any suggestions. Sam

how i can identify local windows service that coresponds to sql spid?

Posted: 11 Jul 2013 01:29 AM PDT

Good morning everyone. Our environment: SQL Server 2008 R2 SP2 on Windows 2008 R2 Enterprise SP1, a 2-node active/passive cluster, and about 200 databases of roughly 500 MB each.

Approximately every 10-15 minutes I see multiple connections (per database) being established; they run for a minute and disconnect. It's not really causing any performance issues, but 600-800 connections to the server that come and go are driving me crazy. I cannot identify where they are coming from. I do see that some local process is initiating them, but I can't figure out which one. All I see is:

SQL user: NT AUTHORITY\SYSTEM
Host name: local physical node name
Program: .Net SqlClient Data Provider
Net library: TCP
Win domain: NT AUTHORITY
Win user: SYSTEM

These look like SQL Server-related information collections, and we do have 3rd-party monitoring tools, which I have currently disabled. Could these just be native SQL Server status-info collections? These are the queries the connections run:

Connection 1:
SELECT [dbfiles].[physical_name] AS DBFilesPhysicalName, [mediafamily].[physical_device_name] AS BackupFilesPhysicalName FROM msdb.dbo.backupmediafamily [mediafamily] JOIN (SELECT [backupsets].media_set_id, [backupsets].database_name FROM msdb.dbo.backupset [backupsets] JOIN (SELECT TOP 1 [backupsets].media_set_id, [backupsets].database_name, [backupsets].backup_finish_date FROM msdb.dbo.backupset [backupsets] JOIN sys.databases [databases] ON [backupsets].[database_name] = [databases].[name] and [databases].[is_in_standby] = 0 AND [databases].[source_database_id] IS NULL AND ([databases].name = 'master' OR DATEDIFF ( ss, [databases].[create_date] , [backupsets].[database_creation_date] ) = 0 ) WHERE [databases].[database_id] = DB_ID() AND [backupsets].type = 'D' ORDER BY [backupsets].backup_finish_date DESC) AS [latest_backupset] ON [backupsets].[database_name] = [latest_backupset].[database_name] AND [backupsets].[backup_finish_date] >= [latest_backupset].[backup_finish_date]) AS latest_backups ON [latest_ba

Connection 2:
select * from sys.master_files where database_id = '20' and type in (0,1)

Connection 3:
(@DatabaseName nvarchar(10)) IF (Exists(SELECT * FROM (SELECT name as 'DatabaseName' FROM sys.databases WHERE name NOT IN ('master','model','tempdb','msdb') and source_database_id IS NULL) AS UserDatabase WHERE UserDatabase.DatabaseName = @DatabaseName)) BEGIN SELECT distinct obj.name FROM sys.objects AS obj JOIN sys.stats stat ON stat.object_id = obj.object_id CROSS apply sys.dm_db_stats_properties(stat.object_id, stat.stats_id) AS sp WHERE obj.type = 'U' AND modification_counter > 1000 END

Default trace enabled

Posted: 11 Jul 2013 03:57 AM PDT

Hi friends, how do I clear older logs captured in the default trace?

backup via .mdf and .ldf ok?

Posted: 11 Jul 2013 02:12 AM PDT

I was asking the network/system guy whether a third-party app's databases were being backed up. He did some checking and said "yes", but I suspect he meant that it's the .mdf and .ldf files that are being backed up (copied). A quick search did not find any .bak files for those databases. I think I know (at least part of) the answer, but wanted to get some thoughts from people on this forum.

1) Is it OK to rely on a copy of the .mdf and .ldf files as a backup?
2) If the current version becomes corrupt, can you go back to a previous copy and restore that?

conditional table selection

Posted: 11 Jul 2013 12:18 AM PDT

How do I select a particular table only after a condition is true? I want to select the table only if it has more than two rows; otherwise I want to print "less number of rows". How can I solve this problem? Thank you

Deadlock on Import

Posted: 10 Jul 2013 09:14 PM PDT

Good day. We are trying to import data into a table from a file, but we get deadlocks on insert. We had added indexes beforehand; we suspected they were the cause and removed them, but this did not solve the problem. I checked for open transactions using DBCC OPENTRAN, and it showed that replication has an open transaction. We activated all the trace flags as per the SQL Server guidelines on the web. We can see the table that is the victim, but we cannot identify the process that causes the problem. Any ideas? We are a bit desperate. Thanks in advance.

Is this a crazy way to migrate from one OS version to another?

Posted: 11 Jul 2013 12:06 AM PDT

I need to migrate all the servers I support from Windows Server 2008 to 2008 R2 by the end of the year (plenty of time, really). All the servers are virtuals, and all of them are running SQL 2008 R2. Seeing as they're configured with an OS drive and multiple data / SQL Server drives, I had a thought on a possible way to migrate quickly. Of course, this method does have the downside of being one-way, with no quick way to go back if it doesn't work.

My thought is: set up a new VM with just an OS drive and the OS. Once this is ready, on the current server, detach all the DBs and stop the SQL service (the reason for the detach will become apparent). Have the VMware admins then remove the data drives from the "old" server and attach them to the new one. Once the drives are attached, delete EVERYTHING except where the DBs / TLogs live, install SQL 2008 R2 as normal and update as needed, then proceed to attach all the DBs I detached.

My thinking (if this idea has merit) is that this gives me a few advantages:
1. I don't need to worry about the VMware / storage admins complaining about space; I'm just switching already-used space from one VM to another.
2. I think this might go a bit quicker than backing up each DB on the old server, copying the backups to the new one, then restoring.

Of course, possible problems:
1. If this doesn't work well, or a DB doesn't come up, my only fallback is a backup of the DB.
2. If it takes longer than expected, I may have users hitting me with "why isn't it done yet, what's taking so long…"

I'm mostly interested in whether people think this would be a reasonable way to do this. If you ignore the VM part, it'd be no different than buying a new server, installing the OS, then pulling the drives with the DBs on them out of your old server and installing them in the new one. Of course, a better solution would be if I could get the VMware admin to "clone" the data / log drives and attach the clones to the new VM, but that potentially brings in a fight with the storage guys.

Thanks,
Jason

SSMS - Databases are not Visible

Posted: 10 Jul 2013 08:19 PM PDT

Hi team, after connecting in SSMS, the [b]Databases, Security, Server Objects, and Management[/b] folders are not showing in Object Explorer; I need to connect to the SQL Server with login credentials again before those folders appear. Is there any configuration required to display these folders automatically once logged in to SSMS? Please suggest.

Require help to get number of hits to sql server 2008 R2 in a day

Posted: 10 Jul 2013 11:17 PM PDT

Hello all, please help me find the number of hits to SQL Server 2008 R2 in a day. Thanks, Sumit Rastogi :-)

Update to Base Table from Details Table

Posted: 10 Jul 2013 11:02 PM PDT

Create Table #Customer_Profile_Master1
(
    Lnno Varchar(15),
    Co_apcode_1_categ Varchar(200),
    Co_apcode_2_categ Varchar(200),
    Co_apcode_3_categ Varchar(200),
    Co_apcode_4_categ Varchar(200)
)

Insert Into #Customer_Profile_Master1 (Lnno) Values ('1')

Select * from #Customer_Profile_Master1

Drop table #Temp1

Create Table #Temp1
(
    Lnno Varchar(15),
    CustCode Varchar(200),
    Co_apcode Varchar(200),
    Category Varchar(200),
    Rownumber Int
)

Insert Into #Temp1 Values
('1', 'M0000036', 'A0000026', 'SALARIED', 1),
('1', 'M0000036', 'A0000027', 'SELF EMPLOYED PROFESSIONALS', 2),
('1', 'M0000036', 'A0000028', 'STUDENT', 3)

Select * from #Temp1

My requirement is to update #Customer_Profile_Master1 from #Temp1 as follows:

Co_apcode_1_categ = for that Lnno, if Rownumber = 1, the corresponding Category should be updated
Co_apcode_2_categ = for that Lnno, if Rownumber = 2, the corresponding Category should be updated
Co_apcode_3_categ = for that Lnno, if Rownumber = 3, the corresponding Category should be updated
Co_apcode_4_categ = for that Lnno, if Rownumber = 4, the corresponding Category should be updated

I tried this, but it is not working for me:

Update #Customer_Profile_Master1
Set Co_apcode_1_categ = Case when B.Rownumber = 1 then B.Category else '' End,
    Co_apcode_2_categ = Case when B.Rownumber = 2 then B.Category else '' End,
    Co_apcode_3_categ = Case when B.Rownumber = 3 then B.Category else '' End,
    Co_apcode_4_categ = Case when B.Rownumber = 4 then B.Category else '' End
From #Customer_Profile_Master A
Join #Temp1 B On A.Lnno = B.Lnno

Please help me, as I need this urgently. Thanks in advance!
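The posted UPDATE overwrites every category column on each joined row, so whichever #Temp1 row is processed last wins and blanks the others. One set-based sketch: collapse #Temp1 to a single row per Lnno with conditional aggregation, then join once:

```sql
UPDATE m
SET Co_apcode_1_categ = p.Cat1,
    Co_apcode_2_categ = p.Cat2,
    Co_apcode_3_categ = p.Cat3,
    Co_apcode_4_categ = p.Cat4
FROM #Customer_Profile_Master1 AS m
JOIN (
    -- Pivot the detail rows: one column per Rownumber slot.
    SELECT Lnno,
           MAX(CASE WHEN Rownumber = 1 THEN Category END) AS Cat1,
           MAX(CASE WHEN Rownumber = 2 THEN Category END) AS Cat2,
           MAX(CASE WHEN Rownumber = 3 THEN Category END) AS Cat3,
           MAX(CASE WHEN Rownumber = 4 THEN Category END) AS Cat4
    FROM #Temp1
    GROUP BY Lnno
) AS p ON p.Lnno = m.Lnno;
```

With the sample data, the single master row for Lnno '1' ends up with SALARIED, SELF EMPLOYED PROFESSIONALS, STUDENT, and NULL in the four category columns.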

Analysis Services in Named Instance

Posted: 10 Jul 2013 09:32 PM PDT

I have set up a 2008 R2 SQL cluster with several named instances and a default instance. The application using SQL Server requires Analysis Services, which I installed with its instance. The application has no problem connecting to the database but is unable to connect to Analysis Services. I set up a specific service ID for this purpose. What should we be doing to enable a connection?

Filegroup

Posted: 10 Jul 2013 08:39 PM PDT

Hi all, please let me know whether it is possible to use the same secondary filegroup for 2 different databases. I tried manually and it did not allow it. Just wanted to confirm.
