[SQL Server 2008 issues] Data Conversion Error |
- Data Conversion Error
- Trace Flag 8048, 8015, or SUMA?
- result of adding non-clustered non-unique index vs clustered one
- KILLED/ROLLBACK STATE - SERVICE RESTART
- XML Data Type as a parameter in a Stored Procedure
- existing column Level encryption in sql server2008
- venkatrao.m.cse@gmail.com
- First letter upper case for each word in sql server.
- Anyone help to script the linkedservers on weekly basis by automatically
- Help needed in SQL Server Replication
- File Groups advantages
- Cannot scan a remote server with SQL Server BPA 1.0
- TSQL Optimization - CASTing and CURSOR
- SSRS 'Render' and TempDB Growth
- Performance issue
- SSRS 2008R2 showing counties (not just states)
- Same stored procs and different schemas = issues?
- Websockets?
- Stored procedures and using the Exec statement on a dynamic string of SQL
- SSIS Conditional Split not working when run on SQL 2008 R2 using SQL Server Agent
- Convert Row into Columns
- Top N makes query run faster
- Checklist after Migrating Databases from 2005 to 2008 R2
- Monitor SSIS Job
- ADP - passing value from a form to a query
- Change the Date Format
- Rebuilt Index
- Restore msdb
- How to interpret the unused space?
- Server drop connection error
- reduce resource waits on sql server without upgrading sql server
- Restore multiple databases
- SQL architecture
Data Conversion Error Posted: 20 Mar 2013 07:09 PM PDT I have a staging table which holds all data as NVARCHAR. I am now loading from the staging table into specific tables, converting various fields to numerics, but I'm getting a conversion error. Is there any easy way of finding which row is causing the problem? I've searched through the forum for this error but couldn't find anything that matches my specific needs. |
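Since SQL Server 2008 lacks TRY_CONVERT (added in 2012), one common approach is to flag the staging rows that fail a numeric check before running the load. A minimal sketch, assuming a hypothetical staging table dbo.Staging with an NVARCHAR column Amount:

```sql
-- Flag staging rows whose NVARCHAR value will not convert cleanly to a numeric.
-- ISNUMERIC alone is unreliable (it accepts tokens such as '$', '.', and '1e5'),
-- so combine it with a pattern check that rejects everything except digits,
-- an optional sign, and a single decimal point.
SELECT s.*
FROM dbo.Staging AS s
WHERE s.Amount IS NOT NULL
  AND ( ISNUMERIC(s.Amount) = 0
        OR s.Amount LIKE '%[^0-9.+-]%'   -- any character that is not digit/sign/point
        OR LEN(s.Amount) - LEN(REPLACE(s.Amount, '.', '')) > 1 );  -- more than one point
```

Running this once per convertible column narrows the failure to specific rows before the INSERT...SELECT runs.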
Trace Flag 8048, 8015, or SUMA? Posted: 14 Nov 2012 08:37 AM PST Hola! Recently included SQL Server startup Trace Flag 8048 to resolve a serious spinlock contention issue in a SQL Server 2008 R2 system. Interested to hear from others who have found usage cases where performance value was delivered by trace flag 8048 (promote query memory grant strategy from per-NUMA node to per-core), trace flag 8015 (SQL Server ignores physical NUMA), or SUMA (interleaved sufficiently uniform memory access). Trace flag 8048: http://blogs.msdn.com/b/psssql/archive/2011/09/01/sql-server-2008-2008-r2-on-newer-machines-with-more-than-8-cpus-presented-per-numa-node-may-need-trace-flag-8048.aspx Trace flag 8015: http://blogs.msdn.com/b/psssql/archive/2010/04/02/how-it-works-soft-numa-i-o-completion-thread-lazy-writer-workers-and-memory-nodes.aspx SUMA, or interleaved memory: http://msdn.microsoft.com/en-us/library/ms178144(v=sql.105).aspx "If you have hardware NUMA, it may be configured to use interleaved memory instead of NUMA. In that case, Windows and therefore SQL Server will not recognize it as NUMA." Gory details of system workload, gathered metrics from the troubled system, and gathered metrics from the system after intervention in posts to follow. Peace! tw: @sql_handle |
result of adding non-clustered non-unique index vs clustered one Posted: 20 Mar 2013 07:20 PM PDT We're testing some rather large - at least for us - narrow tables that will be populated with between 200 and 500 million records. Any access to the table will be by addressing a low-cardinality id (some 20-50 distinct values). Without the option of partitioning, we are testing some index scenarios. The table: Id1 (high cardinality), Datekey, Name, Value, Id2 (low cardinality; always used in where clauses in queries). When adding a non-unique non-clustered index, the index is only used when additional columns are included; the index space is then larger than the table. When adding a non-unique clustered index, the index is always used (when Id2 is addressed) and index space is minimal. With a DB2 background and being used to bitmap indices, I'm trying to understand SQL Server's approach. The clustered index seems ideal, but what is the catch? |
KILLED/ROLLBACK STATE - SERVICE RESTART Posted: 20 Mar 2013 06:57 PM PDT Hi all, is there any fix other than a SQL service restart when a transaction is stuck in the KILLED/ROLLBACK state (after killing a blocked transaction) in sysprocesses? I had this situation once and unsuccessfully tried moving the database to offline mode (courtesy: Google) to end the transaction. Any help will be appreciated. |
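A session stuck in KILLED/ROLLBACK is usually still rolling back, so before considering a restart it helps to check whether the rollback is actually progressing. A sketch, assuming the stuck session is spid 53:

```sql
-- Re-issuing KILL with STATUSONLY reports rollback progress without doing anything new.
KILL 53 WITH STATUSONLY;
-- Typical message: "SPID 53: transaction rollback in progress.
-- Estimated rollback completion: 80%. Estimated time remaining: 10 seconds."

-- Cross-check what the session is waiting on:
SELECT session_id, status, command, wait_type, percent_complete
FROM sys.dm_exec_requests
WHERE session_id = 53;
```

If the rollback sits at 0% indefinitely, the session is often waiting on something external (a linked server call, for example), and in that case only a service restart clears it.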
XML Data Type as a parameter in a Stored Procedure Posted: 20 Mar 2013 07:06 AM PDT Hi,I've table as follows,[code="sql"]CREATE TABLE [dbo].[majikanAG_subMajikan_1]( [idx] [int] IDENTITY(-2147483648,1) NOT NULL, [batch_Id] [uniqueidentifier] NOT NULL, [icNo (Baru)] [varchar](100) NULL, [icNo (Lama)] [varchar](100) NULL, [payerNme] [varchar](300) NULL, [zakatAmount] [decimal](10, 2) NULL, [subMajikan] [varchar](100) NULL, CONSTRAINT [PK__majikanA__51EFEBF8002AF460] PRIMARY KEY CLUSTERED ( [idx] ASC, [batch_Id] ASC)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]) ON [PRIMARY][/code]My Stored Procedure as follows,[code="sql"]CREATE PROCEDURE [dbo].[addAGSummary_SubMajikan_Process1]@agItem xml,@batch_Id uniqueidentifier outputASBEGIN -- SET NOCOUNT ON added to prevent extra result sets from -- interfering with SELECT statements. SET NOCOUNT ON;set transaction isolation level repeatable readBegin transactionBegin Tryselect @batch_Id=NEWID()insert into majikanAG_subMajikan_1(batch_Id, [icNo (Baru)], [icNo (Lama)],payerNme, zakatAmount, subMajikan)select @batch_Id,a.b.value('icNo[1]','varchar(200)') as icNo, --as input1,a.b.value('icNoOld[1]','varchar(15)') as icNoOld, --as input2,upper(a.b.value('payerNme[1]','varchar(100)')) as payerNme, --as input3,--a.b.value('amt[1]','decimal(10,2)') as amt, --as input4,a.b.value('amt[1]','varchar(100)') as amt, --as input4,a.b.value('subCd[1]','varchar(100)') as subCd --as input5,from@agItem.nodes('/data/ag') a(b)COMMIT transactionEnd TryBegin Catch-- Whoops, there was an error--IF @@TRANCOUNT > 0ROLLBACK transaction-- Raise an error with the details of the exceptionDECLARE @ErrMsg nvarchar(4000), @ErrSeverity intSELECT @ErrMsg = ERROR_MESSAGE(),@ErrSeverity = ERROR_SEVERITY()RAISERROR(@ErrMsg, @ErrSeverity, 1)End Catch END[/code]There are 2 scenario1- If @agItem did not have so much data (1000 records), the stored procedure run well2- If @agItem 
has a large amount of data (10,000 records), the stored procedure does not complete as expected. Why is scenario (2) happening? Is the XML data type unsuitable for 10,000 records? Please help, I'm stuck. |
existing column Level encryption in sql server2008 Posted: 18 Mar 2013 04:32 PM PDT Hi, how do I set up column-level encryption in SQL Server 2008 in a production environment without dropping the existing column, and what methods are available for encrypting data in SQL Server 2008 Standard Edition? |
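In Standard Edition (which has no TDE), cell-level encryption with the built-in key hierarchy is the usual route, and an existing column can be migrated in place by adding an encrypted VARBINARY column alongside it. A sketch with a hypothetical table dbo.Customers and column SSN (all names illustrative):

```sql
-- One-time key setup.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Str0ng!MasterKeyPa55';
CREATE CERTIFICATE CustCert WITH SUBJECT = 'Customer data protection';
CREATE SYMMETRIC KEY CustKey WITH ALGORITHM = AES_256
    ENCRYPTION BY CERTIFICATE CustCert;

-- Add an encrypted column next to the plaintext one, then backfill it.
ALTER TABLE dbo.Customers ADD SSN_enc VARBINARY(256) NULL;

OPEN SYMMETRIC KEY CustKey DECRYPTION BY CERTIFICATE CustCert;
UPDATE dbo.Customers SET SSN_enc = EncryptByKey(Key_GUID('CustKey'), SSN);
CLOSE SYMMETRIC KEY CustKey;

-- Reading it back:
OPEN SYMMETRIC KEY CustKey DECRYPTION BY CERTIFICATE CustCert;
SELECT CONVERT(VARCHAR(11), DecryptByKey(SSN_enc)) AS SSN FROM dbo.Customers;
CLOSE SYMMETRIC KEY CustKey;
```

Once the application reads from the encrypted column, the plaintext column can be dropped, so the original column is never dropped before its data is preserved.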
venkatrao.m.cse@gmail.com Posted: 20 Mar 2013 06:09 PM PDT Hi, tempdb is full; how can I resolve this? Please reply. |
First letter upper case for each word in sql server. Posted: 09 Dec 2011 06:40 AM PST Hello all, I want to write a stored procedure or function that will return each word with the first letter in upper case and the rest of the letters in lower case. I already have a Facility table with a facility name column (I have 50 facility names like this and I want to update the table). Example: Facility Name: ABINGTON HEALTH LANSDALE HOSP, and I want the output Facilityname: Abington Health Lansdale Hosp. Thanks, Bhavesh |
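A minimal scalar-function sketch that lowercases the string and then upper-cases the letter after each space (names are illustrative, and it treats only spaces as word breaks):

```sql
CREATE FUNCTION dbo.ProperCase (@s VARCHAR(200))
RETURNS VARCHAR(200)
AS
BEGIN
    -- Start with first letter upper, everything else lower.
    DECLARE @i INT = 2,
            @out VARCHAR(200) = UPPER(LEFT(@s, 1)) + LOWER(SUBSTRING(@s, 2, 200));
    WHILE @i <= LEN(@out)
    BEGIN
        IF SUBSTRING(@out, @i - 1, 1) = ' '   -- this character follows a space
            SET @out = STUFF(@out, @i, 1, UPPER(SUBSTRING(@out, @i, 1)));
        SET @i = @i + 1;
    END;
    RETURN @out;
END;
GO
-- Backfill the existing 50 rows (hypothetical table/column names):
UPDATE dbo.Facility SET FacilityName = dbo.ProperCase(FacilityName);
```

SELECT dbo.ProperCase('ABINGTON HEALTH LANSDALE HOSP') should then return Abington Health Lansdale Hosp.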
Anyone help to script the linkedservers on weekly basis by automatically Posted: 20 Mar 2013 05:51 PM PDT Hi, can anyone help me script out the linked servers automatically on a weekly basis to a text or .sql file? Regards, Saran |
Help needed in SQL Server Replication Posted: 20 Mar 2013 05:50 PM PDT I have History tables to track changes, and they have no primary key. Also, some stored procedures in my application are encrypted and cannot be exposed to end users. When I try the SQL Server replication methods, none of them works for my case. Is there any way to achieve replication, or do I need to find an alternative method (please suggest one in that case)? Expecting your support at the earliest. |
File Groups advantages Posted: 20 Mar 2013 05:48 PM PDT Hi, BOL gives the best practice that "we need to have a separate filegroup for the user data". Is the statement below true: if there is any corruption while writing data to the PRIMARY filegroup's files, the database will not be available for access; we can keep the database available by adding a secondary filegroup and placing the user data in that group. Please clarify. |
Cannot scan a remote server with SQL Server BPA 1.0 Posted: 20 Mar 2013 05:34 PM PDT Hello gents, I am using Microsoft SQL Server 2008 R2 BPA 1.0, wrapped in Microsoft Baseline Configuration Analyzer 2.0, on my workstation and can perform a normal scan on my local SQL Server instance. However, when I try to connect to a remote server, it keeps reporting the following error: Category: Prerequisite Source: <servername> Issue: 1. User is not a member of Administrators group on remote machine, OR 2. Powershell Remoting is not Enabled on Remote Server Impact: Analysis cannot be performed Resolution: 1. Add user as a member of Administrators group, OR 2. Run Commands Enable-PSRemoting -f through PowerShell command prompt using elevated privileges, AND 3. Run Command winrm set winrm/config/winrs `@`{MaxShellsPerUser=`"10`"`} through PowerShell command prompt using elevated privileges I've verified all the prerequisites listed above (my AD account is a local admin, and I executed the PS commands with elevated privileges) and also turned off Windows Firewall on the target server, but still have no luck at all. Do you have any other directions to point out for me? Thanks in anticipation! P.S. The target server is not clustered, just a standalone physical box; both my workstation and the server are in the same domain; my AD account has been explicitly added to the local Windows admin group and to the sysadmin role on the server and its hosted SQL instance. |
TSQL Optimization - CASTing and CURSOR Posted: 20 Feb 2013 11:59 AM PST Hi, does CASTing add overhead to a query? I need to optimize our database initialization, and while reviewing some of the SQL scripts I found several SPs using CAST(NULL AS VARBINARY(MAX)) to define a column. If they cast a NULL value, the result is always NULL, so why do they still need the cast? These SPs are called millions of times (>7 million calls). What about declaring a cursor - what is its overhead cost in terms of performance? They use one in a scalar function that is also called many times; the function compares the value of the previous record to the current one. The loop passes over fewer than 10 records, but reads from a table containing millions of records. Any comments or suggestions? Thanks. |
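The CAST(NULL AS VARBINARY(MAX)) idiom is usually not about the value at all: an untyped NULL literal defaults to INT, so the cast exists to fix the column's data type when the SELECT defines a result shape or a SELECT ... INTO target. A small illustration (dbo.Customers is a hypothetical source table):

```sql
-- Without the cast, the Payload column of the new table would be typed INT.
SELECT CustomerId,
       CAST(NULL AS VARBINARY(MAX)) AS Payload   -- types the column; the value stays NULL
INTO   #Staging
FROM   dbo.Customers;

-- #Staging.Payload is now VARBINARY(MAX) and can be filled in later:
-- UPDATE #Staging SET Payload = 0x... WHERE CustomerId = ...;
```

On cost: casting a literal NULL is resolved at compile time, so it adds essentially no per-call overhead; the cursor inside the scalar function is a far more likely bottleneck at millions of calls.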
SSRS 'Render' and TempDB Growth Posted: 20 Mar 2013 05:36 AM PDT I have a client that, beginning two days ago, is having an issue with TempDB growing extremely fast and running out of space. I launched SQL Profiler while it was happening and noticed that most of the sessions were SSRS reports. I looked at the SSRS log and noticed a report running at the same time as the TempDB warnings. The SSRS log showed the report as Item Action = 'Render'. What does this mean? The report itself is based on a SQL query over views. The views are not complex; however, I studied the query plan and noticed that one of the underlying tables could benefit from an additional index. If this were a report based on a stored procedure using a lot of temp tables and such, I would suspect that. But it's just this query. Any thoughts? Thanks. Joe |
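To tie the tempdb growth to a specific session (SSRS or otherwise) while it is happening, the space-usage DMVs are often more direct than Profiler. A sketch:

```sql
-- Which sessions hold the most tempdb space right now?
SELECT su.session_id,
       s.login_name,
       s.program_name,                       -- SSRS shows up here
       su.user_objects_alloc_page_count,     -- temp tables / table variables
       su.internal_objects_alloc_page_count  -- sorts, hashes, spools
FROM   sys.dm_db_session_space_usage AS su
JOIN   sys.dm_exec_sessions AS s ON s.session_id = su.session_id
WHERE  su.user_objects_alloc_page_count + su.internal_objects_alloc_page_count > 0
ORDER BY su.internal_objects_alloc_page_count DESC;
```

A large internal-object count for the report's session typically points at a sort or hash spill in the query plan, which is consistent with a missing index; 'Render' itself just means the report server was executing and rendering that report.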
Posted: 19 Mar 2013 10:58 PM PDT Hi,I have two tables.treatment_plan table having 8 fields like txt_instruct_Description_1,txt_instruct_Description_2....As per requirement txt_instruct_Description_4,txt_instruct_Description_5,txt_instruct_Description_6,txt_instruct_Description_7,txt_instruct_Description_8 field values should be moved to another table custom_plan.custom_plan table having ,txt_addition_description_1,txt_addition_description_2,txt_addition_description_3,txt_addition_description_4,txt_addition_description_5,txt_addition_description_6,txt_addition_description_7,txt_addition_description_8 fields.I want to move treatment_plan table 5 fields value to custom_plan 8 fields.I have written the following the query.But this query taking more time to execute.Is there any other way to improve the performance or changing the code to execute fast. Please let me know.-------------------------------------------------SET NOCOUNT ON IF EXISTS (SELECT 1 FROM SYSOBJECTS SO WHERE SO.NAME = 'custom_plan') IF EXISTS (SELECT 1 FROM SYSOBJECTS SO WHERE SO.NAME = 'treatment_plan') BEGIN declare @num_source_field int declare @num_destination_field int declare @source_field varchar(100) declare @destination_field varchar(100) declare @src_value varchar(75) declare @dest_value varchar(75) declare @strsql varchar(1000) SET @num_source_field = 4 SET @num_destination_field = 1 select *, id = IDENTITY(INT,1,1) into #temp from treatment_plan Declare @mx int declare @mn int Declare @encid varchar(45) select @mx=max(id),@mn=min(id) from #temp create table #tbl(col1 varchar(45)) while(@mn<=@mx) BEGIN select @encid= enc_id from #temp where id=@mn SET @num_source_field=4 while(@num_source_field <= 8) BEGIN select @source_field = 'txt_instruct_description_'+cast(@num_source_field as varchar(2)) SET @num_destination_field = 1 while(@num_destination_field <= 8) BEGIN select @destination_field = 'txt_additional_description_'+cast(@num_destination_field as varchar(2)) truncate table #tbl SET 
@strsql='insert into #tbl select '+@source_field+' from treatment_plan where enc_id='+''''+@encid +'''' --EXECUTE sp_executesql @strsql exec(@strsql) select @src_value= col1 from #tbl truncate table #tbl SET @strsql='insert into #tbl select '+@destination_field+' from custom_plan where enc_id='+''''+@encid +'''' --EXECUTE sp_executesql @strsql exec(@strsql) select @dest_value= col1 from #tbl if(@dest_value is null) begin SET @strsql='update custom_plan SET '+@destination_field+'='+''''+@src_value+''''+' where enc_id='+''''+@encid+'''' --EXECUTE sp_executesql @strsql exec(@strsql) break end SET @num_destination_field=@num_destination_field+1 END SET @num_source_field=@num_source_field+1 END SET @mn=@mn+1 END drop table #tbl drop table #temp END Print '----------End----------'SET NOCOUNT OFFGO--------------------------------------------------------Thanks,Tony |
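One reason the script above is slow is that every inner iteration builds a new string, truncates #tbl, and round-trips a value through it. sp_executesql with an OUTPUT parameter fetches the same value in one step and lets the plan be reused. A hedged sketch of the inner lookup, slotting into the existing loop and reusing its @source_field and @encid variables:

```sql
DECLARE @strsql NVARCHAR(1000),
        @src_value VARCHAR(75);

-- Only the column name is concatenated; enc_id is a real parameter, so one
-- cached plan serves every iteration and the #tbl shuttle table is unnecessary.
SET @strsql = N'SELECT @v = ' + QUOTENAME(@source_field)
            + N' FROM dbo.treatment_plan WHERE enc_id = @enc_id';

EXEC sp_executesql @strsql,
                   N'@v VARCHAR(75) OUTPUT, @enc_id VARCHAR(45)',
                   @v = @src_value OUTPUT,
                   @enc_id = @encid;
```

A fully set-based rewrite (UNPIVOT the five source columns, then fill the first empty destination slot per enc_id) would likely be faster still, but the parameterized lookup alone removes most of the per-row overhead.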
SSRS 2008R2 showing counties (not just states) Posted: 08 Mar 2013 01:53 PM PST I have a hopefully simple question. SSRS 2008R2 can do maps by state just fine... is there an easy way to get City>County>State rollup data somewhere, or to get shapefiles for counties in the US? I'm working on a database for someone that would be infinitely more useful if I could show a heat map for sales by county across the country... there's a really cool map [url=http://www.mssqltips.com/sqlservertip/2552/creating-an-ssrs-map-report-with-data-pinpoints/]here[/url] that shows the county lines in it... and that's the part I want - the counties. The granularity of the data I have is not too good, so county level or so is about right. Is there an easy way to create a map like that (like a color-coded map from election night, but county-by-county instead of state-by-state)? If so, how? And where would I get the shapefiles for the counties? Thanks! Pieter |
Same stored procs and different schemas = issues? Posted: 20 Mar 2013 02:40 PM PDT I have two stored procs which are identical, except that they exist in two different schemas. Stored proc A in schema dbo references objects, views, and tables in the dbo schema (all of which are named properly with the schema prefix). Stored proc B in schema ABC references the same objects, views, and tables in the dbo schema (since it's the same code). The only difference between the two is the schema name. (This was done solely for organizational purposes, and the original dbo sproc A was going to go away.) So, here's where it gets odd. I expected them both to perform the same way. Both were set up to run as a step in a SQL Agent job. Stored proc A, in the dbo schema, runs perfectly fine. Stored proc B does not consistently perform fine: it has issues, runs forever, won't finish, and ends up locking things up. I'm confused about why code running from two different schemas behaves differently, despite proper naming conventions in both. As a fix, I had to comment out the step that references schema B and just let the step with the schema A stored proc run. Are there any ideas out there of what may be going on or what I might check? (And if I've left out any details that would help research this, do let me know.) Any help would be appreciated. |
Websockets? Posted: 13 Mar 2013 11:54 PM PDT Does anyone have experience with SQL Server and websockets? I need my stored procedure to be able to send a message to clients through Node.js. Thanks. |
Stored procedures and using the Exec statement on a dynamic string of SQL Posted: 20 Mar 2013 05:51 AM PDT Sometimes an SP can get out of hand with if-this, if-that logic; I saw a developer go over 9k lines. A lot of the code was reused - the fields, the joins, etc. - and just the WHERE clause would change. So the SP now just declares the fields and joins in two varchar variables, and the ifs at the top build the WHERE in 8,950 fewer lines of code. Just by consolidating, I noticed she had made many mistakes, because at that length it becomes almost unmanageable. It just ends up doing exec(@sql1 + @sql2 + @sql3). My question is about performance: is this now an uncompiled T-SQL statement, and less efficient than if all the code were written out in the ifs naturally? P.S. I also noticed a downside with exec: Reporting Services cannot pick up the fields, and you have to either define them manually or cheat by putting a plain SQL statement up top and then changing it on the back end. |
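On the performance question: exec() of a concatenated string compiles a fresh plan whenever the resulting text differs, while sp_executesql keeps user inputs as parameters, so identical statement shapes share one cached plan (and stay injection-safe). A hedged sketch with illustrative table and column names:

```sql
DECLARE @sql  NVARCHAR(MAX),
        @name VARCHAR(50) = 'Smith';

-- Shared SELECT/JOIN text plus a conditional WHERE, but the user value
-- is passed as a parameter instead of being concatenated into the string.
SET @sql = N'SELECT c.CustomerId, c.Name
             FROM dbo.Customers AS c
             WHERE c.Name = @p_name';

EXEC sp_executesql @sql,
                   N'@p_name VARCHAR(50)',
                   @p_name = @name;
```

Plans are cached per distinct statement text, so building the WHERE from a handful of branches still yields only a handful of cached plans rather than one per user value.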
SSIS Conditional Split not working when run on SQL 2008 R2 using SQL Server Agent Posted: 20 Mar 2013 06:16 AM PDT I'm having an odd problem after upgrading our SQL 2008 server to 2008 R2: an SSIS package containing a conditional split is not passing rows through its outputs according to the set conditions when executed via a SQL Server Agent job. The SSIS package operates normally when run from Visual Studio, and it runs normally when executed via Management Studio connected to the SSIS instance; it only has issues when run via the SQL Agent. Has anyone experienced this before, or have ideas on what to check? Note that I have not yet upgraded the SSIS package, so it's still in the SQL 2008 format (not R2). Current SQL Server 2008 R2 version: 10.50.1600 |
Convert Row into Columns Posted: 20 Mar 2013 07:32 AM PDT Hi all, I've a table that has data in this format:
ManagerID | EmployeeID
1001 | 9990
1001 | 9991
1002 | 9993
1002 | 9994
1003 | 9995
1003 | 9996
1003 | 9997
1003 | 9998
1003 | 9999
I would like to get the results as:
ManagerID | EmployeeID1 | EmployeeID2 | EmployeeID3
1001 | 9990 | 9991 | null
1002 | 9993 | 9994 | null
1003 | 9995 | 9996 | 9997
If you look closely, ManagerID 1003 has five EmployeeIDs, but I need only 3 of them, in ascending order... Thanks in advance |
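This shape maps naturally to ROW_NUMBER plus conditional aggregation (or PIVOT); a sketch assuming the table is called dbo.ManagerEmployee:

```sql
-- Number each manager's employees in ascending order, then keep slots 1-3.
WITH ranked AS (
    SELECT ManagerID,
           EmployeeID,
           ROW_NUMBER() OVER (PARTITION BY ManagerID
                              ORDER BY EmployeeID) AS rn
    FROM dbo.ManagerEmployee
)
SELECT ManagerID,
       MAX(CASE WHEN rn = 1 THEN EmployeeID END) AS EmployeeID1,
       MAX(CASE WHEN rn = 2 THEN EmployeeID END) AS EmployeeID2,
       MAX(CASE WHEN rn = 3 THEN EmployeeID END) AS EmployeeID3
FROM ranked
GROUP BY ManagerID
ORDER BY ManagerID;
-- 1003 yields 9995/9996/9997; managers with fewer than three employees get NULLs.
```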
Top N makes query run faster Posted: 17 Mar 2013 11:13 PM PDT I have a select query which brings back 35k rows. It runs in approx 5 seconds. I moved this to a new server, and there it runs for around an hour before I give up and kill the process. If I put select top 35000 ... at the start of the query, I can get it to run in just under 4 minutes. The query runs across two servers to fetch the data, using a left outer join to pull back everything from server 1 regardless of a match on server 2. Where do I start looking to find the bottleneck? I've attached the plan. |
Checklist after Migrating Databases from 2005 to 2008 R2 Posted: 20 Mar 2013 03:48 AM PDT I am migrating SQL databases from 2005 to 2008 R2 Enterprise Edition. There are only 5 databases, but they are all 500 GB or more. I would like to know what checks to make on the dbs after they are moved to the new server. Great hardware on the new server. We have a very small maintenance window, and most of the time will go towards moving databases, testing jobs, etc. I am considering running the following, in this order, but am worried about the time it takes on such large databases and the performance impact: DBCC UPDATEUSAGE; DBCC CHECKDB; REBUILD/REORGANIZE INDEXES; UPDATE STATISTICS (for indexes that were reorganized); RECOMPILE all the procs. Can anyone please provide expert comments on whether we really need all of these? Thanks in advance... |
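The list above can be sketched as a post-migration script; a hedged outline (MyDb and dbo.SomeLargeTable are placeholders), including the compatibility-level bump a 2005-to-2008 R2 move usually needs:

```sql
USE [master];
ALTER DATABASE MyDb SET COMPATIBILITY_LEVEL = 100;  -- 2008/2008 R2 behavior

USE MyDb;
DBCC UPDATEUSAGE (0);           -- fix page/row counts carried over from 2005
DBCC CHECKDB WITH NO_INFOMSGS;  -- integrity check on the new server

-- A rebuild refreshes that index's statistics with a full scan, so rebuilt
-- indexes need no separate statistics step; sp_updatestats covers the rest.
ALTER INDEX ALL ON dbo.SomeLargeTable REBUILD;  -- per table, as the window allows
EXEC sp_updatestats;

-- Plans compile fresh on the new instance anyway, so a blanket
-- sp_recompile across every proc is usually unnecessary.
```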
Posted: 20 Mar 2013 12:48 AM PDT We have a SQL Job that calls an SSIS package. This package used to take about 10 minutes to run. A while back it started to take 1.5 hours and still does. We are able to narrow it down to which task in the control flow is taking the longest.I'm very new to SSIS, but I have been asked to look in to it. So I guess I am wondering if anyone knows where I would start? I would like to some how monitor what goes on but the job runs at 3:30AM so I never get to see it till it's done running. Can I create an audit to record events for me to look at later? Is there a way to tell if tables are blocking/locking one another after the job has already ran? I can run the step again but it runs very quickly by itself so it's not that helpful.Below is the step that takes a long time, it's just a execute sql task with a query.--CREATE TABLE #FullList(DECLARE @FullList table ( StoreProductId INT, StoreId INT, ProductId INT, NewTaxRate NUMERIC (7,6), BottleDeposit MONEY)--Insert values into #fullListINSERT INTO @FullList(StoreProductId, StoreId, ProductId , NewTaxRate , BottleDeposit)SELECT tblTaxLoad.StoreProductId,tblTaxLoad.fkStoreId, tblTaxLoad.fkProductId, tblTaxLoad.NewTaxRate, tblTaxLoad.BottleDeposit FROM dbo.tblTaxLoad WITH (NOLOCK)ORDER BY tblTaxLoad.StoreProductId-------------------------------------------------------------------------------------------Update Taxes / Deposits 100 at a timedeclare @myIteration intdeclare @myCounter intset @myIteration = 0set @myCounter = 0----------------------------------------------------------- Take 100 items at a time & put them in temp table #SubList--WHILE ( SELECT COUNT(*) FROM @FullList ) > 0 BEGIN SET @myCounter = ( SELECT COUNT(*) FROM @FullList ) --------------------------------------- -- Get next 100 items -- SELECT TOP 100 StoreProductId, StoreId, ProductId , NewTaxRate , BottleDeposit INTO #SubList FROM @FullList ORDER BY StoreProductId --------------------------------------- -- Update these items 
-- -- begin tran UPDATE tblStore_Product SET InStoreSalesTaxRate = NewTaxRate, Deposit = BottleDeposit --select * FROM #SubList SubList INNER JOIN dbo.tblStore_Product WITH (NOLOCK) ON SubList.StoreProductId = tblStore_Product.[Id] --commit -- rollback --------------------------------------- -- Report to screen -- set @myIteration = @myIteration + 1 print 'Iteration ' + cast(@myIteration as varchar) + ': ' + cast(@myCounter as varchar) + ' left' --------------------------------------- -- Remove updated from #FullList table & LoadTable -- DELETE FullList FROM @FullList FullList INNER JOIN #SubList ON FullList.StoreProductId = #SubList.StoreProductId DELETE tblTaxLoad FROM dbo.tblTaxLoad INNER JOIN #SubList ON tblTaxLoad.StoreProductId = #SubList.StoreProductId --------------------------------------- -- Drop temp table (to be remade again) -- DROP TABLE #SubList WAITFOR DELAY '00:00:00.200' -- .2 second between each iteration END----------------------------------------- Drop temp table -- @FullList will simply go out of scope--DROP TABLE #FullList |
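As a sketch of a simpler alternative to the loop in that step: UPDATE TOP (N) with an OUTPUT clause updates directly from tblTaxLoad and deletes exactly the processed keys, with no table variable of the full list and no #SubList shuttle (same table and column names as the job step, which are assumptions from the post):

```sql
DECLARE @done TABLE (StoreProductId INT PRIMARY KEY);
DECLARE @batch INT = 1;   -- rows touched in the latest batch

WHILE @batch > 0
BEGIN
    DELETE @done;

    -- Apply 100 pending tax/deposit changes straight from the load table.
    UPDATE TOP (100) sp
    SET    sp.InStoreSalesTaxRate = tl.NewTaxRate,
           sp.Deposit             = tl.BottleDeposit
    OUTPUT inserted.[Id] INTO @done (StoreProductId)
    FROM   dbo.tblStore_Product AS sp
    JOIN   dbo.tblTaxLoad       AS tl ON tl.StoreProductId = sp.[Id];

    SET @batch = @@ROWCOUNT;

    -- Remove only the rows just applied.
    DELETE tl
    FROM   dbo.tblTaxLoad AS tl
    JOIN   @done AS d ON d.StoreProductId = tl.StoreProductId;

    WAITFOR DELAY '00:00:00.200';  -- same breather as the original
END;
```

If the step is slow only since a certain date, it is also worth checking sys.dm_exec_requests for blocking while the 3:30 AM job runs, since the original loop holds no obvious long transaction of its own.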
ADP - passing value from a form to a query Posted: 20 Mar 2013 07:27 AM PDT Hi all, I am trying to develop a small database for our company. Although I come from a computing background, I am not a database expert. After looking at some options I decided to try an Access ADP. By way of learning both Access and SQL Server, I started going through the tutorials on the accessallinone.com site - first as a regular Access database, then as an ADP. :-) I use Access 2010 and SQL Server 2008 Express. Well, I got stumped right off the bat. In the "09 - Select Queries (Criteria)" tutorial, a value from a text field on a form is passed to a query. It works fine in stand-alone Access, of course, but not in the ADP. Here is the query as seen in the ADP (the last line is my attempt to re-create what I see in the stand-alone version, as the query builder does not prompt for the Forms): SELECT StudentID, LastName, FirstName FROM dbo.tblStudents WHERE (LastName = Forms!FrmLastName!TxtLastName) Any insights into how to achieve this will be greatly appreciated! |
Change the Date Format Posted: 20 Mar 2013 07:36 AM PDT Good evening. My SQL Server database has a table which contains this field: [b]Date_Start (Datetime, Null)[/b]. The format of the data in the table is yyyy/mm/dd. I have tried to export this table to Excel through SSIS. Although I have used the function [b]Convert(Date, Date_Start, 103)[/b], the date format is still yyyy/mm/dd instead of dd/mm/yyyy in my Excel file. Please, what can I do to obtain the correct date format dd/mm/yyyy in my Excel file after exporting the data from my table through SSIS? Thanks in advance. |
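One likely cause: the style argument of CONVERT only matters when the target type is a character type. CONVERT(Date, ..., 103) still yields a Date, which carries no display format at all. A sketch (dbo.MyTable is a placeholder name):

```sql
-- DATE/DATETIME values have no intrinsic display format, so style 103 is ignored here:
SELECT CONVERT(DATE, Date_Start, 103)        AS still_a_date,     -- client decides the format
       CONVERT(VARCHAR(10), Date_Start, 103) AS dd_mm_yyyy_text   -- e.g. '20/03/2013'
FROM   dbo.MyTable;
```

Exporting the VARCHAR form fixes the text in the file, though Excel may then treat the column as text; the alternative is to keep it a date and format the column on the Excel side.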
Rebuilt Index Posted: 20 Mar 2013 05:40 AM PDT Dear experts, our application is designed to support multiple SQL Server versions, from 2000 to current. One of the things the application does is import data to the db on a scheduled basis; depending on the client, some imports may take a couple of days to complete (yes, no mistake). During the course of an import, indexes may become very fragmented and import performance degrades. To overcome this issue we check index status periodically to determine whether indexes need to be rebuilt; if so, we rebuild the fragmented indexes automatically. We first check sys.dm_db_index_physical_stats to determine which indexes need to be reindexed, then use the following logic to defrag them: [code="sql"]If SQL version=Enterprise ALTER INDEX [indexX] ON [tableY] REBUILD WITH (FILLFACTOR = 80, ONLINE = ON);Else DBCC INDEXDEFRAG (0,'indexX','tableY') WITH NO_INFOMSGS;End[/code] My question to you: do you think what we have is sufficient, or is there a way to avoid using DBCC? My understanding is that DBCC INDEXDEFRAG is an old method that may be less efficient. Any suggestion to improve would be very much appreciated. |
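On the DBCC question: from SQL Server 2005 onward, ALTER INDEX ... REORGANIZE is the documented replacement for DBCC INDEXDEFRAG (same always-online, interruptible behavior), so only the SQL 2000 code path still needs the DBCC call. A sketch of the same branch without the deprecated command:

```sql
-- EngineEdition 3 covers the Enterprise/Developer class that supports ONLINE rebuilds.
IF SERVERPROPERTY('EngineEdition') = 3
    ALTER INDEX [indexX] ON [tableY] REBUILD WITH (FILLFACTOR = 80, ONLINE = ON);
ELSE
    ALTER INDEX [indexX] ON [tableY] REORGANIZE;  -- always online, can be stopped cleanly
```

A common refinement is to REORGANIZE for moderate fragmentation (roughly 5-30%) and only REBUILD above that, which matters for imports that run for days.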
Restore msdb Posted: 20 Mar 2013 07:14 AM PDT In SQL 2008 R2, can the msdb database be restored from a backup? A record in the table sysdtspackages90 was removed, and instead of inserting the information back in, I thought it would be easier to do a restore. |
How to interpret the unused space? Posted: 20 Mar 2013 01:35 AM PDT In the properties of a user database, we can obtain the database size and the available space. We can shrink the database so that the available space is zero. However, if we execute sp_spaceused against this database, we can see that the unused space is not zero. How should we interpret the unused space reported by the stored procedure? Is there any way to remove the unused space? Many thanks in advance for any input. |
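For reference, sp_spaceused reports two different kinds of free space, which is why a shrink to "zero available" can still show unused space. A quick way to see the breakdown (MyDb is a placeholder):

```sql
USE MyDb;
EXEC sp_spaceused;
-- Result set 1: database_size / unallocated space
--   -> file space not yet assigned to any object (what shrink removes)
-- Result set 2: reserved / data / index_size / unused
--   -> "unused" is space reserved to tables and indexes but not yet written;
--      it sits inside allocated extents, so a file-level shrink cannot release it.
```

That object-level "unused" space is reclaimed by the objects themselves as they grow, or by rebuilding the indexes, not by DBCC SHRINKDATABASE.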
Server drop connection error Posted: 16 Jul 2012 08:37 PM PDT Error: 17886, Severity: 20, State: 1. The server will drop the connection, because the client driver has sent multiple requests while the session is in single-user mode. This error occurs when a client sends a request to reset the connection while there are batches still running in the session, or when the client sends a request while the session is resetting a connection. Please contact the client driver vendor. [Policy: MSSQL Logfile Template] Any suggestions? |
reduce resource waits on sql server without upgrading sql server Posted: 20 Mar 2013 04:06 AM PDT Hi experts, how can I reduce resource waits on SQL Server without upgrading the CPU? Thanks, Nihar |
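The first step is usually to find out which waits dominate, and the wait-stats DMV gives that without any upgrade. A common top-waits query, with signal wait time separated out because it indicates CPU pressure rather than a resource wait:

```sql
SELECT TOP (10)
       wait_type,
       wait_time_ms / 1000.0                          AS wait_s,
       (wait_time_ms - signal_wait_time_ms) / 1000.0  AS resource_s,  -- waiting on the resource
       signal_wait_time_ms / 1000.0                   AS signal_s,    -- waiting for CPU afterwards
       waiting_tasks_count
FROM   sys.dm_os_wait_stats
WHERE  wait_type NOT IN ('SLEEP_TASK','LAZYWRITER_SLEEP','BROKER_TASK_STOP',
                         'SQLTRACE_BUFFER_FLUSH','WAITFOR','CLR_AUTO_EVENT')  -- benign waits
ORDER BY wait_time_ms DESC;
```

The remedy then depends on the top wait: PAGEIOLATCH_* points at indexing or memory, LCK_M_* at blocking, CXPACKET at parallelism settings, and none of those require new hardware.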
Restore multiple databases Posted: 26 Oct 2012 01:11 PM PDT Hi all, I have to take backups of 100+ databases from one server and restore them to another server. I have done the backups using a job, and it went pretty smoothly. Now, my question is: can I do the restore of all these DBs using a job as well? I have done restores using jobs before, but not at this volume. Does anyone have a good script for this type of restore? Any advice or suggestions would be much appreciated. Thanks, SueTons. |
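One common pattern is to generate the RESTORE statements rather than hand-write 100+ of them. A hedged sketch that assumes one .bak per database named <dbname>.bak in a fixed share (\\backupshare\sql is a placeholder) and matching file layouts on the new server:

```sql
-- Build RESTORE commands from the databases known on the source server.
-- Review the PRINTed output first, then swap PRINT for EXEC (or run as a job step).
DECLARE @db  SYSNAME,
        @sql NVARCHAR(MAX);

DECLARE dbs CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.databases
    WHERE  database_id > 4;            -- skip the system databases

OPEN dbs;
FETCH NEXT FROM dbs INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'RESTORE DATABASE ' + QUOTENAME(@db)
             + N' FROM DISK = N''\\backupshare\sql\' + @db + N'.bak'''
             + N' WITH RECOVERY, REPLACE;';
    PRINT @sql;    -- or: EXEC (@sql);
    FETCH NEXT FROM dbs INTO @db;
END;
CLOSE dbs;
DEALLOCATE dbs;
```

Add MOVE clauses to each generated statement if the target server's drive layout differs from the source.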
SQL architecture Posted: 17 Mar 2013 05:27 PM PDT Can anyone explain the SQL Server architecture? |
You are subscribed to email updates from SQLServerCentral / SQL Server 2008 / SQL Server 2008 - General To stop receiving these emails, you may unsubscribe now. | Email delivery powered by Google |
Google Inc., 20 West Kinzie, Chicago IL USA 60610 |