Friday, May 3, 2013

[SQL Server 2008 issues] Shrinking the log file does not reduce size


Shrinking the log file does not reduce size

Posted: 29 Apr 2013 10:23 PM PDT

I have a database with an mdf of 350 MB and an ldf of 4.9 GB. When I try to shrink the log file, it does not shrink. I know shrinking a database is generally a bad idea and should not be done, but I am still trying to shrink just the log file. The recovery model is set to FULL.

I followed these steps: when I ran DBCC SQLPERF(LOGSPACE) I found that the log size is 4932 MB and log space used is 98.76%, so a large amount (98%) of the log is in use. Then I ran DBCC LOGINFO in the database and found that almost every VLF has status 2, which means they are all in use. I then took a log backup, but even after the log backup, shrinking did not reduce the size. Then I changed the recovery model to SIMPLE and tried shrinking again, but that did not help either. I also ran DBCC OPENTRAN against the database and found that no transaction is open. So what is preventing me from shrinking the log, and how can I solve this?
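A first diagnostic step (not from the post) is to ask SQL Server why it cannot reuse the log, then back up the log and shrink by the file's logical name. A sketch; the database name and backup path below are placeholders:

```sql
-- Why is the log not being reused? Common answers: LOG_BACKUP,
-- ACTIVE_TRANSACTION, REPLICATION, DATABASE_MIRRORING.
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'MyDatabase';   -- placeholder database name

USE MyDatabase;
GO
-- Back up the log so the active VLFs can be marked reusable...
BACKUP LOG MyDatabase TO DISK = N'C:\Backups\MyDatabase_log.trn'; -- placeholder path
GO
-- ...then shrink by the LOGICAL file name (see sys.database_files),
-- not the physical .ldf path, with a target size in MB.
DBCC SHRINKFILE (N'MyDatabase_log', 512);
```

If log_reuse_wait_desc reports something other than LOG_BACKUP (for example REPLICATION or ACTIVE_TRANSACTION), a log backup alone will not free the VLFs; that wait reason has to be cleared first.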

Error: 17832, Severity: 20, State: 15.

Posted: 02 May 2013 06:40 PM PDT

Message: "The login packet used to open the connection is structurally invalid; the connection has been closed. Please contact the vendor of the client library. IP XX.XX.XX.XX"

I received two of these messages in the log, 20 seconds apart, and am not sure how serious this is. After doing a bit of research I found this:

"SQL Server error log entry: raised when an error log entry above a defined severity is written to the error log. Note that a user may have explicitly caused an error to be written to the log using RAISERROR. Manually raised errors can be given any severity level, and the descriptions below will not necessarily apply."

On further inspection I found that the IP in the error message belongs to a third-party software vendor who has read access to a few of our tables, so it is something to do with their connection. Below is the list of severity levels; I am not sure whether the codes in the message refer to the severity (20) or the state (15):

0-10: Informational messages only.

11-16: Errors caused by users that can be corrected by users, for example "table does not exist in database."

17: Insufficient resources to carry out the statement, for example, running out of locks or no more disk space for the database.

18: Non-fatal internal error that does not cause the connection to terminate, for example a stack overflow during compilation. Level 18 errors may occur, for example, when the SQL Server query processor detects an internal error during query optimization.

19: SQL Server resource problem. A nonconfigurable internal limit has been exceeded (for example, the log file is full) and the current batch is terminated. Severity level 19 errors occur rarely but must be corrected. Note: errors with a severity level of 19 or higher terminate the current batch.

Severity levels 20-25 indicate system problems; these are fatal errors, indicating that the process is no longer running. The process records information about what occurred before terminating, and the client connection to SQL Server closes. Error messages in this range may affect all of the processes in the database and may indicate that a database or object is damaged.

20: Fatal error on the current connection. The current process has encountered a problem; this does not usually mean that the database is damaged. Refer to Microsoft support sites for more information about your specific error.

21: Fatal error on the database. An error has occurred which affects all processes in the current database. A severity level of 21 usually does not mean any database is damaged. You might have to review the contents of system tables and the configuration options to resolve errors of this severity.

22: Table integrity fatal error. Not encountered very often; indicates that table integrity is suspect. Usually related to hardware, but the problem may exist in cache only rather than on disk. Run DBCC CHECKDB to examine the integrity of all tables; running DBCC CHECKDB with the REPAIR option may fix the problem. If restarting does not help, the problem is on the disk. Sometimes destroying the object specified in the error message can solve the problem; for example, if the message tells you that SQL Server has found a row with a length of 0 in a nonclustered index, delete the index and rebuild it.

23: Database integrity fatal error. A problem with the integrity of the entire database; the database will be marked suspect. Run DBCC CHECKDB. These types of errors are usually caused by hardware issues, and more than likely you will have to restore the database from backup. Run DBCC CHECKDB after restoring to ensure that the database is not corrupt.

24: Media failure. Indicates some kind of hardware failure; you might have to reload the database from backup. Run DBCC CHECKDB to check database consistency first. You might also wish to contact your hardware vendor.

Is this something I should be concerned about? Has anyone else received this message who can give me some insight into its severity? I'm going to contact the vendor to see what they were doing at that time.
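A quick way (not mentioned in the post) to check how often the error recurs is to search the error log from T-SQL. xp_readerrorlog is undocumented but widely used; the parameters below are its commonly described ones:

```sql
-- Search the current SQL Server error log (file 0) for error 17832 entries.
EXEC master.dbo.xp_readerrorlog
    0,            -- log file number (0 = current, 1 = previous, ...)
    1,            -- 1 = SQL Server error log, 2 = SQL Agent log
    N'17832';     -- search string to filter on
```

If the entries cluster around a specific time, that timestamp can be matched against the vendor application's own logs.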

Saving Circular and Rectangular path in geography data type

Posted: 02 May 2013 06:37 PM PDT

Hi all, I have an interface (a Google map) where I allow my users to define an area. Google Maps provides rectangle, line, circle and polygon tools to define a path or area, and I have given my users all four options. When a user draws a polygon or a line, I get the relevant points and am able to save them as the geography data type:

    SET @g1 = geography::STGeomFromText('LINESTRING(points that I get from Google)', 4326);
    SET @g1 = geography::STGeomFromText('POLYGON((points that I get from Google))', 4326);

For a circular path, all I get is one centre point and a radius. Similarly, for a rectangle I only get the corner points. Is there a way to convert these to the geography data type?

Thanks & Regards, Krishna
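One approach (a sketch, not confirmed by the post) is to buffer the centre point to approximate the circle, and to spell the rectangle out as a closed five-point ring. The coordinates below are made-up examples:

```sql
DECLARE @circle geography, @rect geography;

-- Circle: buffer the centre point by the radius in metres.
-- STBuffer returns a polygon approximating the circle.
SET @circle = geography::Point(47.6097, -122.3331, 4326).STBuffer(500); -- 500 m radius

-- Rectangle: list the four corners and repeat the first point to close
-- the ring. WKT uses longitude-latitude order.
SET @rect = geography::STGeomFromText(
    'POLYGON((-122.35 47.60, -122.32 47.60, -122.32 47.62, -122.35 47.62, -122.35 47.60))',
    4326);

SELECT @circle.STArea() AS CircleArea, @rect.STArea() AS RectArea;
```

Note that the geography type cares about ring orientation; if a polygon comes out covering "the rest of the world", reverse the order of the points.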

Peer-Peer replication

Posted: 01 May 2013 09:49 PM PDT

I have just created peer-to-peer replication between a couple of test servers, say TestServer1 and TestServer2, with a test database called Test_Peer. First I created a couple of tables and set up a new peer-to-peer publication from TestServer1 to TestServer2 (publication: TestServer1_Peer_TestServer2_SP1). My question is: if I create a new table and want to add it to the peer-to-peer replication, do I need to create a new publication and reconfigure the peer-to-peer topology? And do I need to take a backup every time (after creating the table) before adding it to the replication?
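An existing publication can usually take a new article without being recreated; a sketch, assuming the new table already exists with identical data on every node (peer-to-peer publications cannot be initialized from a snapshot, so the topology typically has to be quiesced first). The table name here is hypothetical:

```sql
-- Run at the publisher, in the published database (Test_Peer).
EXEC sp_addarticle
    @publication   = N'TestServer1_Peer_TestServer2_SP1',
    @article       = N'NewTable',      -- hypothetical new table
    @source_owner  = N'dbo',
    @source_object = N'NewTable';
```

Whether a fresh backup is required depends on how the other peers are initialized; if the table is created and loaded identically on each node before the article is added, no new backup/restore cycle should be needed.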

Help needed to design database for online ticket booking

Posted: 02 May 2013 05:35 PM PDT

I have to create a local database and a remote database, and I want to update both databases whenever a change occurs. For example, in online ticket booking, a user can book a ticket on the web or at the cinema itself; whenever a user books a seat, both databases need to be updated. Please suggest some ideas for this problem. My idea of keeping separate local and remote databases may be wrong, so please help me find a solution.

Xml index

Posted: 02 May 2013 04:49 PM PDT

I have a table with the following columns (I have given the CREATE TABLE command directly):

    CREATE TABLE [dbo].[BalanceTable](
        [AccountID] [int] NOT NULL,
        [Type] [varchar](10) NULL,
        [Date] [date] NOT NULL,
        [Balance] [decimal](15, 2) NULL,
        [TRansactionDr] [decimal](15, 2) NULL,
        [TRansactionCr] [decimal](15, 2) NULL,
        [daily_Balance] [xml] NULL,
        CONSTRAINT [PK_BalanceTable] PRIMARY KEY CLUSTERED ([AccountID] ASC, [Date] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
              ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
    GO

I have an XML query that updates this table, but it runs slowly and takes some seconds, and I want to speed it up, so I was trying to use an XML index. In this case, which index should I use: primary, or secondary (PATH, VALUE or PROPERTY)? Which will be better at reducing the execution time? Below is the query:

    UPDATE BalanceTable
    SET [daily_balance].modify('
        replace value of (/Root/Row[date=''2011-03-23'']/Balance/text())[1]
        with (/Root/Row[date=''2011-03-23'']/Balance)[1] + 1')
    WHERE [AccountID] = 26
      AND [Date] = '31-Mar-2011'
      AND [daily_balance].exist('/Root/Row[date=''2011-03-23'']') = 1;

And here is a sample of the XML column:

    <Root>
      <Row>
        <Rowid>0</Rowid>
        <date>2011-01-23</date>
        <Balance>0.0E0</Balance>
        <TRansactionDr>0.0E0</TRansactionDr>
        <TRansactionCr>0.0E0</TRansactionCr>
      </Row>
      <Maxrowid>0</Maxrowid>
      <Row>
        <Rowid>0</Rowid>
        <date>2011-01-31</date>
        <Balance>123</Balance>
        <TRansactionDr>123</TRansactionDr>
        <TRansactionCr>0.0E0</TRansactionCr>
      </Row>
    </Root>

So please give me suggestions.
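Since the query filters on a path-plus-value predicate via exist(), the usual starting point is a primary XML index plus a secondary PATH index; a sketch (the index names are invented):

```sql
-- A primary XML index is required before any secondary XML index,
-- and it needs the clustered primary key that BalanceTable already has.
CREATE PRIMARY XML INDEX PXML_BalanceTable_daily_Balance
ON dbo.BalanceTable (daily_Balance);

-- A PATH secondary index targets predicates like
-- [daily_balance].exist('/Root/Row[date="..."]') = 1.
CREATE XML INDEX IXML_BalanceTable_daily_Balance_Path
ON dbo.BalanceTable (daily_Balance)
USING XML INDEX PXML_BalanceTable_daily_Balance
FOR PATH;
```

Bear in mind that XML indexes add maintenance cost on every update, and the .modify() call itself still has to rewrite the stored XML, so the indexes mainly help the exist() filter, not the modification.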

email and sms

Posted: 02 May 2013 01:54 AM PDT

I have this code:

    USE IT
    GO
    CREATE TRIGGER Notify
    ON IT.WorkRequest
    FOR INSERT, UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;
        DECLARE @body NVARCHAR(MAX) = N'';
        SELECT @body += char(25) + char(25) + char(4) + char(255)
                      + (Department) + ' ' + (Problem) + ' '
                      + (Completed) + ' ' + (Notes)
        FROM inserted;
        IF EXISTS (SELECT 1 FROM inserted)
        BEGIN
            EXEC msdb.dbo.sp_send_dbmail
                @profile_name = 'Work',
                --@recipients = 'eddie.hall76@gmail.com',
                @recipients = '**********@txt.att.net ; edward_hall76@hotmail.com',
                @subject = 'Data Inserted into WorkRequest',
                @body = @body
        END
    END

It sends the message to email with the information that was inserted into the table, but the message sent to the phone is blank. Any suggestions?

Creating Performance Dashboard reports on 2008 R2 and 2012

Posted: 02 May 2013 03:54 AM PDT

Hi all, I have a requirement to create dashboard reports for all our SQL Servers: a performance report (CPU, memory, bottlenecks, disk space growth, costly queries, and so on) sent to all DBAs every morning. We might schedule the job weekly or daily depending on necessity. Can someone please guide me on how to start? Most of the servers are 2008 R2 and 2012. Also, please let me know how the Microsoft SQL Server 2012 Performance Dashboard Reports would help in this scenario: http://www.microsoft.com/en-us/download/details.aspx?id=29063 Thanks all.

SQL SERVER 2000 Analysis Databases are not visible after P2V

Posted: 02 May 2013 08:34 AM PDT

Hi all, we recently moved one of our servers from physical to virtual. It is SQL Server 2000, and after migrating from physical to virtual we are not able to see the Analysis Services databases in Analysis Manager. Any ideas? Thanks

SSMS Stored procedure help

Posted: 02 May 2013 10:49 AM PDT

Hi professionals. We have software written by a third-party vendor, and I am trying to read one of their stored procedures under Programmability > Stored Procedures by right-clicking and choosing Modify. The problem is that Modify is grayed out and I cannot click it. So my two questions are: 1. Can this procedure be read in any way, or is it not possible? 2. How can I compile one of my own procedures so that it is grayed out too? Thanks
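Modify is typically grayed out because the vendor created the procedure WITH ENCRYPTION. A sketch of both sides of the question (the procedure name is hypothetical):

```sql
-- Question 2: creating a procedure whose text is hidden
-- (Modify grays out in SSMS).
CREATE PROCEDURE dbo.usp_Secret   -- hypothetical procedure name
WITH ENCRYPTION
AS
BEGIN
    SELECT 1 AS Answer;
END
GO

-- Question 1: checking which procedures are encrypted; the stored
-- definition comes back NULL for them, so they cannot be read
-- through normal means.
SELECT OBJECT_NAME(object_id) AS ProcName
FROM sys.sql_modules
WHERE definition IS NULL;
```

Note that WITH ENCRYPTION is obfuscation rather than real protection; it hides the text from SSMS and sys.sql_modules but is not a security boundary.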

Low Batch Requests/Sec?

Posted: 20 Jun 2011 07:31 PM PDT

Hi all. System info: server with 12 GB RAM and 2 quad-core processors; hard drive C (RAID 1), hard drive E (RAID 10); 1 Gb NIC. SQL Server 2008 R2, with all system DBs on drive C and one user DB with its log file on drive C and data file on drive E. The application is Dynamics AX 2009. ;)

I have a report which is taking more than 30 minutes to execute! I started Activity Monitor and noticed that batch requests/sec was about 1500-2000, with low CPU utilisation. Looking at Task Manager, the 1 Gb network adapter was being utilised at about 3%. For those not familiar with AX: the report is processed on the AOS server (another server), which talks to the DB server; a lot of SQL statements are executed on the DB server and the results are sent back to the AOS server, which eventually sends the report back to the client. Given the low CPU utilisation, I started suspecting that the DB server is not being fully utilised, and that since batch requests/sec is low it is not being hit hard enough by the AOS server. For a Gbit NIC, what network utilisation should I expect? Is there any guideline? Regards, Brian

how to create a virtual view from history records?

Posted: 02 May 2013 06:20 AM PDT

Hello - I'm working on a database that has an IndividualAddress table. When an address in the IndividualAddress table is updated, the system inserts a new row into the table. For example:

    IndividualId  City         State  Zip    LastModified
    ------------  -----------  -----  -----  ------------
    12345         Dallas       TX     75204  01/01/2010
    12345         Los Angeles  CA     90069  02/01/2011

All the individual addresses and address changes are stored in this table in this way. So how can I find all individuals who currently live in TX by querying this table? If I just search WHERE State = 'TX', it will pick up the individual in my example above even though he moved to Los Angeles, CA on 02/01/2011.
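A common pattern for this (a sketch, using the table and column names from the post) is to number each individual's rows newest-first and keep only row 1 before filtering on state:

```sql
WITH Latest AS (
    SELECT IndividualId, City, State, Zip, LastModified,
           ROW_NUMBER() OVER (PARTITION BY IndividualId
                              ORDER BY LastModified DESC) AS rn
    FROM dbo.IndividualAddress
)
SELECT IndividualId, City, State, Zip
FROM Latest
WHERE rn = 1        -- each individual's most recent address...
  AND State = 'TX'; -- ...and only then filter on the current state
```

For the example data, the individual 12345 gets rn = 1 on the Los Angeles row, so the TX filter correctly excludes him.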

Need help rebuilding a SQL server with replicated databases

Posted: 02 May 2013 12:09 AM PDT

We have a production SQL server in a virtualized environment, and we need to move it to a physical server to address performance issues with an application that accesses the database. The server has three user databases, two of which are replicated. We are using one-way transactional replication, and the distributor is on a different server. We use the replicated copies of the databases for reporting, and the published databases are currently approaching 200 GB in size. We have made some changes to replication to make reporting easier (we convert XML stored in varchar(max) columns to xml during replication, and have persisted computed columns added on the subscriber). Overall, we would expect to spend a couple of days rebuilding replication if it comes down to that, but we would prefer to simply replace the publisher with the new physical server and continue on. Does anyone have suggestions for a process that would allow us to do that, or will we just need to bite the bullet and rebuild replication after the publisher is recreated? Thanks, Kevin

Data warehouse project

Posted: 02 May 2013 05:17 AM PDT

My boss assigned me to work as a data warehouse manager. I am new to this; where should I start? Please recommend some links and tools.

Primary and Secondary database model

Posted: 02 May 2013 06:54 AM PDT

Our current application runs on ASP.NET and SQL Server 2008. Due to a business decision, the application will use a new database as of July 1st; it is basically a copy of the old database, but with no data restored from the previous database. However, the application may still need to contact the old database for a few months. One option would be to set up two different systems, one pointing to the old database and the other pointing to the new one, so we can flip the switch to the new DB on July 1st; no issues there. I am wondering, though: is there any feature in SQL Server by which two databases can be used in a primary (new DB) and secondary (old DB) model, so that our operations first go to the primary DB and otherwise fall back to the secondary, without changing any code in the stored procedures or the application?

Using SQL Management Query in Visual Studio

Posted: 02 May 2013 02:42 AM PDT

I am trying to use a query that works in SQL Server Management Studio in Visual Studio, but the code is not translating. Can anyone help me fix this? Here is the code from Management Studio:

    DECLARE @StartDate DATETIME = '04/01/2013'
    DECLARE @EndDate DATETIME = '04/30/2013 23:59:59'
    DECLARE @UTCTimeDiff INT = -1 * (SELECT DATEDIFF(hh, GETDATE(), GETUTCDATE()))

    SELECT CASE WHEN MAX(TransactionDate) IS NOT NULL
                THEN CONVERT(VARCHAR, DATEADD(hh, @UTCTimeDiff, MAX(TransactionDate)))
                ELSE 'No Data Present'
           END AS 'Data Freshness'
    FROM TotalInvoicing..TI_TransactionReporting (nolock)

I believe it has something to do with the DECLARE statements, but I am unsure how to remedy this error. Any help is greatly appreciated. Thank you! -Molly
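One possible cause (an assumption, since the post does not include the actual error) is that the Visual Studio query designer cannot parse a multi-statement batch with DECLAREs. Since @StartDate and @EndDate are not referenced in the posted snippet, the query can collapse to a single statement with the UTC offset computed inline:

```sql
-- Single-statement version: no DECLAREs, the UTC offset is computed inline.
SELECT CASE WHEN MAX(TransactionDate) IS NOT NULL
            THEN CONVERT(VARCHAR(30),
                 DATEADD(hh, -1 * DATEDIFF(hh, GETDATE(), GETUTCDATE()),
                         MAX(TransactionDate)))
            ELSE 'No Data Present'
       END AS [Data Freshness]
FROM TotalInvoicing..TI_TransactionReporting WITH (NOLOCK);
```

If the date variables are needed for a WHERE clause later, they can be supplied as client-side parameters (e.g. @StartDate, @EndDate bound from the application) rather than declared in the batch.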

Various connections for the same session

Posted: 02 May 2013 02:46 AM PDT

Hello: I'm working with SQL Server 2008 R2, and when querying the sys.dm_exec_connections and sys.dm_exec_sessions views I have found something strange: various connections for the same session. I have copied a piece of the result, showing three connections sharing a session, into an attached text file. Could someone explain this? Could it be related to a client application using multiple threads? Thanks in advance, Ignacio
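To reproduce the observation, the two DMVs from the post can be joined to list sessions carrying more than one connection (a sketch):

```sql
SELECT s.session_id,
       s.program_name,
       COUNT(c.connection_id) AS connection_count
FROM sys.dm_exec_sessions s
JOIN sys.dm_exec_connections c
  ON c.session_id = s.session_id
GROUP BY s.session_id, s.program_name
HAVING COUNT(c.connection_id) > 1;  -- sessions with several connections
```

When MARS (Multiple Active Result Sets) is involved, the extra rows in sys.dm_exec_connections carry a non-NULL parent_connection_id pointing back at the session's main connection, which is one way to tell that case apart.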

can i set an update only if SP is successful?

Posted: 02 May 2013 03:05 AM PDT

Hi, I have a stored procedure which updates a flag before the SELECT statement for a report in SSRS. If for some reason the job dies in the middle, I don't want the update to happen. Can I control it so that the update only takes effect at the end? Thanks
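One way to get that behaviour (a sketch; the procedure, flag table and column names are invented) is to wrap the flag update and the report SELECT in one transaction with TRY/CATCH, so the update only commits if the whole procedure succeeds:

```sql
CREATE PROCEDURE dbo.usp_ReportWithFlag   -- hypothetical procedure
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        UPDATE dbo.ReportFlags            -- hypothetical flag table
        SET IsProcessed = 1
        WHERE ReportName = N'MyReport';

        SELECT GETDATE() AS ReportData;   -- placeholder for the report's SELECT

        COMMIT TRANSACTION;               -- the flag persists only on success
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;         -- job died mid-way: the update is undone
        DECLARE @msg NVARCHAR(2048) = ERROR_MESSAGE();
        RAISERROR(@msg, 16, 1);           -- re-raise so SSRS sees the failure
    END CATCH
END
```

On SQL Server 2012 or later, THROW can replace the RAISERROR re-raise.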

How could i find the client?

Posted: 02 May 2013 02:01 AM PDT

I have a consolidated SQL Server machine, and no applications run on it, but I get an error like the one below:

"The server will drop the connection, because the client driver has sent multiple requests while the session is in single-user mode. This error occurs when a client sends a request to reset the connection while there are batches still running in the session, or when the client sends a request while the session is resetting a connection. Please contact the client driver vendor."

How can I find the client driver? Which events of a server-side trace or Profiler trace should I use?
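Before reaching for Profiler, the session and connection DMVs already expose the client's machine, driver and address (a sketch):

```sql
SELECT s.session_id,
       s.host_name,                -- machine the client connected from
       s.program_name,             -- what the driver reports as the application
       s.client_interface_name,    -- e.g. ODBC, '.Net SqlClient Data Provider'
       c.client_net_address        -- client IP address
FROM sys.dm_exec_sessions s
JOIN sys.dm_exec_connections c
  ON c.session_id = s.session_id
WHERE s.is_user_process = 1;
```

In a trace, the same information appears in the HostName, ApplicationName and ClientProcessID columns of events such as Audit Login.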

Peer-to-peer replication

Posted: 30 Apr 2013 01:23 AM PDT

Question: if I insert a row on one server in the peer-to-peer topology, is it possible that an update is attempted on that same row on another server before the insert has been replicated over? In other words, is the replication synchronous or asynchronous?

Stored procedure not giving desired output

Posted: 02 May 2013 12:23 AM PDT

I am getting data for the current year even after adding a date range. I have tried everything but cannot find what I did wrong.

[code]
CREATE PROCEDURE [dbo].[procClosingStock]
    -- Add the parameters for the stored procedure here
    ( @ITEM_ID1 int
      --@PreviousYear date,
      --@currentYear date
    )
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    create table #TempClosingStock
    ( Closing_Soh int,
      item_id int,
      created_date date
    )

    insert into #TempClosingStock (Closing_Soh, item_id, created_date)
    select SUM(stock_quantity), item_id, Created_Date
    from [Stock_Header]
    where Stock_Type = 'stock in' and item_id = @ITEM_ID1
    group by Item_ID, Created_Date

    update #TempClosingStock
    set Closing_Soh = Closing_Soh - ISNULL((select SUM(stock_quantity)
                                            from [Stock_Header]
                                            where Stock_Type = 'stock out'
                                              and item_id = @ITEM_ID1
                                            group by Item_ID), 0)

    -- Insert statements for procedure here
    SELECT Closing_Soh, item_id, created_date
    from #TempClosingStock
    where item_id = @ITEM_ID1
      and created_date between '2012-04-01T00:00:00.000' AND '2013-01-03T00:00:00.000'

    drop table #TempClosingStock
END
[/code]

Tricks to check if a Restored Database has SQL Replication remnants

Posted: 02 May 2013 12:56 AM PDT

We have clients that will restore a production database onto a development/testing server. One of the options we have configured is SQL replication, and because of that there are frequently issues, because the publishers are not dropped after the restore has completed. We have done our best to automate this process by providing instructions and scripts and the like, but as with many things, some of the instructions are not followed and problems occur. To my question: does anyone know of a way to check via T-SQL whether a database is configured for replication? I know how to drop replication for a database using the [code="sql"]sp_removedbreplication[/code] command, but I'd like to make it dynamic if possible, so that the check is done before just blanket-issuing the command. Any and all comments are appreciated.
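sys.databases exposes replication flags that make this check straightforward; a sketch of both the check and a dynamic cleanup:

```sql
-- Databases still carrying replication settings after a restore.
SELECT name
FROM sys.databases
WHERE is_published = 1
   OR is_merge_published = 1
   OR is_subscribed = 1
   OR is_distributor = 1;

-- Dynamic cleanup: issue sp_removedbreplication only where needed.
DECLARE @sql NVARCHAR(MAX) = N'';
SELECT @sql += N'EXEC sp_removedbreplication N''' + name + N''';' + CHAR(10)
FROM sys.databases
WHERE is_published = 1 OR is_merge_published = 1 OR is_subscribed = 1;
EXEC (@sql);
```

Checking the flags first means the script is safe to run unconditionally on a restored server, which fits the "instructions may not be followed" scenario.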

SQL migration

Posted: 05 Mar 2013 11:01 PM PST

What is the best way to copy logins, users, and logins with passwords, along with the exact enabled/disabled status, from a source 2008 server to a destination 2008 server?

How SQL Server inserts record into CDC table (with Latency)

Posted: 01 May 2013 09:00 PM PDT

Hi all, I have a question related to CDC latency in SQL Server 2008 R2. Suppose I have two tables, T1 and T2, with the following schema:

    T1: ID (int), Name (varchar), InsertionDate (datetime), Rowversion (timestamp)
    T2: ID (int), Name (varchar), T1ID (int), InsertionDate (datetime), Rowversion (timestamp)

CDC is enabled on both of these tables. Now I insert one record into each table (assuming there is 5 minutes of CDC latency):

    T1: ID = 1,   Name = 'test1',  InsertionDate = '2013-05-02 01:00:00', Rowversion = 0x0001
    T2: ID = 100, Name = 'testt1', T1ID = 1, InsertionDate = '2013-05-02 01:00:30', Rowversion = 0x0020

(Note: the record is inserted into T1 first, and 30 seconds later into T2.) As we know, CDC picks the data up from the transaction log, so my question is: is it possible for CDC to commit the data into T2_CT first, and only then into T1_CT?

SQL 2k5 to 2k8 - Migrate Logins

Posted: 01 May 2013 08:58 PM PDT

Migrating from SQL 2005 to SQL 2008, I am using sp_help_revlogin to migrate all logins. For most user databases the user-to-login mapping got copied, but for the system databases it does not copy. How can I achieve this? The stored procedure is given at http://support.microsoft.com/kb/918992
