Tuesday, March 19, 2013

[how to] clusterdb stopped working for PostgreSQL database in Ubuntu 12.04 server

clusterdb stopped working for PostgreSQL database in Ubuntu 12.04 server

Posted: 19 Mar 2013 09:02 PM PDT

After a fresh PostgreSQL install (version 9.2) on my Ubuntu server 12.04, clusterdb utility stopped working. I used the same config files as I used with version 9.1.

Below is the actual command:

clusterdb -v -h localhost -p <port#> -U postgres -d <dbname>  

And this is the response that I'm getting:

clusterdb: could not connect to database <dbname>: could not connect to server: Connection refused
Is the server running on host "localhost" (127.0.0.1) and accepting
TCP/IP connections on port <port#>?

I'm able to connect to the server locally and from outside. I can also use other utilities such as pg_dump with no problems.

My pg_hba.conf file below:

# Database administrative login by Unix domain socket
local   all   postgres   peer
# TYPE  DATABASE  USER  ADDRESS  METHOD
# "local" is for Unix domain socket connections only
local   all   all   peer
# IPv4 local connections:
host    all   all   127.0.0.1/32   md5
# IPv6 local connections:
host    all   all   ::1/128        md5
# Allow replication connections from localhost, by a user with the replication privilege.
local   all   postgres   md5
host    all   all   x.x.x.x/x   md5   # my network
host    all   all   x.x.x.x/x   md5   # internal ip of the server

Were there any security changes implemented in PostgreSQL 9.2 vs. 9.1 that may prevent clusterdb from working properly?

Appropriate Action for Dead Locks / Blocking

Posted: 19 Mar 2013 08:00 PM PDT

I've set up my databases so that anytime a procedure is blocked for more than 45 seconds the database notifies the DBA email. Is it bad practice to set up a way to auto-kill the process that is doing the blocking? I'm assuming yes; however, waiting until an off-hours DBA can get to a computer and fix the problem seems problematic as well. Is there a better way to handle blocking processes?
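For what an auto-kill job might look like, here is a hedged sketch using sys.dm_exec_requests; the 45-second threshold and the decision to execute rather than just report the KILL statements are assumptions, not recommendations.

```sql
-- Sketch: find sessions that have had others blocked on them for over 45
-- seconds and generate KILL statements for review (e.g. from an Agent job).
DECLARE @kill nvarchar(max) = N'';

SELECT @kill = @kill + N'KILL ' + CAST(r.blocking_session_id AS nvarchar(10)) + N'; '
FROM sys.dm_exec_requests AS r
WHERE r.blocking_session_id <> 0
  AND r.wait_time > 45000              -- blocked longer than 45 seconds
GROUP BY r.blocking_session_id;

PRINT @kill;   -- inspect first; EXEC sp_executesql @kill would actually kill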

SQL Server 2008 R2 Restore COPY_ONLY full backup with transaction logs

Posted: 19 Mar 2013 07:11 PM PDT

After doing some research I cannot seem to find an answer to this question.

Background: I am attempting to set up a backup plan that fits the following requirements:
1) Reliability of backups, having nightly full backups
2) Transaction log backups that could be restored from
3) A low amount of disc space used
4) The backups must be accessible locally for an auditing tool

So to fit those needs I am thinking full backups weekly, differential daily, and transaction log hourly. Then each night a COPY_ONLY backup would run that can be shipped offsite. This backup is done so that the log chain is not broken, and we have reliable nightly full backups offsite, without having to eat up so much local disc space.

Question: Is it possible to restore from the COPY_ONLY backup, and restore the transaction logs after?

Let me just give an example so you know what I am talking about.

Using the below list I am wondering if it is possible to restore FullbackupCOPY_ONLYC.bak followed by TransactionbackupG.trn, TransactionbackupH.trn, finally TransactionbackupI.trn

---List of Backups---
FullbackupA.bak 01/01/2013 00:00:00
DifferentialbackupA.bak 02/01/2013 00:00:00
FullbackupCOPY_ONLYA.bak 02/01/2013 00:00:00
TransactionbackupA.trn 02/01/2013 01:00:00
TransactionbackupB.trn 02/01/2013 02:00:00
TransactionbackupC.trn 02/01/2013 03:00:00
DifferentialbackupB.bak 03/01/2013 00:00:00
FullbackupCOPY_ONLYB.bak 03/01/2013 00:00:00
TransactionbackupD.trn 03/01/2013 01:00:00
TransactionbackupE.trn 03/01/2013 02:00:00
TransactionbackupF.trn 03/01/2013 03:00:00
DifferentialbackupC.bak 04/01/2013 00:00:00
FullbackupCOPY_ONLYC.bak 04/01/2013 00:00:00
TransactionbackupG.trn 04/01/2013 01:00:00
TransactionbackupH.trn 04/01/2013 02:00:00
TransactionbackupI.trn 04/01/2013 03:00:00

Maybe this whole setup is not reasonable; I am fairly new to SQL Server and am trying to learn as I go. Any advice/help would be appreciated.
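A sketch of what the restore sequence in the example would look like, assuming the log chain is intact (a COPY_ONLY full backup does not reset the differential base or break the log chain, so logs taken after it can be restored on top of it); the database name MyDB is hypothetical:

```sql
-- Restore the COPY_ONLY full, leaving the database able to accept logs,
-- then roll the subsequent log backups forward.
RESTORE DATABASE MyDB FROM DISK = 'FullbackupCOPY_ONLYC.bak' WITH NORECOVERY, REPLACE;
RESTORE LOG MyDB FROM DISK = 'TransactionbackupG.trn' WITH NORECOVERY;
RESTORE LOG MyDB FROM DISK = 'TransactionbackupH.trn' WITH NORECOVERY;
RESTORE LOG MyDB FROM DISK = 'TransactionbackupI.trn' WITH RECOVERY;
```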

What settings should I use in my.cnf

Posted: 19 Mar 2013 09:01 PM PDT

We have about 5-6 different internal webapps (staff use only) that we will be transferring across to a new virtual server, however I need to setup our my.cnf settings first and I am lost (I am web designer - so this is a little out of my league!).

Our internal systems have about 100 tables apiece, using primarily InnoDB but in some cases MyISAM. Also, one of our webapps has quite a lot of data (100,000+ records) and the SQL queries used to select the data are quite inefficient (few indexes used, inner SELECTs, etc.). Roughly about 50 users at any one time.

Our new server will be the HS-250 from HeartInternet.
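As a purely hypothetical starting point (the right values depend on how much RAM the HS-250 actually has, which isn't stated here), a my.cnf for a mixed InnoDB/MyISAM workload tends to revolve around a handful of settings:

```ini
# Hypothetical starting values only -- tune to available RAM.
[mysqld]
innodb_buffer_pool_size = 1G    # the single biggest knob for InnoDB-heavy loads
innodb_log_file_size    = 128M
key_buffer_size         = 64M   # MyISAM index cache
max_connections         = 150
tmp_table_size          = 64M
max_heap_table_size     = 64M
slow_query_log          = 1     # helps find the inefficient SELECTs mentioned above
```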

Many thanks for the help,

Can't start SQL services

Posted: 19 Mar 2013 04:03 PM PDT

So I am having issues starting my SQL services. Currently startup is failing due to:

FileMgr::StartLogFiles: Operating system error 2(The system cannot find the file specified.) occurred while creating or opening file 'E:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\modellog.ldf'. Diagnose and correct the operating system error, and retry the operation.

The issue lies in the fact that this .ldf file is not on the E: drive; there is no E: drive. The file is located on the F: drive.

I cannot find anywhere to tell SQL Server to change this. I cannot drop/move/detach the databases, because I can't even get the service to start in order to connect.

This is happening on 3 different instances of SQL, after volume moves. Any ideas?

SQL 2008
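One possible approach, sketched here as an assumption rather than a confirmed fix: start the instance with trace flag 3608 (recover only master), which lets it come up even though model's log path is wrong, then repoint the file and restart normally. The F: path below mirrors the question's layout and is illustrative.

```sql
-- After starting the service with: net start MSSQLSERVER /f /T3608
-- repoint model's log file to where it actually lives, then restart:
ALTER DATABASE model
MODIFY FILE (NAME = modellog,
             FILENAME = 'F:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\modellog.ldf');
```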

SQL Server 2008 Restore

Posted: 19 Mar 2013 06:49 PM PDT

I'm trying to restore from a database backup to SQL Server 2008. The backup came from a different server. I'm executing the following query:

RESTORE DATABASE Database_Name
FROM DISK = 'C:\Path\To\Backup.file'
WITH REPLACE
GO

If the database does not exist, I'm getting the following error:

Msg 3118, Level 16, State 1, Line 1
The database "Database_Name" does not exist. RESTORE can only create a database when restoring either a full backup or a file backup of the primary file.
Msg 3013, Level 16, State 1, Line 1
RESTORE DATABASE is terminating abnormally.

The file is a full backup.

If the database does exist, I get the following error:

Msg 3154, Level 16, State 4, Line 1
The backup set holds a backup of a database other than the existing 'Database_Name' database.
Msg 3013, Level 16, State 1, Line 1
RESTORE DATABASE is terminating abnormally.

From what I understand, WITH REPLACE is supposed to be a workaround for this.

I'm stumped.
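A hedged diagnostic step that often clarifies cases like this: inspect what is actually inside the backup file before restoring, then restore with explicit MOVE clauses (since a backup from a different server may reference paths that don't exist locally). The logical names and target paths below are placeholders to be filled from the FILELISTONLY output.

```sql
-- See what backup sets and files the .bak actually contains.
RESTORE HEADERONLY   FROM DISK = 'C:\Path\To\Backup.file';
RESTORE FILELISTONLY FROM DISK = 'C:\Path\To\Backup.file';

-- Then restore, relocating the files explicitly.
RESTORE DATABASE Database_Name
FROM DISK = 'C:\Path\To\Backup.file'
WITH REPLACE,
     MOVE 'LogicalDataName' TO 'C:\Data\Database_Name.mdf',  -- names from FILELISTONLY
     MOVE 'LogicalLogName'  TO 'C:\Data\Database_Name.ldf';
```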

Is it ever harmful to set a database owner in SQL Server?

Posted: 19 Mar 2013 04:32 PM PDT

There are a bunch of databases on one of our SQL servers that have no owner. Generally speaking, is it harmful to give them one? For example:

USE dbName;
GO

EXEC sp_changedbowner 'sa';
GO

I know sa may not be the best choice, but it is just an example. My primary concern is whether or not creating an owner when there is none may cause issues with software that can currently connect OK.

Custom sp_who/sp_whoUsers

Posted: 19 Mar 2013 02:50 PM PDT

I need to allow a client in a dev DW SQL 2K8R2 environment to view and kill processes, but I do not want to grant VIEW SERVER STATE to this person (he's a former SQL DBA and is considered a potential internal threat).

When I run the following, it returns one row as if the user ran the sp themselves with their current permissions.

USE [master]
GO

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

CREATE PROCEDURE [dbo].[usp_who] WITH EXECUTE AS OWNER
AS
BEGIN
    SET NOCOUNT ON;
    EXEC master.dbo.sp_who;
END

Changing the "with execute as" to "self" (I'm a sysadmin) returns the same results. I've also tried the below instead of calling sp_who, and it only returns one row.

select * from sysprocesses  

It seems that the context isn't switching, or persisting, throughout the execution of the procedure. And this is to say nothing of how I'm going to allow this person to "kill" processes.

Does anyone have a solution or some suggestions for this seemingly unique problem?
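The likely explanation for the single-row result is that EXECUTE AS in a stored procedure switches only the database-level user context, while sp_who and sysprocesses need a server-level (login) permission. One known pattern for granting a server-level permission to a procedure is certificate signing; the sketch below uses hypothetical names and assumes a database master key exists in master.

```sql
-- Sketch of the certificate-signing approach: the certificate's login
-- carries VIEW SERVER STATE, and signing usp_who lends that permission
-- to callers of the procedure only.
USE master;
CREATE CERTIFICATE WhoCert WITH SUBJECT = 'Sign usp_who';
CREATE LOGIN WhoCertLogin FROM CERTIFICATE WhoCert;
GRANT VIEW SERVER STATE TO WhoCertLogin;
ADD SIGNATURE TO dbo.usp_who BY CERTIFICATE WhoCert;
```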

sysadmin cannot drop login when ddl audit trigger enabled

Posted: 19 Mar 2013 12:19 PM PDT

I have a DDL trigger defined for database- and server-level events in SQL Server 2008 R2 (Standard) which exists to log all DDL events into an audit table in an admin database. Its only function is to extract the relevant data from EVENTDATA() and insert it into this table. For the purpose of the insert (as only sysadmins can access the admin database), I have created a dedicated SQL login with only INSERT permission on this table, and granted IMPERSONATE on this login to public. This is intended to prevent permissions-related errors when the trigger fires and attempts to insert into the audit table while the caller does not have the necessary access to the database/table.

Here is the trigger definition:

CREATE TRIGGER [ddl_log]
ON ALL SERVER
FOR DDL_DATABASE_LEVEL_EVENTS, DDL_SERVER_LEVEL_EVENTS
AS
BEGIN
    DECLARE @data XML
    SET @data = EVENTDATA()

    EXECUTE AS LOGIN = '<dummy login>'
    INSERT admin.dbo.ddl_audit (PostTime, DB_User, [Event], [TSQL], Host, DatabaseName)
    VALUES (
        GETDATE(),
        CONVERT(NVARCHAR(100), ORIGINAL_LOGIN()),
        @data.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)'),
        @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'nvarchar(2000)'),
        CONVERT(NVARCHAR(100), HOST_NAME()),
        @data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'nvarchar(100)')
    );
    REVERT
END

I have had no issues with this trigger since it was implemented months ago. However, now it appears to be preventing even a sysadmin from executing an ALTER LOGIN, DROP LOGIN, etc. under certain circumstances, as follows:

My environment also includes MS Dynamics GP 2010 (Great Plains). Great Plains allows an admin to manage users, and for each new Great Plains user, the software creates a SQL login for that user in SQL Server. Resetting a password in the Great Plains interface resets the SQL password. And so forth...

However, even if logged into Great Plains as 'sa', as long as the above trigger is enabled any attempt to alter or drop a login fails with error 15151 (Cannot alter the login 'loginname', because it does not exist or you do not have permission). If I disable the trigger, everything works normally.

The same operations executed in SSMS, or through some other interface, are successful, even for non-sysadmins who have some level of DDL permissions. It only fails when performed in Great Plains. A profiler trace of the operation shows that GP is merely submitting a standard T-SQL 'ALTER LOGIN' or 'DROP LOGIN' statement, and that the statement correctly shows as called by the sa account. It does not appear that the session ever switched to a different context, other than for the insert into the audit table (which it never actually got to, as no record was logged for the statement). And just in case the session somehow was maintaining the wrong context after that impersonation, I tried making the dummy-insert login a sysadmin with no success.

My question is, are there certain combinations of SET options/connection settings, etc. that could result in this type of behavior or issues with DDL trigger execution that could prevent a sysadmin from performing certain operations? Or is there some other avenue to investigate that I am completely missing?

Create Temp Table From Stored Procedure Result

Posted: 19 Mar 2013 12:23 PM PDT

Is it possible to create a temp table from a stored procedure result set without having to specify column datatypes? If not, is it possible to create a physical table at least from the SP? The reason I am asking is because I have an SP that returns too many columns for me to declare in a create table command.

Is there something like

create table from

where the columns do not need to be specified and SQL automatically determines the column attributes?

I am running SQL Server 2008.
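One known workaround (sketched here with a hypothetical procedure name) is SELECT ... INTO over OPENROWSET, which lets the engine infer the column list from the procedure's result set; it assumes the 'Ad Hoc Distributed Queries' option is enabled and that the loopback connection string matches your instance.

```sql
-- SELECT INTO creates #results with columns inferred from the proc's
-- result set, so nothing has to be declared by hand.
SELECT *
INTO #results
FROM OPENROWSET('SQLNCLI',
                'Server=(local);Trusted_Connection=yes;',
                'EXEC dbo.MyBigProc') AS r;
```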

trap xp_logininfo stored procedure error

Posted: 19 Mar 2013 12:03 PM PDT

SQL Server 2000, Windows Server 2003.

I am trying to find all the invalid domain logins created in a SQL Server 2000 instance using xp_logininfo in a script, and I get the message "Could not obtain information about Windows NT group/user". How can I trap this error in T-SQL scripts and continue? Or is there any other way to validate a domain id against an AD group?
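Since SQL Server 2000 predates TRY/CATCH, about the only in-script options are the procedure's return code and @@ERROR. Whether this actually traps the message depends on the severity xp_logininfo raises, so treat this as a sketch to verify, with a hypothetical account name:

```sql
-- Check the return code and @@ERROR after each call; a nonzero value
-- flags the account as invalid and the loop can move on.
DECLARE @rc int
EXEC @rc = master.dbo.xp_logininfo 'DOMAIN\someuser'
IF @rc <> 0 OR @@ERROR <> 0
    PRINT 'DOMAIN\someuser could not be resolved against the domain'
```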

Thanks

Pan

Load sybase .db into new sybase developer server

Posted: 19 Mar 2013 02:06 PM PDT

We've got a few .db files lying around here which were created by Sybase (version 8.x, I guess), and I have the task of extracting some data from those old databases.

Now I've installed the latest Sybase developer version (15.7) and want to add these databases to the sybase server.

Unfortunately I do not have any experience with sybase and I couldn't find any documentation in the official sybase manual for this task.

Is this even possible? Does anyone of you know about an article describing the integration of existing complete databases or the necessary commands for the task?

INNODB real-time locks?

Posted: 19 Mar 2013 11:46 AM PDT

How can I view real-time locks in innodb when information_schema.innodb_locks is not available?
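When the information_schema lock tables are unavailable (they only arrived with the InnoDB plugin), the usual fallback is the InnoDB monitor output; this is a sketch of where to look rather than a complete diagnostic recipe:

```sql
-- The TRANSACTIONS section lists each transaction's lock structs and any
-- "waiting for this lock to be granted" entries.
SHOW ENGINE INNODB STATUS\G
```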

Investigating high SQL waits, especially LCK_M_U

Posted: 19 Mar 2013 01:26 PM PDT

Recently I have been struggling with SQL server performance and although we have fixed a huge multitude of basic errors in the config it is still performing badly. To clarify it is not overall performance but rather fairly frequent time outs (from the client application). Previously I had looked to the memory as a cause but this has now been resolved and we are still getting the same behaviour.

Looking at the graphs from Management Data Warehouse I can see that LCK_M_U/X/IX are causing the majority of our waits around the time a user experiences a timeout. Everything I am reading states I need to look at the queries and processes running but I have yet to find anything aimed at a level I can understand. The locks, as you can see in the picture below, seem to spike which coincides with the error on the users side. Is there a clever DMV or some such that I can address to try and work out what query is being run that is creating the lock? Is it a case of trawling through a trace to find the details? Any guidance greatly appreciated and apologies if the information is not clear.

[screenshot: Management Data Warehouse wait-statistics graph showing LCK_M_* spikes]
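There is indeed a DMV pairing for this; a sketch of a query that shows who is blocked, who is blocking, and the blocked statement's text (run it while the spike is happening):

```sql
-- Live view of lock waits: blocked session, the session blocking it,
-- how long it has waited, and the SQL text of the blocked request.
SELECT wt.session_id,
       wt.blocking_session_id,
       wt.wait_type,
       wt.wait_duration_ms,
       st.text AS blocked_sql
FROM sys.dm_os_waiting_tasks AS wt
JOIN sys.dm_exec_requests    AS r ON r.session_id = wt.session_id
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS st
WHERE wt.wait_type LIKE 'LCK%';
```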

Update strings for all cmdexec Jobs

Posted: 19 Mar 2013 11:47 AM PDT

Does anyone know if it's possible, (and furthermore advisable) to update all cmdexec strings in jobs on a server?

We'd like to change our logging location, and in principle I'd think this could be done very simply by changing the /O "\\LOCATION\file.log" strings.

I can see how you'd do that in fact

USE msdb
GO
UPDATE dbo.sysjobsteps
SET [command] = REPLACE([command], '\\OLDLOCATION\... ', '\\NEWLOCATION\... ')
WHERE ...

BUT I've had it drummed into me that you should never update the sys tables manually, so the idea of doing this makes me beyond nervous. For anything else I'd use the dbo.sp_update_job proc (http://msdn.microsoft.com/en-gb/library/ms188745.aspx for those who've strayed here looking for something else job related), but that procedure doesn't seem to be able to handle this.

I have a hunch that this might possibly be an exception to the "Never Update the Sys Tables" rule though? I can't imagine that the cmdexec string is involved in any further relations in the way that, for example, enabled/disabled status evidently is. Wishful thinking?

So I don't know for sure, and I can't seem to find out. Anyone know or have any experience with this?
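One supported middle ground, sketched here: read sysjobsteps (reading is harmless) but push the changes through msdb.dbo.sp_update_jobstep, which does accept a @command parameter, so the Agent's own plumbing stays consistent. The OLDLOCATION/NEWLOCATION names mirror the question's placeholders.

```sql
-- Cursor over CmdExec steps that reference the old path and update each
-- one via the supported procedure instead of a direct UPDATE.
DECLARE @job uniqueidentifier, @step int, @cmd nvarchar(max);
DECLARE c CURSOR FOR
    SELECT job_id, step_id, command
    FROM msdb.dbo.sysjobsteps
    WHERE subsystem = 'CmdExec'
      AND command LIKE '%\\OLDLOCATION\%';
OPEN c;
FETCH NEXT FROM c INTO @job, @step, @cmd;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = REPLACE(@cmd, '\\OLDLOCATION\', '\\NEWLOCATION\');
    EXEC msdb.dbo.sp_update_jobstep @job_id = @job, @step_id = @step, @command = @cmd;
    FETCH NEXT FROM c INTO @job, @step, @cmd;
END
CLOSE c;
DEALLOCATE c;
```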

Thanks Dan

PostgreSQL 9.2: vacuum returning disk space to operating system

Posted: 19 Mar 2013 03:11 PM PDT

Vacuum usually does not return disk space to operating system, except some special cases. From the docs:

The standard form of VACUUM removes dead row versions in tables and indexes and marks the space available for future reuse. However, it will not return the space to the operating system, except in the special case where one or more pages at the end of a table become entirely free and an exclusive table lock can be easily obtained. In contrast, VACUUM FULL actively compacts tables by writing a complete new version of the table file with no dead space. This minimizes the size of the table, but can take a long time. It also requires extra disk space for the new copy of the table, until the operation completes.

The question is: how can the state where one or more pages at the end of a table become entirely free be achieved? This can be done via VACUUM FULL, but I haven't got enough disk space to run it. So are there any other possibilities?

In what data type should I store an email address in database?

Posted: 19 Mar 2013 12:20 PM PDT

I understand that a 254-character email address is valid, but implementations I have researched tend to use a varchar(60) to varchar(80) or equivalent. For example: this SQL Server recommendation uses varchar(80).

Is there a reason to not use the full 254 character maximum? Doesn't a varchar by definition use only as much storage as needed to hold the data?

Are there significant performance implications/trade-offs which cause so many implementations to use less than the full 254 possible characters?

SQL Server in a state of confusion: Does the database exist or doesn't it?

Posted: 19 Mar 2013 01:54 PM PDT

Got a really weird, annoying problem.. Somehow the instance of SQL Server 2008 R2 running on our server has gotten somewhat corrupted.

First, we noticed that the database we created yesterday was missing. So, we looked around and found that it was still there, but detached. So, we tried to attach the mdf but got a message which was something like The file is currently in use.

I thought that was odd, so restarted SQL Server. Same thing... okay, time for drastic measures... so, I stopped the service, zipped up the mdf, started the service, unzipped it and then tried to restore it. The above message was gone, but then I got:

Cannot attach a database with the same name as an existing database  

Ouch. Of course it's not showing in the database explorer, so no idea what's going on... last resort:

DROP DATABASE [DatabaseName]  

Of course that didn't work; that tells me the database does not exist. So I'm stuck... at one point SQL Server thinks the DB exists, and at another point it thinks it does not. Obviously it's in a state of confusion.

Has anyone seen this before? Got any ideas on how to fix it?

Retrieving all PK and FK with pgadmin

Posted: 19 Mar 2013 04:11 PM PDT

I have a big database, and I need to extract all primary keys and foreign keys from each table.

I have pgAdmin III.

Is there a way to do this automatically and not go over each table manually?
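This can be done in one query from pgAdmin's query tool against the system catalogs; a sketch that lists every primary-key and foreign-key constraint with its full definition:

```sql
-- 'p' = primary key, 'f' = foreign key; pg_get_constraintdef() renders
-- the complete constraint text (columns, referenced table, actions).
SELECT conrelid::regclass          AS table_name,
       conname                     AS constraint_name,
       contype                     AS constraint_type,
       pg_get_constraintdef(oid)   AS definition
FROM pg_constraint
WHERE contype IN ('p', 'f')
ORDER BY conrelid::regclass::text, contype;
```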

Overview of how MongoDB uses its various threads

Posted: 19 Mar 2013 08:07 PM PDT

On one instance I have MongoDB using ~85 threads. Lacking the time to investigate directly, I am curious:

  1. What are the threads used for? Do they fall into a few high-level usage categories?
  2. How can I control/influence the number of threads used? E.g. max connection params, etc.

What are the differences between leaf and non-leaf pages?

Posted: 19 Mar 2013 11:36 AM PDT

I've been running some index usage reports, and I'm trying to get a definition of Leaf and Non-leaf. There seem to be both Leaf and Non-leaf Inserts, Updates, Deletes, Page Merges, and Page Allocations. I really don't know what it means, or if one is better than the other.

If someone could give a simple definition of each, and also explain why Leaf or Non-leaf matters, it would be appreciated!

Ms-Access 2007-2010 SQL - how to create multiple rows from a single-row record

Posted: 19 Mar 2013 12:51 PM PDT

I need help updating the table below using SQL in a Ms-Access 2007-2010 database.

I've a table called table1 which has below entries:

table1:
----------------------------------
| col1 | col2 |      col3       |
----------------------------------
|  A1  | abcd |   123;456;789   |
|  B2  | efgh |   246;369       |
|  C3  | ijk  |                 |
----------------------------------

Using SQL, I want to change the above table (particularly splitting the multiple values in column col3 into individual rows) as shown below:

------------------------------
| col1 | col2 | col3 |
------------------------------
|  A1  | abcd | 123  |
|  A1  | abcd | 456  |
|  A1  | abcd | 789  |
|  B2  | efgh | 246  |
|  B2  | efgh | 369  |
|  C3  | ijk  |      |
------------------------------

How do I do it? I tried various forums but could not come up with an exact solution so far.

Appreciate your help.

Thanks, Jak.

Stress test to saturate memory for MySQL innodb

Posted: 19 Mar 2013 01:30 PM PDT

I'm investigating Windows 32-bit memory problems (the 2 GB limit) and am after a test which will max out MySQL's memory: the innodb_buffer_pool, per-connection memory, and/or any other uses of memory.

Perhaps a query I could use for mysqlslap?

How to search whole MySQL database for a particular string

Posted: 19 Mar 2013 11:03 AM PDT

Is it possible to search all of a database's tables (every row and column) to find a particular string?

I have a database named A with about 35 tables. I need to search for the string "hello", and I don't know in which table this string is saved. Is it possible?

Using MySQL.

I am a Linux admin and I am not familiar with databases; it would be really helpful if you can explain the query as well.
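Since a Linux admin is asking, the simplest one-off approach may be to dump the database as SQL text and grep it; this sketch assumes the database is named A (as above) and that your credentials differ:

```shell
# --skip-extended-insert puts one row per INSERT line, so grep -n points
# you at individual rows; the table name appears in the INSERT statements.
mysqldump --skip-extended-insert -u root -p A | grep -n "hello"
```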

Splitting TempDB into multiples files equal to number of CPUs

Posted: 19 Mar 2013 04:45 PM PDT

The article SQL Server tempdb Best Practices Increase Performance suggests that I should split tempdb into a number of files equal to the number of cores. So for 4 cores you get 4 files.

By having the larger number of files, you can increase the number of physical I/O operations that SQL Server can push to the disk at any one time. The more I/O that SQL Server can push down to the disk level, the faster the database will run. With standard databases, SQL Server can cache a large amount of the data that it needs into memory. Because of the high-write nature of the tempdb, the data needs to be written to the disk before it can be cached back up into memory.

Though it sounds good in theory, is it really that good as a general optimisation? Is it something that may only apply for specific systems where IO is very high?
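For reference, adding the extra files is straightforward; the paths and sizes below are placeholders, and keeping all tempdb data files the same size matters because the proportional-fill algorithm only spreads I/O evenly across equally sized files:

```sql
-- One ALTER DATABASE ... ADD FILE per extra data file, all sized equally.
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2, FILENAME = 'T:\tempdb2.ndf', SIZE = 1024MB, FILEGROWTH = 256MB);
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev3, FILENAME = 'T:\tempdb3.ndf', SIZE = 1024MB, FILEGROWTH = 256MB);
```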

multivalued weak key in ER database modeling

Posted: 19 Mar 2013 11:44 AM PDT

I was wondering, since I didn't find any clarification for this. I want to store movies that exist in different formats (DVD, Blu-ray, etc.), where the price for each format differs, as does the quantity of each format, so I came up with this:

[image: first ER design]

Is this correct from a design perspective? Does this imply redundancy? I don't understand how this will be stored in a table. Would it be better to do it like this:

[image: alternative ER design]

Thanks in advance.

EDIT: I'm adding some more descriptive information about what I want to store at this point of the design. I want to store information about sales. For each movie that exists in the company, I need to store format, price and stock quantity. I will also need to store customer information with a unique id, name, surname, address, movies that he/she has already bought, and his credit card number. Finally, I will have a basket that temporarily keeps items (let's suppose that other items exist apart from movies) that the customer wants to buy.

Microsoft Office Access database engine could not find the object 'tableName'

Posted: 19 Mar 2013 02:44 PM PDT

First a little background: I am using MS Access to link to tables in an Advantage database. I created a System DSN. In the past in Access I've created a new database and, using the external data wizard, successfully linked to tables. Those databases and the linked tables are working fine.

Now I am trying to do the same thing: create a new Access DB and link to this same DSN. I get as far as seeing the tables, but after making my selection, I get the error "The Microsoft Office Access database engine could not find the object 'tableSelected'. Make sure the object exists and that you spell its name and the path name correctly."

I've tried creating another data source (System and User) with no luck. Environment is Win XP, Access 2007, Advantage DB 8.1.

[MS SQL Server] What are the chances to went a DB offline or Suspect mode?


What are the chances of a DB going offline or into Suspect mode?

Posted: 19 Mar 2013 05:17 AM PDT

Hi friends, what are the chances of a DB going offline or into Suspect mode? Give me some examples please. Thank you in advance :-)

SSIS ForEach Loop Container - loop through all DB's

Posted: 15 Feb 2013 06:27 AM PST

I am trying to get the FELC to loop through all the databases on my SQL Server. I want to set up a variable based on database_id from the sys.databases table so the FELC knows to loop through all the databases. Has anyone written this before, or do you have a different approach that I can try?

Do not log job history

Posted: 19 Mar 2013 12:03 AM PDT

Hi, my SQL Server instances have some jobs that run very frequently (some execute every 30 seconds), and these jobs are consuming my entire job history log allocation, so I am losing the history of my main jobs. I already did some research and found how to increase this log size, but the best outcome would be if I could tell the Agent not to log some jobs' output. I did some research but did not find anything about it. Does anyone know if this is possible? Thanks!

sql not finishing or stuck

Posted: 18 Mar 2013 09:40 PM PDT

I'm on SQL 2008 10.0.4371 with 16 CPUs and 24 GB of RAM. Under some condition, I start getting increasing CPU loads (up to about 90% utilization on all 16 CPUs) and a website that stops responding, and this never goes away until I restart the SQL service. During this time, the only thing that seems to jump out is 4 million reads per second. I tried running sp_whoisactive, but nothing seems to stand out, other than everything waiting on SQL to send the results.

The app servers are WebSphere, so we're using the SQL JDBC driver along with (eeek) cursors. There are a couple of queries that run slow, and some missing indexes, which I've submitted to IBM. I have four tempdb files (100 MB and three 200 MB, which I want to set all to 100), one on D: and the rest on E:. Of course, yesterday I noticed the C:, D: and E: drives are all on the same LUN, so I'm thinking of going down to 1 or 2 tempdb files instead. The E: drive tempdb files have a read stall of 28 seconds while D: is around 8 or less. Also, I'm thinking of turning on ad hoc optimization, as with the cursors there are no stored procs (ugh). I did notice that SQL was only using 4 GB of RAM, and have fixed that.

Is there a switch I can turn on to end a command if it does not finish in a certain amount of time? Is that the governor?

Loading Users into instance security

Posted: 19 Mar 2013 12:57 AM PDT

We are moving from a server with SQL Server 2005 to one with SQL Server 2008. Is there any way to script a bulk load of users into the instance security? I have mounted the databases from backup, but need to get the users set up in the instance. With over 1,000 users, this would take a while going through the SSMS GUI.
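For Windows logins, a generator query against the old instance can produce the CREATE LOGIN script in one pass; SQL logins also need their password hashes and SIDs carried over, which Microsoft's sp_help_revlogin script (KB 918992, "How to transfer logins and passwords between instances of SQL Server") handles for both kinds. A sketch for the Windows-login half:

```sql
-- Run on the SQL 2005 instance; paste the output into a query window on
-- the 2008 instance. Filters out service accounts beginning 'NT '.
SELECT 'CREATE LOGIN ' + QUOTENAME(name) + ' FROM WINDOWS;'
FROM sys.server_principals
WHERE type IN ('U', 'G')      -- Windows users and groups
  AND name NOT LIKE 'NT %';
```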

Delayed Execute and/or Commit

Posted: 19 Mar 2013 02:30 AM PDT

I have a scheduled job (.NET, Windows Task Scheduler) which completes shortly after it's called, but the SQL work it should accomplish in real time does not happen until several hours later. The .NET scheduled job calls a stored procedure which updates 2 "top level" tables and outputs 3 data readers (that's what they are to the .NET program). Both updated tables have 2 update triggers each (one trigger inserts to a log table and one manages some computed fields in "quick search" tables). One of the fields updated in each table is a datetime field, set to getdate() within the sproc. Temp tables are used to identify the rows to be updated. Quick pseudo-code (showing just one update and select statement):

declare @mydate datetime
set @mydate = getdate()

declare @UpdateIDs table (MyID int)
insert @UpdateIDs
select CaseID from TopLevelTable1
where ...

begin transaction
    update TopLevelTable1
    set Flag1 = [new value]
      , Flag1Date = @mydate
    where CaseID in (select MyID from @UpdateIDs)
    -- 2 triggers are fired in conjunction with this statement
commit transaction

select [several fields]
from @UpdateIDs t
inner join [several joins from temp table]

The problem: the .NET program runs at 11pm each night and completes in about 20 seconds. It sends out an email for each row in the select and one to me regarding the ending status. All those emails happen at 11pm. The job reports success. But the timestamp (the param @mydate and the field TopLevelTable1.Flag1Date) is 6am the next day. The trigger inserting to TopLevelTable1_Log has a field LogDate which is set to getdate() within the trigger, and its value is also 6am. There is no other maintenance work being done at that time. Transaction log backups (using sqllogship.exe) are being done, and a full database backup is done at 1:45am. That full backup has the unmodified rows relating to this job. What could be happening?

SQL Server background processes

Posted: 19 Mar 2013 02:25 AM PDT

I want to know: when I execute a SQL SELECT query, what processes will be running in the background?

SP3 upgrade on SQL 2008 issue

Posted: 18 Mar 2013 06:11 PM PDT

Hi, recently we tried upgrading a SQL 2008 cluster instance from SP2 to SP3. After patching was done on the passive node, we tried connecting to the SQL instance through SSMS but got the error "Login failed for user. Server is in script upgrade mode. Only administrator can connect at this time". Can someone please let me know the reason and its solution? Regards.

[Articles] Serious Storage


Serious Storage

Posted: 18 Mar 2013 11:00 PM PDT

Steve Jones talks about the serious storage that EMC is bringing to the Vatican library.

[SQL 2012] Reports Designed in SQL 2005


Reports Designed in SQL 2005

Posted: 19 Mar 2013 01:24 AM PDT

Hi. I have installed SQL 2012 into a dev environment to allow me to test ERP system reports. Using Visual Studio 2010, I have attempted to execute a standard sales report. This returns a rendering error: "Index was out of range. Must be non-negative and less than the size of the collection. Parameter name: index".

The report in question was developed for SQL 2005 and has hidden fields which were added in order to display header information on multiple pages. I understand this fudge is no longer required since SQL 2008.

So my question: where to start? I was hoping I would be able to push a magic button and update the format of the older reports so they execute. Any tips or advice welcome; I have hundreds of reports. Some execute, but I am sure many will not.

Kind regards, Phil.

Can't execute SSIS package from remote machine

Posted: 18 Mar 2013 09:26 PM PDT

OS: Windows 2012 Standard, 64-bit. SQL: 2012 + SP1 + CU2 = 11.0.3339.

We recently decided to "break apart" our BI environment. We used to have everything on one box: DB Engine, SSIS, SSAS and SSRS. Everything has been running fine, but we now have other projects using these services, so we decided to break them apart onto their own boxes. We now have the DB Engine on one server, SSIS and SSRS on another server, and SSAS on yet another server, so three boxes replaced one box. All are Windows 2012 Standard, 64-bit with SQL Server 2012 + SP1 + CU2.

Since some of our SSIS packages have to access external resources, we used a domain account for its service account. The DB Engine and SSAS boxes are using the default service accounts from install. I can execute the packages fine on the SSIS server, and I can even execute them via SQL Agent jobs on the SSIS box (we did install a default instance of SQL on the SSIS box). However, when I try to execute a package from my laptop, it fails with the ugly "Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'". I immediately double-checked my SPNs and they all looked correct for the SSIS server and the service account we are using (and we had no duplicates). I also double-checked the User Rights Assignment in the local policy editor and all the correct rights have been assigned (Log on as a service, Bypass traverse checking, Impersonate a client after authentication). I'm stumped here. Anybody have anything else I can check or that I have overlooked? Thanks, A.

Connect button Grayed Out

Posted: 15 Jun 2012 01:43 AM PDT

I'm just starting to play with the 2012 version of SSMS. I got the 2012 Express with SSMS installer and installed an instance of 2012 on my laptop. I downloaded the 2012 AdventureWorks DB from CodePlex and ran this query to install it:

CREATE DATABASE AdventureWorks2012 ON
(FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2012\MSSQL\DATA\AdventureWorks2012_Data.mdf')
FOR ATTACH_REBUILD_LOG;

This generated a level 20 error:

Msg 948, Level 20, State 1, Line 1
The database 'AdventureWorks2012' cannot be opened because it is version 706. This server supports version 661 and earlier. A downgrade path is not supported.

Uh oh, looks like I connected to my default 2008R2 instance, not my new 2012 instance. When I went to press the Connect button next to the database drop-down list, I noticed that both Connect and Disconnect were grayed out. I know that a level 20 error will kill your connection, but usually the Connect icon becomes available. I was able to get around this by just executing a "USE master" command to reconnect, then disconnecting and connecting to the proper instance, but I found this behavior odd. I couldn't find any info on this via Google or by searching the forums here. Anyone else noticed this? To verify that this is caused by an error, I ran the following and it too grayed out both Connect and Disconnect:

RAISERROR (N'This is message %s %d.', -- Message text
    20,         -- Severity
    1,          -- State
    N'Error',   -- First argument
    5)          -- Second argument
WITH LOG;

[T-SQL] Help with the query!


Help with the query!

Posted: 18 Mar 2013 10:20 PM PDT

I came across the following situation:

[code="sql"]
CREATE TABLE DAYOFWEEK
(
    WeekID INT IDENTITY,
    [DAY] NVARCHAR(50)
)

CREATE TABLE SubjectToStudy
(
    WeekID INT,
    [SUBJECT] NVARCHAR(50)
)

INSERT INTO DAYOFWEEK VALUES ('MONDAY')
INSERT INTO DAYOFWEEK VALUES ('TUESDAY')
INSERT INTO DAYOFWEEK VALUES ('WEDNESDAY')

INSERT INTO SubjectToStudy VALUES (1, 'PHYSICS')
INSERT INTO SubjectToStudy VALUES (1, 'CHEMISTRY')
INSERT INTO SubjectToStudy VALUES (1, 'MATHEMATICS')
INSERT INTO SubjectToStudy VALUES (2, 'COMPUTERSCIENCE')
INSERT INTO SubjectToStudy VALUES (2, 'BIOLOGY')
INSERT INTO SubjectToStudy VALUES (3, 'BOTANY')
INSERT INTO SubjectToStudy VALUES (3, 'PHYSICS')
[/code]

I want the output as:

MONDAY    PHYSICS CHEMISTRY MATHEMATICS
TUESDAY   COMPUTERSCIENCE BIOLOGY
WEDNESDAY BOTANY PHYSICS

I tried using PIVOT, but was not able to come up with the above output. Could you guys please help?
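One way to get each day's subjects onto a single row (a sketch, not a definitive answer) is string concatenation with FOR XML PATH rather than PIVOT, since the number of subjects per day varies:

```sql
SELECT d.[DAY],
       STUFF((SELECT ' ' + s.[SUBJECT]
              FROM SubjectToStudy AS s
              WHERE s.WeekID = d.WeekID
              FOR XML PATH('')), 1, 1, '') AS Subjects
FROM DAYOFWEEK AS d;
-- one row per day, subjects space-separated
```

STUFF strips the leading separator; add an ORDER BY inside the subquery if the subject order matters.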

The FOR XML clause is invalid in views, inline functions, derived tables, and subqueries when they contain a set operator.

Posted: 29 Apr 2011 02:15 AM PDT

I have the following XML statement, where I want the union of several tables to be presented as XML. I get this error: "The FOR XML clause is invalid in views, inline functions, derived tables, and subqueries when they contain a set operator. To work around, wrap the SELECT containing a set operator using derived table syntax and apply FOR XML on top of it."

My SQL:

SELECT ISNULL((
    SELECT TOP 10 au.sex, au.firstname, au.middlename, au.lastname, am.email,
           c.id AS objectid, c.title AS objecttitle, c.friendlyurl AS objecturl,
           3 AS objecttype, am.CreateDate
    FROM aspnet_users au
    INNER JOIN aspnet_membership am ON am.userid = au.userid
    INNER JOIN cameras c ON c.userid = au.userid
    WHERE c.indexboost = 0
    UNION ALL
    SELECT TOP 10 au.sex, au.firstname, au.middlename, au.lastname, am.email,
           c.id AS objectid, c.title AS objecttitle, c.friendlyurl AS objecturl,
           1 AS objecttype, am.CreateDate
    FROM aspnet_users au
    INNER JOIN aspnet_membership am ON am.userid = au.userid
    INNER JOIN locations c ON c.userid = au.userid
    WHERE c.indexboost = 0
      AND c.id NOT IN (SELECT objectid FROM emailssent
                       WHERE category = c.objecttype AND emailid = 2)
    ORDER BY am.CreateDate ASC
    FOR XML RAW, ELEMENTS, ROOT('user')), 0) AS records

How can I fix it to get it to work? Thanks!
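The error message itself describes the workaround: put the UNION in a derived table and apply FOR XML to the outer SELECT. A hedged sketch of the shape (column lists abbreviated; the two branches stay exactly as in the original query):

```sql
SELECT ISNULL((
    SELECT u.*
    FROM ( SELECT TOP 10 au.sex, /* ... same columns ... */ am.CreateDate
           FROM aspnet_users au /* ... joins and WHERE as before ... */
           UNION ALL
           SELECT TOP 10 au.sex, /* ... same columns ... */ am.CreateDate
           FROM aspnet_users au /* ... joins and WHERE as before ... */ ) AS u
    ORDER BY u.CreateDate ASC
    FOR XML RAW, ELEMENTS, ROOT('user')), 0) AS records;
```

The ORDER BY also moves to the outer SELECT, since ORDER BY on a UNION branch with TOP only orders that branch.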

[SQL Server 2008 issues] select first alphabet


select first alphabet

Posted: 18 Mar 2013 05:27 PM PDT

I have a column in my table which has values like 'value Based', 'value Discount', 'Percentage Discount'. I want to extract the first letter of each word, giving 'VB', 'VD', 'PD'. How can I achieve this?
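Assuming the values are always exactly two words separated by a single space (an assumption; a split function would be needed for more words), one sketch (dbo.MyTable and col are placeholder names):

```sql
SELECT UPPER(LEFT(col, 1)
       + SUBSTRING(col, CHARINDEX(' ', col) + 1, 1)) AS Abbrev
FROM dbo.MyTable;
-- 'value Based'         -> 'VB'
-- 'Percentage Discount' -> 'PD'
```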

EXEC sp_MSforeachdb @command

Posted: 18 Mar 2013 05:00 PM PDT

Hi all, apologies, I'm quite new to commands. I'm trying to loop the SQL below against all the DBs in the instance, but I'm getting this error:

Msg 102, Level 15, State 1, Line 11
Incorrect syntax near ' +ob.name +'.

Can anyone help? Thanks!

DECLARE @command varchar(1000)
SELECT @command = 'USE [?]
DECLARE @dbid int
SELECT @dbid = DB_ID()
SELECT name from sys.databases where database_id = @dbid;
SELECT db.name dbname, ps.OBJECT_ID, ob.name table_name, ps.index_id, b.name index_name,
       ps.page_count, ps.avg_fragmentation_in_percent,
       CASE when ps.avg_fragmentation_in_percent > 10
            then 'ALTER INDEX "' + b.name + '" ON ' + ob.name + ' REORGANIZE'
       END AS 'action_to_take'
FROM sys.dm_db_index_physical_stats (db_id(), NULL, NULL, NULL, NULL) AS ps,
     sys.indexes as b, sys.objects as ob, sys.databases as db
where ps.OBJECT_ID = b.OBJECT_ID
  and ps.object_id = ob.object_id
  AND ps.index_id = b.index_id
  and ps.database_id = DB_ID()
  and ps.database_id = db.database_id
  and b.name is not null
  and ps.page_count > 1000
  and ps.avg_fragmentation_in_percent > 10
ORDER BY ps.OBJECT_ID'
EXEC sp_MSforeachdb @command
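The Msg 102 error most likely comes from the single quotes inside the string literal ('ALTER INDEX "' + ...): inside a quoted @command, every embedded quote must be doubled. A minimal sketch of the corrected fragment (also note varchar(1000) may silently truncate a command this long; varchar(4000) or more is safer):

```sql
DECLARE @command varchar(4000);
SET @command = 'USE [?];
SELECT ob.name AS table_name, b.name AS index_name,
       CASE WHEN ps.avg_fragmentation_in_percent > 10
            THEN ''ALTER INDEX ['' + b.name + ''] ON ['' + ob.name + ''] REORGANIZE''
       END AS action_to_take
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, NULL) AS ps
JOIN sys.indexes AS b  ON ps.object_id = b.object_id AND ps.index_id = b.index_id
JOIN sys.objects AS ob ON ps.object_id = ob.object_id
WHERE b.name IS NOT NULL
  AND ps.page_count > 1000
  AND ps.avg_fragmentation_in_percent > 10;';
EXEC sp_MSforeachdb @command;
```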

How to purge Transaction Log

Posted: 18 Mar 2013 05:17 PM PDT

Hi, I have tried many things suggested on Google to purge the transaction log, but its size does not reduce. Please help: how do I do it?
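Whether the log file can shrink depends on the recovery model and on what is holding the log active. A hedged sketch of the usual steps (MyDB, the logical file name, and the backup path are placeholders):

```sql
-- Full recovery: back up the log so the inactive portion can be reused/shrunk
BACKUP LOG MyDB TO DISK = N'X:\Backups\MyDB_log.trn';  -- placeholder path

USE MyDB;
DBCC SHRINKFILE (MyDB_log, 512);  -- logical log file name, target size in MB

-- If it still will not shrink, check what is blocking log truncation:
SELECT name, log_reuse_wait_desc FROM sys.databases WHERE name = 'MyDB';
```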

The log for database is not available; Database was shutdown due to error 9001 in routine 'XdesRMFull::Commit'.

Posted: 04 May 2011 07:39 AM PDT

Testing and production DBs have lost connection to the translog two days in a row now. Today my prod DB was marked suspect after the issue - SCARY. The other DBs did not lose connection, possibly because there was no activity at that moment. No errors in the SQL log, only Windows. Server resources were not necessarily being hammered. I will be scouring the web, but wanted to reach out to all of you as well. See info and errors below.

Plenty of available drive space for log, DB, and tempdb partitions. 144 GB RAM. SQL Server 2008 SP1, Enterprise (64-bit). OS: Win Server 2008 R2 Enterprise.

[b]Win app logs:[/b]

error1 - LogWriter: Operating system error 21 (The device is not ready.) encountered.
error2 - The log for database (testing) is not available. Check the event log for related error messages. Resolve any errors and restart the database.
info mess3 - Database was shutdown due to error 9001 in routine 'XdesRMFull::Commit'. Restart for non-snapshot databases will be attempted after all connections to the database are aborted.

[b]2 seconds later prod db goes down:[/b]

error4 - The log for database is not available. Check the event log for related error messages. Resolve any errors and restart the database.
error5 - During undoing of a logged operation in database, an error occurred at log record ID (86400:39070:17). Typically, the specific failure is logged previously as an error in the Windows Event Log service. Restore the database or file from a backup, or repair the database.
error6 - fcb::close-flush: Operating system error (null) encountered.
error7 - An error occurred during recovery, preventing the database (PRODUCTION :w00t:) from restarting. Diagnose the recovery errors and fix them, or restore from a known good backup. If errors are not corrected or expected, contact Technical Support.
info mess8 - CHECKDB for database finished without errors on 2011-03-14 12:12:41.503 (local time). This is an informational message only; no user action is required.

Insertion in table

Posted: 17 Mar 2013 09:28 PM PDT

Hi, this is the table structure and script below:

CREATE TABLE [dbo].[GV_Booklet]
(
    [BookletId] [int] IDENTITY(1,1) NOT NULL,
    [LeafCount] [nchar](10) NULL,
    [Denomination] [int] NULL,
    [VoucherTypeId] [int] NOT NULL,
    [VoucherNo] [varchar](20) NOT NULL,
    [Quantity] [int] NULL,
    [CreateDate] [datetime] NULL,
    [ModifyDate] [datetime] NULL,
    [ExpiryDate] [datetime] NULL,
    [UserId] [int] NULL,
    [VoucherStatusId] [int] NOT NULL,
    [TransactionID] [varchar](20) NOT NULL,
    CONSTRAINT [PK_GV_Booklet] PRIMARY KEY CLUSTERED ([BookletId] ASC)
    WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
          ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

Script:

Declare @vouchertype varchar(20) = 'Percentage Discount',
        @denomination int, @userid int, @voucherstatus int,
        @transactionID varchar(20), @quantity int = 2,
        @count int = 1, @innercount int = 1, @leaf int = 10, @max int

WHILE @count <= @quantity
BEGIN
    WHILE @innercount < @leaf
    BEGIN
        SET @max = (select vouchervaluebased from gv_maxvouchervalue)
        SET @transactionID = (select vouchertransactionID from gv_maxvouchervalue)
        INSERT INTO GV_Booklet
        VALUES (@leaf, 100, 1, 'VB' + Replace(Str(@max + 1, 7), ' ', '0'),
                @quantity, GETDATE(), GETDATE(), DATEADD(MM, 6, GETDATE()),
                1, 1, 'TRN' + Replace(Str(@transactionID + 1, 7), ' ', '0'))
        SET @innercount = @innercount + 1
        UPDATE gv_maxvouchervalue SET vouchervaluebased = @max + 1
    END
    SET @innercount = 1
    SET @count = @count + 1
END

UPDATE gv_maxvouchervalue SET vouchertransactionID = @transactionID + 1

select * from GV_Booklet

Now let me explain the scenario: this query is for a voucher booklet. If the quantity is 2 and leaf (the number of pages in a booklet) is 10, then 20 unique voucher numbers should be generated with 2 unique transaction numbers, since the booklet quantity is 2. How can I achieve this? Please help.

Alter numeric(12,6) to numeric(19,4) - is it metadata only or is it recreate table?

Posted: 18 Mar 2013 12:23 AM PDT

I would be most grateful for confirmation or refutation of the theory below: changing an existing table column from NUMERIC(12,6) to NUMERIC(19,4) should be a metadata-only change (in place, with no need to dump the table to tempdb and recreate it), since the column size remains the same at 9 bytes. It will require "examination" of all rows but not a recreate. This is extremely important, as I intend to alter a table with 1 billion rows. TIA.
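One hedged way to check this empirically, on a small throwaway copy rather than the billion-row table, is to count the log records the ALTER generates: a metadata-only change produces a handful, while a size-of-data change produces roughly one per row (table name is a placeholder; fn_dblog is undocumented, so treat it as a scratch-database diagnostic only):

```sql
-- run in a scratch database against a copy of the table
CHECKPOINT;  -- in SIMPLE recovery this clears most prior log records
ALTER TABLE dbo.TestCopy ALTER COLUMN Amount numeric(19,4);
SELECT COUNT(*) AS log_records_since_checkpoint
FROM fn_dblog(NULL, NULL);  -- a small count suggests metadata-only
```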

existing column Level encryption in sql server2008

Posted: 18 Mar 2013 04:32 PM PDT

Hi, how do I set up column-level encryption for an existing column in SQL Server 2008 in a production environment without dropping the column, and what methods are available for encrypting data in SQL Server 2008 Standard Edition?
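Standard Edition does not include TDE, but cell-level encryption with keys and certificates is available; the usual pattern adds a varbinary column alongside the existing one rather than dropping it. A sketch under the assumption that a database master key already exists (table and column names are hypothetical):

```sql
CREATE CERTIFICATE MyCert WITH SUBJECT = 'Column encryption';
CREATE SYMMETRIC KEY SSN_Key WITH ALGORITHM = AES_256
    ENCRYPTION BY CERTIFICATE MyCert;

ALTER TABLE dbo.Customers ADD SSN_enc varbinary(256);

OPEN SYMMETRIC KEY SSN_Key DECRYPTION BY CERTIFICATE MyCert;
UPDATE dbo.Customers
SET SSN_enc = ENCRYPTBYKEY(KEY_GUID('SSN_Key'), SSN);
CLOSE SYMMETRIC KEY SSN_Key;

-- reading back:
-- SELECT CONVERT(varchar(20), DECRYPTBYKEY(SSN_enc)) FROM dbo.Customers;
```

Once the encrypted column is verified, the plaintext column can be dropped or overwritten in a separate, controlled step.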

2008 Central Management Server - Schedule Multi Server Query?

Posted: 08 Oct 2010 12:38 AM PDT

Hi Everyone - I have a 2008 Central Management Server that we're able to run queries from against all our SQL servers. I'm wondering if there's a way to do this via the command line or a job so the queries can be automated and rolled up into reporting services reports. Any suggestions?

ROWLOCK

Posted: 18 Mar 2013 05:03 PM PDT

CREATE TABLE [dbo].[TESTROWLOCK]
(
    [ID] [int] NULL,
    [NAME] [varchar](100) NULL,
    [SURNAME] [varchar](200) NULL
) ON [PRIMARY]
GO

[b]TAB-1[/b] I'm trying:

BEGIN TRAN
SELECT ID FROM TESTROWLOCK WITH (ROWLOCK) WHERE ID = 2

[b]TAB-2[/b]

SELECT ID FROM TESTROWLOCK WHERE ID = 2

I want to ask why the SELECT in TAB-2 returns a result even though TAB-1 is not committed. Why does the ROWLOCK hint not lock that row?
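Under the default READ COMMITTED isolation level, the shared lock taken by the SELECT in TAB-1 is released as soon as the row is read, even inside an open transaction; ROWLOCK only influences lock granularity, not how long the lock is held. To actually hold a lock for the life of the transaction, something like the following (a sketch, not the only option):

```sql
BEGIN TRAN;
SELECT ID FROM TESTROWLOCK WITH (ROWLOCK, UPDLOCK, HOLDLOCK) WHERE ID = 2;
-- an update lock on the row is now held until COMMIT/ROLLBACK;
-- another session reading the same row WITH (UPDLOCK), or updating it, will block
```

Note that a plain SELECT in the other tab still succeeds, because shared locks are compatible with update locks; use XLOCK instead of UPDLOCK to block plain readers as well.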

reorganize index

Posted: 18 Mar 2013 05:16 PM PDT

Hi guys, I'm going to reorganize indexes with > 10% fragmentation. Is there a need to update statistics after the re-org? I understand that with a rebuild, SQL Server automatically updates the stats. My question is: does a re-org need an update statistics? Are the old statistics still accurate after a re-org?

From the Microsoft note: "Operations such as rebuilding, defragmenting, or reorganizing an index do not change the distribution of data. Therefore, you do not need to update statistics after performing ALTER INDEX REBUILD, DBCC DBREINDEX, DBCC INDEXDEFRAG, or ALTER INDEX REORGANIZE operations. The query optimizer updates statistics when you rebuild an index on a table or view with ALTER INDEX REBUILD or DBCC DBREINDEX; however, this statistics update is a byproduct of re-creating the index. The query optimizer does not update statistics after DBCC INDEXDEFRAG or ALTER INDEX REORGANIZE operations."

Thanks
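Per the quoted note, a REORGANIZE does not change the data distribution, so existing statistics stay as accurate as they were before; they simply are not refreshed. If rows have also been modified since the last update, a targeted refresh after the re-org is cheap relative to a rebuild. A sketch with placeholder names:

```sql
ALTER INDEX IX_MyIndex ON dbo.MyTable REORGANIZE;

-- optional: refresh only this index's statistics, keeping the prior sampling rate
UPDATE STATISTICS dbo.MyTable IX_MyIndex WITH RESAMPLE;
```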

merge statement insert issue

Posted: 18 Mar 2013 04:56 PM PDT

I am using a MERGE statement in my proc. The target table has lineid as its primary key. When I get not-matched records from the source table, the error below is displayed:

Msg 2627, Level 14, State 1, Line 7
Violation of PRIMARY KEY constraint 'PK__t1__32489DA531F75A1E'. Cannot insert duplicate key in object 'dbo.t1'.
The statement has been terminated.

Example code below:

create table t1 (id varchar(10), ids int, sal int, updatedDate datetime, lineId int primary key)

insert into t1
select 'r1', 1, 10, getdate(), 1 union all
select 'r1', 3, 30, getdate(), 2 union all
select 'r1', 4, 40, getdate(), 3 union all
select 'r1', 5, 50, getdate(), 4 union all
select 'r2', 1, 100, getdate(), 5 union all
select 'r2', 3, 200, getdate(), 6

create table t2 (id varchar(10), ids int, sal int)

insert into t2
select 'r1', 1, 110 union all
select 'r1', 3, 30 union all
select 'r1', 40, 400 union all
select 'r1', 55, 550 union all
select 'r2', 1, 101

declare @maxs int
select @maxs = max(lineId) from t1

DECLARE @T TABLE (id varchar(10), ids int);

MERGE t1 AS T
USING t2 AS S
ON t.id = s.id and t.ids = s.ids
WHEN MATCHED AND (t.sal != s.sal) THEN -- row exists and data is different
    UPDATE SET t.sal = s.sal, t.updatedDate = getdate()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (id, ids, sal, updateddate, lineid)
    VALUES (s.id, s.ids, s.sal, getdate() - 1, @maxs + 1);

How do I insert the primary key column (inserting new records based on maximum lineid + 1)? Note: I don't want to add an identity constraint on the lineId column. Can you give me a suitable solution?
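The duplicate-key error comes from @maxs + 1 being the same constant for every inserted row. One hedged approach is to number the source rows and add that number to the captured maximum, so each inserted row gets a distinct lineId (gaps appear where source rows matched instead of inserting, which is harmless if lineId only needs to be unique):

```sql
DECLARE @maxs int;
SELECT @maxs = MAX(lineId) FROM t1;

MERGE t1 AS T
USING (SELECT id, ids, sal,
              ROW_NUMBER() OVER (ORDER BY id, ids) AS rn
       FROM t2) AS S
ON T.id = S.id AND T.ids = S.ids
WHEN MATCHED AND T.sal <> S.sal THEN
    UPDATE SET T.sal = S.sal, T.updatedDate = GETDATE()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (id, ids, sal, updatedDate, lineId)
    VALUES (S.id, S.ids, S.sal, GETDATE() - 1, @maxs + S.rn);
```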

Print error message

Posted: 18 Mar 2013 12:45 AM PDT

Hi, here is the script below. In this script I match the voucher type and insert data into the table accordingly. But when I enter a wrong voucher type, the script keeps executing. I want it to stop execution and print an error message if the voucher type is not matched.

Declare @vouchertype varchar(20) = 'Value Based',
        @denomination int, @userid int, @voucherstatus int,
        @transactionID varchar(20), @quantity int = 1,
        @count int = 1, @innercount int = 1, @leaf int = 10, @max int

WHILE @count <= @quantity
BEGIN
    WHILE @innercount <= @leaf
    BEGIN
        SET @max = (select vouchervaluebased from gv_maxvouchervalue)
        SET @transactionID = (select vouchertransactionID from gv_maxvouchervalue)
        if @vouchertype = (select VoucherType from GV_VoucherType where VoucherType = 'Value Based')
        BEGIN
            INSERT INTO GV_Booklet
            VALUES (@leaf, 500, 1, 'VB' + Replace(Str(@max + 1, 7), ' ', '0'),
                    @quantity, GETDATE(), GETDATE(), DATEADD(MM, 6, GETDATE()),
                    1, 1, 'TRN' + Replace(Str(@transactionID + 1, 7), ' ', '0'))
            SET @innercount = @innercount + 1
            UPDATE gv_maxvouchervalue SET vouchervaluebased = @max + 1
        END
    END
    SET @innercount = 1
    SET @count = @count + 1
    UPDATE gv_maxvouchervalue SET vouchertransactionID = @transactionID + 1
END

select * from GV_Booklet
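One hedged way to stop and report: validate the voucher type up front, before the loops run, and raise an error with severity 16 (inside a stored procedure RETURN exits; in an ad-hoc script you could wrap the body in TRY/CATCH instead):

```sql
IF NOT EXISTS (SELECT 1 FROM GV_VoucherType
               WHERE VoucherType = @vouchertype)
BEGIN
    RAISERROR('Voucher type ''%s'' not found; aborting.', 16, 1, @vouchertype);
    RETURN;  -- exits the procedure/batch so the loops never start
END
```

This also avoids the infinite inner loop the current script falls into when the IF never matches, since @innercount is only incremented inside the IF block.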

Few tricky Questions about SQL Server 2008 that I came across...

Posted: 10 Jan 2012 06:31 AM PST

Few questions about SQL Server 2008 that I came across.

[b]Q: Which of the following allows for the truncation of a transaction log?[/b]
A. Transaction logs are truncated when you use SQL Profiler.
B. Transaction logs are truncated when you use the following syntax: truncate table A
C. Transaction logs are truncated when you back up your database.
D. You cannot back up a transaction log.

[b]Q: Which of the following statements about unique constraints are TRUE?[/b]
A. You can have only one unique constraint applied to a table.
B. Unique constraints can be used to establish a link between two tables.
C. You can only have one column assigned to a unique constraint.
D. If a primary key was already assigned to a table, the unique constraint would identify a special column used for indexing.

[b]Q: Which of the following is an improved performance feature that uses bitmap filters for selecting minimal rows with queries?[/b]
A. Table partitions
B. Star join
C. Where clause
D. Outer join

[b]Q: Using a join in your select statement instead of a where clause is more efficient because the join pulls less data into your working memory.[/b]
A. True
B. False

[b]Q: Full text indexes can be backed up and restored with the rest of the database due to the ...[/b]
A. ...new index organization system
B. ...overall size of the index getting smaller.
C. ...index becoming a part of the transaction logs.
D. ...indexes no longer being stored outside of the database.

[b]Q: Which of the following statements, if used for creating a new login id, will result in an error?[/b]
A. CREATE LOGIN [HumanResources/JohnSmith] FROM WINDOWS
B. CREATE LOGIN AndrewWalters WITH PASSWORD '!^%&7', DEFAULT DATABASE = HR
C. CREATE LOGIN John WITH PASSWORD '*&!@*&'
D. CREATE LOGIN [HumanResources/Jim] FROM WINDOWS, DEFAULT DATABASE=CRM

[b]Q: Which of the following statements can be used to provide a sorted result set based on the CustomerName column?[/b]
A. SELECT ROW_NUMBER() AS RN, CustomerName from Customers order by CustomerName
B. SELECT ROW_NUMBER() OVER (ORDER BY CustomerName) AS RN, CustomerName FROM Customers
C. SELECT ROW_NUMBER() (ORDER BY CustomerName) AS RN, from Customers
D. SELECT ROW_NUMBER() AS RN, CustomerName from Customers

Trace - to Capture all or most queries

Posted: 18 Mar 2013 01:49 AM PDT

A. Are the black box and default traces the same?
B. Is it possible to capture all the queries in the black box/default trace, if doing so does not degrade SQL performance?

SSRS 2008R2 showing counties (not just states)

Posted: 08 Mar 2013 01:53 PM PST

I have a hopefully simple question. SSRS 2008R2 can do maps by state just fine... is there an easy way to get City>County>State rollup data somewhere or get shapefiles for Counties in the US? I'm working on a database for someone that would be infinitely more useful if I could show like a heat map for sales by county across the country... there's a really cool map [url=http://www.mssqltips.com/sqlservertip/2552/creating-an-ssrs-map-report-with-data-pinpoints/]here[/url] that shows the county lines in it... and that's the part I want - the Counties. The granularity of the data I have is not too good, so county-level or so is about right.Is there an easy way to create a map like that? (like a color-coded map from election night, but a county-by-county instead of state-by-state?) If so, how? And where would I get the shapefiles for the counties?Thanks!Pieter

entity framework database

Posted: 18 Mar 2013 10:38 AM PDT

We are going to develop a new .NET application, and we think we may use the Entity Framework model. As I understand it, it can generate code like insert and update, but does not use stored procedures. Can anyone recommend articles or tutorials on how to develop the model and the database development cycle, and on the pros and cons of using insert/update in code in this model instead of stored procedures? Thanks.

Adding a min, max and avg column

Posted: 18 Mar 2013 09:11 AM PDT

I have a table like so:

Table A
[Job Lot] [Job ID] [Child Job ID] [Time Completed] [Order of Jobs Done]
1         11       100            10               Pos 1
1         11       101            20               Pos 2
1         11       103            30               Pos 3
2         12       101            30               Pos 1
2         12       105            10               Pos 3

Each Child Job ID represents a delivery to a particular site, like New York, Washington or LA, each week. So I need to work out, say for Child Job ID 101, what the average position is. I see that in Job ID 11 it is in Pos 2 and in Job ID 12 it is in Pos 1, so the min is Pos 1, the max is Pos 2, and the average is Pos 1.5. The real question is the spread of each Child Job ID across positions: out of, say, 4 occurrences of a Child Job ID like 101, is it usually at position 1? For a given Child Job ID I want the lowest position, the highest position, and the average, not considering any NULL values and dividing only by the number of non-NULL values. So do I need to add 3 columns to store these values? How can I do this in one query? Do I need to pivot the data first?

SELECT [JOB LOT], [CHILD JOB ID],
       MIN([ORDER OF JOBS DONE]) AS [MIN],
       MAX([ORDER OF JOBS DONE]) AS [MAX]
FROM TABLE A
WHERE [ORDER OF JOBS DONE] IS NOT NULL
GROUP BY [JOB LOT], [ORDER OF JOBS DONE]
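Since [Order of Jobs Done] holds text like 'Pos 2', one hedged sketch strips the prefix to get a number, then aggregates per child job in a single query, no pivot needed (table/column names assumed from the post):

```sql
SELECT [Child Job ID],
       MIN(PosNum)                        AS MinPos,
       MAX(PosNum)                        AS MaxPos,
       AVG(CAST(PosNum AS decimal(9,2)))  AS AvgPos  -- NULL rows already filtered out
FROM (SELECT [Child Job ID],
             CAST(REPLACE([Order of Jobs Done], 'Pos ', '') AS int) AS PosNum
      FROM dbo.TableA
      WHERE [Order of Jobs Done] IS NOT NULL) AS x
GROUP BY [Child Job ID];
```

AVG also ignores NULLs on its own, so the count it divides by is always the number of non-NULL positions.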

sp_updatestats and the default sampling rate

Posted: 04 Mar 2013 08:51 AM PST

Using SQL Server 2008 R2. As my production database approaches 500 GB, and could potentially grow to 1 TB in 2 years, the issue of sp_updatestats (run after the re-indexing part of maintenance) using the default sampling rate (which can potentially skew performance) bothers me. It was discussed earlier in this thread: http://www.sqlservercentral.com/Forums/Topic1310877-146-2.aspx, but I am still confused about using sp_updatestats.

Currently I perform the following steps during weekend maintenance:

1. ALTER DATABASE MyDB SET RECOVERY SIMPLE, AUTO_CREATE_STATISTICS OFF, AUTO_UPDATE_STATISTICS OFF

2. My index maintenance routine is based on the following criteria: I reindex clustered and non-clustered indexes when avg_page_space_used_in_percent < 75 and avg_fragmentation_in_percent > 10 and page_count > 500. Of those selected, if fragmentation <= 30 I reorganize the index; if fragmentation > 30 I rebuild it. So at the end of the reindex maintenance all clustered and non-clustered indexes have been either rebuilt or reorganized.

3. Currently I run a statistics update on the whole database after the previous reindex step: [b]sp_updatestats[/b]. Since sp_updatestats updates statistics using the default sampling rate, it could possibly degrade the statistics of all my indexes right after the reindex routine.

[b]3A.[/b] So I was thinking about "... update statistics for all tables which are defragged and don't run update statistics for all tables which are REBUILD" (http://sqlserverpedia.com/wiki/Updating_Statistics). Here is my logic for performing routine 3A: select the indexes that were REORGANIZEd (no statistics update) during reindex maintenance, along with other indexes whose statistics have not been updated for, say, the last few weeks, and run UPDATE STATISTICS Table_Name (IndexName) WITH FULLSCAN, NORECOMPUTE against the indexes selected above. That way I can update statistics without running sp_updatestats.

4. ALTER DATABASE MyDB SET RECOVERY FULL, AUTO_CREATE_STATISTICS ON, AUTO_UPDATE_STATISTICS ON

Please let me know if you have any comments, suggestions, or recommendations on [b]step 3A[/b]. It was suggested earlier to run:

USE dbname;
EXEC sys.sp_MSforeachtable @command1 = N'UPDATE STATISTICS ? WITH FULLSCAN, COLUMNS;';

But unfortunately that takes way too long on my production database, given the time frame I have for weekend maintenance.

BombProof Data Import from Excel ?

Posted: 18 Mar 2013 06:22 AM PDT

I have a very large spreadsheet of 125,000 rows and 200 columns, some of which contain a lot of text. I have tried every method I can think of to import the data into SQL, such as converting to .csv, tab-delimited text, or .xls with just 60,000 rows. I keep getting hung up on data truncation or something similar, even when I tell the Import Wizard to ignore truncation and my SQL table has all varchar(max) columns. Is there some foolproof way to force the data into a SQL table, either predefined or created on the fly, so I can worry about what the data looks like after it's in SQL? I tried BULK INSERT too and get non-useful errors and just the column headers. I installed the Office 12.0 OLE DB Provider, but get errors when trying to use it with a .xlsx file. A very frustrating 6 hours. I have a SQL 2008 (R0) server and SQL 2005 servers to work with. Suggestions?

deadlock graph no statement information

Posted: 18 Mar 2013 08:03 AM PDT

In a deadlock graph, hovering over the oval for the victim only shows "Statement:". The other oval shows "Statement:" followed by the stored procedure that was executed. Does anyone know why the victim oval isn't displaying the statement information? Microsoft SQL Server 2008 R2 SP2 Enterprise Edition (64-bit).

sql agent job output file to different server has error

Posted: 18 Mar 2013 03:47 AM PDT

I have a SQL Agent job that selects from a table, and using the advanced options I can send the results to a text file. It works fine. But I'd like to send the file to a different server and am getting an error: "Executed as user: MGH\sqlagent. Unable to open Step output file. The step succeeded." I added permission for the SQL Agent service account on the other server. I'm using an output path like \\servername\foldername\filename.txt. Is that syntax correct? Am I missing some permission on the other server? Thanks very much for any help.

Static port on named instance in a cluster.

Posted: 18 Mar 2013 05:38 AM PDT

I have a two-node SQL cluster (2008 R2). Each node hosts a named instance. I am having a problem upgrading my VMware vCenter to 5.1. Here is an article I found about the issue and the resolution: [url]http://kb.vmware.com/kb/2039092[/url]. My question is: can I make this change without screwing up my cluster? Will it have an ill effect during a failover event? If I make this change, what issues might I run into later, since I have switched from dynamic to static ports? Can I make this change on only one named instance and not the other?

query help

Posted: 08 Mar 2013 03:11 AM PST

Hi all, can anyone help with the query below?

declare @tbl table (TBLID int identity(1,1), ID int, patientName varchar(10), age int, city varchar(100), Mobile int)

insert @tbl (ID, patientName, age, city, Mobile)
select 1, 'Ramesh', 20, 'HYD', 12345678 union all
select 1, 'Ramesh new', 20, 'HYDERABAD', 12345678 union all
select 1, 'Ramesh new', 20, 'HYDERABAD', 87654321

select * from @tbl

TBLID ID patientName age city      Mobile
1     1  Ramesh      20  HYD       12345678
2     1  Ramesh      24  HYD       12345678
3     1  Ramesh new  20  HYDERABAD 87654321

I want the output in the format below, showing which columns' data got changed:

Columns     OLDDATA  NEWDATA
patientName Ramesh   Ramesh new
City        HYD      HYDERABAD
Mobile      12345678 87654321

Please prepare a dynamic query.

Primary key or not to primary key, that is the question

Posted: 04 Mar 2013 02:58 AM PST

Yeah, bad taste on the subject, but nonetheless you're reading my post :). I'd like to pose a scenario and see what other developers out there are doing (or would recommend at least).

We have most of our tables defined with a PRIMARY KEY constraint over a single column that is defined as UNIQUEIDENTIFIER with a default of NEWID(). This is never clustered unless the table is small. We use this in our child tables as the foreign key reference, like so (excuse the pseudo table definitions):

Table: Parent
    ID UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID() PRIMARY KEY,
    Field VARCHAR(30) NOT NULL,
    Code INT NOT NULL,
    Status VARCHAR(3) NOT NULL

Table: Child
    SurrogateID INT IDENTITY(1, 1) NOT NULL, -- has a unique clustered index defined
    ID UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID() PRIMARY KEY,
    Field1 VARCHAR(30) NOT NULL,
    ParentID UNIQUEIDENTIFIER NOT NULL (FK defined to Parent.ID)

Obviously the primary key constraint maintains an index behind the scenes. When we query between the two tables it looks something like this:

SELECT P.ID, P.Field, P.Code, P.Status, C.ID, C.Field1
FROM dbo.Parent AS P
INNER JOIN dbo.Child AS C ON P.ID = C.ParentID

Looks simple, right? This query should (and does for me) do an index seek using the primary key constraint. But it also generates an expensive key lookup (clustered) over the clustered index. The only way to resolve that is to add included columns, and yes, primary keys indeed *DO NOT* allow included columns to be defined. At this point I have one of two options:

- Drop the PRIMARY KEY and simply make a unique index with the necessary included columns
- Create a unique index over Parent.ID with the necessary included columns; however, this doubles the index storage required for the Parent.ID column.

So the question is this: do you drop the primary key and create the unique index with included columns, or duplicate the primary key and create the unique index with included columns? I'm leaning towards dropping the PK and creating the unique index myself. Thoughts?

Help with XML Explicit and nesting

Posted: 18 Mar 2013 03:30 AM PDT

Hello, I currently have a very basic SQL statement to dump my results out in XML format:

SELECT 1 AS tag,
       NULL AS parent,
       BookingID AS [Event!1!BookingID!cdata],
       WebTitle AS [Event!1!EventName!cdata],
       Room AS [Event!1!Room!Element],
       Audience AS [Event!1!Audience!cdata]
FROM MyEvents
ORDER BY TimeBookingStart
FOR XML EXPLICIT

The resulting XML looks like this:

<Event>
  <BookingID><![CDATA[14957]]></BookingID>
  <WebTitle><![CDATA[This is the title]]></WebTitle>
  <Room>Room 1255</Room>
  <Audience><![CDATA[Private]]></Audience>
</Event>

Now I would like to add a new element for EventType. There can be more than one EventType specified per <Event>. Can someone show me how to add an additional element that may include several distinct values? For example, booking ID 14957 has two records in the database, each with a different event type value. Thank you in advance.
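In FOR XML EXPLICIT mode, repeating child elements come from extra rows added via UNION ALL with Tag = 2 and Parent = 1, with the result ordered so each child row sorts under its parent. A sketch assuming a hypothetical EventTypes table keyed on BookingID (the extra Event columns are elided for brevity; carry them as NULL in the child branch):

```sql
SELECT 1 AS Tag, NULL AS Parent,
       e.BookingID AS [Event!1!BookingID!cdata],
       e.WebTitle  AS [Event!1!EventName!cdata],
       NULL        AS [EventType!2!!cdata]
FROM MyEvents AS e
UNION ALL
SELECT 2, 1,
       e.BookingID,
       NULL,
       t.EventTypeName            -- hypothetical column
FROM MyEvents AS e
JOIN EventTypes AS t ON t.BookingID = e.BookingID
ORDER BY [Event!1!BookingID!cdata], Tag
FOR XML EXPLICIT;
```

Each booking then renders as one <Event> containing one <EventType> element per matching row.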

How to Zip a folder with files using Execute process task IN ssis

Posted: 17 Mar 2013 11:07 PM PDT

Hi, I need to zip a folder with files using SSIS, so I tried the Execute Process Task. What should I put in the Executable, Arguments, and Working Directory options of the Execute Process Task? Thanks in advance :-)

Differential backups being blocked, CHECKPOINT command

Posted: 18 Mar 2013 03:04 AM PDT

All my differential backups are being blocked by a SPID whose command is CHECKPOINT. Is this something to be concerned about? If so, how would one go about resolving it? Many thanks.

Top N makes query run faster

Posted: 17 Mar 2013 11:13 PM PDT

I have a select query which brings back 35k rows; it runs in approx 5 seconds. I moved this to a new server and it runs for around an hour before I give up and kill the process. If I put "select top 35000 ..." at the start of the query, I can get it to run in just under 4 minutes. The query runs across two servers to fetch the data, using a left outer join to pull back everything from server 1 regardless of a match on server 2. Where do I start looking to find the bottleneck? I've attached the plan.

First and Last number

Posted: 17 Mar 2013 10:14 PM PDT

I have a table with two columns, VoucherNo and TransactionID, containing data like:

PD0000021  TRN0000003
PD0000022  TRN0000003
PD0000023  TRN0000003
PD0000024  TRN0000003
PD0000025  TRN0000003
PD0000026  TRN0000003
PD0000027  TRN0000003
PD0000028  TRN0000003
PD0000029  TRN0000003
PD0000030  TRN0000003
VB0000001  TRN0000001
VB0000002  TRN0000001
VB0000003  TRN0000001
VB0000004  TRN0000001
VB0000005  TRN0000001
VB0000006  TRN0000001
VB0000007  TRN0000001
VB0000008  TRN0000001
VB0000009  TRN0000001
VB0000010  TRN0000001

There are a number of voucher numbers for each TransactionID. I want the first and last voucher for every TransactionID, e.g. for TRN0000003 I want PD0000021 and PD0000030.
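Since the voucher numbers in the sample sort correctly as text within each transaction, a minimal sketch (assuming the table is named Vouchers, which the question does not state) is a MIN/MAX aggregate per transaction:

```
-- First and last voucher per TransactionID
-- (assumes VoucherNo orders correctly as a string, as in the sample data)
SELECT TransactionID,
       MIN(VoucherNo) AS FirstVoucher,
       MAX(VoucherNo) AS LastVoucher
FROM Vouchers
GROUP BY TransactionID;
```

If "first" and "last" are instead defined by insertion order rather than by the voucher number itself, an ordering column (an identity or timestamp) would be needed in place of VoucherNo inside MIN/MAX.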

SQL Server Database Error

Posted: 17 Mar 2013 10:29 PM PDT

Hi Experts,

Here is an error that has just posted in System Center Operations Manager (SCOM):

Event ID: 9511 from source Office Server Search on agent computer server_name has triggered this alert.

Description: Errors occurred during database operation; please make sure the SQL Server is running and there is sufficient disk space available on the database drives. Context: Application 'database_name', Catalog 'Portal_Content'.

Summary: In a search-enabled server farm, the index server periodically writes to or reads from the SQL Server database. This alert indicates that an unexpected SQL Server database error occurred while the Office SharePoint Server Search (OSearch) service running on the index server attempted to communicate with the database.

I will appreciate your efforts.

Regards,
shafiq

Full Text Search with multiple keywords or a text phrase

Posted: 17 Mar 2013 11:26 PM PDT

Hi All,

Recently, we had a requirement to implement keyword-based search where the user could enter only one word (a single keyword), so I used LIKE 'keyword%' and it was working fine. But now the user can enter one keyword, multiple keywords, or any text phrase, so I decided to use Full Text Search and implemented FTS. But now there is an issue: the desired results are not coming back. I am not able to use CONTAINS because it gives an error due to stopwords, so I am using FREETEXT.

Consider a very simple, common example:

1. Suppose the user enters the text "mobile" and this returns 1000 rows (this count is just an example; my query will return several thousand rows).
2. Now, if I change the text to "samsung mobile", it should return only those results which contain both the keywords "samsung" and "mobile". But FREETEXT returns all rows which contain either "samsung" or "mobile", or both.
3. I again change the search to "samsung mobile for sale in delhi". The result set should narrow further, but this is not happening. As you can see, this search text contains stopwords, so I am not able to use CONTAINS(), and used FREETEXT() instead.

I have read several articles on FTS, but I didn't find a solution to this problem. Please provide me some guidance to sort out this issue.
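One common workaround (a sketch only; the table and column names are assumed, not from the question) is to strip noise words from the user's input in application code and build a CONTAINS predicate that ANDs the remaining terms, so that "samsung mobile" requires both words:

```
-- Products.Description is an assumed full-text indexed column.
-- The search string is built in application code after removing stopwords,
-- e.g. 'samsung mobile for sale in delhi' -> '"samsung" AND "mobile" AND "delhi"'
SELECT ProductID, Description
FROM Products
WHERE CONTAINS(Description, '"samsung" AND "mobile" AND "delhi"');
```

On some SQL Server versions the stopword error can also be avoided at the server level (for example via the 'transform noise words' configuration option), so that CONTAINS no longer fails when a phrase happens to include stopwords; whether that option applies depends on the version in use.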
