Wednesday, April 3, 2013

Is SQL Server data compression categorically good for read-only databases?

Posted: 03 Apr 2013 03:10 PM PDT

Some of the literature I have read on SQL Server data compression states that the write cost increases to about four times what would normally be required. It also implies that this is the primary downside to data compression, strongly suggesting that for a read-only archive database, performance will (with few exceptions) be improved by the use of data compression on 100%-filled pages.

  1. Are the statements above true?
  2. What are the primary "variations" between compressed and uncompressed data (for reading)?

    • "CPU +x%"?
    • "IO -y%"?
    • page split occurrence?
    • tempdb usage?
    • RAM usage?
  3. And for writing?

For the purpose of this question, you can limit the context to PAGE-level compression of a big (> 1TB) database, but additional comments are always welcome.


References:

SQL Server Storage Engine Blog (The DW scenario shows compression to be very advantageous)
Data Compression: Strategy, Capacity Planning and Best Practices

A more detailed approach to deciding what to compress involves analyzing the workload characteristics for each table and index. It is based on the following two metrics:

U: The percentage of update operations on a specific table, index, or partition, relative to total operations on that object. The lower the value of U (that is, the table, index, or partition is infrequently updated), the better candidate it is for page compression.
S: The percentage of scan operations on a table, index, or partition, relative to total operations on that object. The higher the value of S (that is, the table, index, or partition is mostly scanned), the better candidate it is for page compression.

Both of the above are demonstrably biased towards recommending page compression for DW-style databases (read-intensive/exclusive, big-data operations).
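As a practical aside, the expected space savings for a given table can be estimated before committing to compression; a sketch, with the schema and table names as placeholders:

```sql
-- Estimate PAGE-compression savings for one table without changing it.
EXEC sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'MyBigTable',   -- placeholder table name
    @index_id         = NULL,           -- all indexes
    @partition_number = NULL,           -- all partitions
    @data_compression = 'PAGE';
```

The result set reports current versus estimated compressed sizes per index, which helps quantify the read-side benefit before paying the rebuild cost.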

What are the different ways to keep track of active and archived data?

Posted: 03 Apr 2013 06:47 PM PDT

I'm looking for different ways to keep track of both active and archived data so I can weigh their pros and cons.

The system: I have a computer with a database on it. The database has several tables; one contains a list of users that can use the computer, and several are for auditing (user 1 did this, user 2 did that, etc.). This database is a slave of a master database, in which a Content Management System is used to, say, add a new user and see reports on which user did what.

Example: As stated above, I have a table (let's call it users) that keeps track of all the users that are allowed to use the computer. As time goes by, users will be added and removed. The problem is that the audit tables keep track of a user ID, so if the user is removed I lose the user information because the rows can't be joined. One idea I had was to use MySQL triggers: if a user is added, an insert trigger fires and copies the data to an 'archived' user table (let's call it users_archive). That way the computer can use users to determine whether the user has permission to use it, and reports can use users_archive.

This seems like the simplest way to do it, but a Google search hasn't turned up any other approaches to compare it against.
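The trigger idea described above could be sketched like this (a minimal sketch; the table and column names are assumptions):

```sql
-- Hypothetical schema: users(id, name) and users_archive(id, name).
DELIMITER $$

CREATE TRIGGER trg_users_archive
AFTER INSERT ON users
FOR EACH ROW
BEGIN
    -- Copy every new user into the archive table, so reports can
    -- still join on user ID after the user is deleted from users.
    INSERT INTO users_archive (id, name)
    VALUES (NEW.id, NEW.name);
END$$

DELIMITER ;
```

Deletes then only touch users; users_archive keeps the historical row the audit tables join against.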

Deployment Manager Vs DBA Cage Match On Pushing Releases To Prod

Posted: 03 Apr 2013 05:01 PM PDT

Our company has now hired a full time release engineer for the MS Windows side and we were thinking of a release management process for MS SQL Server.

If we have confidence in their ability to deploy SQL Server scripts, is it common to just have the release manager do it, or is it something that DBAs typically do? Our enterprise is growing fast and our DBAs are somewhat overloaded (surprise, surprise, what else is new). We can hire more, but that's another matter.

How do you guys manage this? Do you allow a release manager to have access to prod to roll out changes? Do you take away rights and turn them on when they need to release? I'm thinking I will give them access to a sproc that gives them prod access for an hour, but it logs who calls it.

Or am I totally off, and this is something a DBA should always manage?

Any thoughts would be appreciated!

Update: Also, what happens when we encounter anomalies? For example, a dev stated that 'these tables should match this other environment' (by environment I mean a customer prod environment, not qa/stage/etc.). Usually they would do a spot check. I did a checksum and noticed issues, which ended up being just whitespace differences. In a case like this, do we push it back to the release manager/QA person to fix after doing basic troubleshooting?

Another example: We have scripts from about 20 developers, and sometimes they are dependent on each other. The ordering of the scripts was wrong. I can't keep up with 20 developers' work and also manage the data, but after some troubleshooting we discovered the issue and changed the order. Is this something the DBA should typically be deeply involved in, or is it fair, after basic testing and a look over, to send it back to the release manager and devs to fix?

MySQL cloned server with better hardware has worse performance

Posted: 03 Apr 2013 02:16 PM PDT

I am intrigued. I have migrated a MySQL server to another machine (both are simple desktop computers) and the new machine has performed worse than the old one. The comparison between old and new is as follows:

  • RAM: 2 GB to 4 GB
  • Disk: 77 MB/s to 133 MB/s
  • CPU: Core2Duo E8400 (11969.53 BogoMIPS) to Phenom X2 Dual (13600.96 BogoMIPS)
  • Ubuntu: 10.10 to 12.10
  • MySQL: 5.1 to 5.5 (default distrib package)

I have installed pt-config-diff from percona toolkit and it shows:

root@ametista2-sp:~# pt-config-diff h=localhost h=192.168.11.249 --ask-pass
Enter MySQL password:
Enter MySQL password:
32 config differences
Variable                  ametista2-sp (new)        ametista-sp (old)
========================= ========================= =========================
collation_connection      utf8_unicode_ci           utf8_general_ci
collation_database        utf8_unicode_ci           utf8_general_ci
collation_server          utf8_unicode_ci           utf8_general_ci
completion_type           NO_CHAIN                  0
concurrent_insert         AUTO                      1
datadir                   /dados/mysql-databases/   /var/lib/mysql/
general_log_file          /dados/mysql-databases... /var/lib/mysql/ametist...
hostname                  ametista2-sp              ametista-sp
ignore_builtin_innodb     OFF                       ON
innodb_change_buffering   all                       inserts
innodb_file_format_check  ON                        Barracuda
innodb_version            1.1.8                     1.0.17
log_slow_queries          OFF                       ON
long_query_time           10.000000                 2.000000
max_binlog_cache_size     18446744073709547520      4294963200
max_seeks_for_key         18446744073709551615      4294967295
max_write_lock_count      18446744073709551615      4294967295
myisam_max_sort_file_size 9223372036853727232       2146435072
myisam_mmap_size          18446744073709551615      4294967295
optimizer_switch          index_merge=on,index_m... index_merge=on,index_m...
pid_file                  /var/run/mysqld/mysqld... /var/lib/mysql/ametist...
slow_query_log            OFF                       ON
slow_query_log_file       /dados/mysql-databases... /var/log/mysql/mysql-s...
socket                    /dados/mysql-databases... /var/run/mysqld/mysqld...
sort_buffer_size          2097152                   2097144
sql_auto_is_null          OFF                       ON
sql_slave_skip_counter    0
storage_engine            InnoDB                    MyISAM
table_definition_cache    400                       256
table_open_cache          400                       64
version                   5.5.29-0ubuntu0.12.04.2   5.1.61-0ubuntu0.10.10....
version_compile_machine   x86_64                    i686

For example, this query takes 20 seconds on the old machine and 30 seconds on the new one (the tables are identical, MyISAM engine):

SELECT tblEmpresasPOSITIVAS_LEFTJOINCnae.*
FROM tblEmpresasPOSITIVAS_LEFTJOINCnae
WHERE (CO_CNPJ_CEI LIKE '00000000%')
ORDER BY CO_CNPJ_CEI ASC
LIMIT 0, 30000
-- 5759 row(s) returned

Am I missing something here?

Is there a query to set a table's engine to innodb if it's not already?

Posted: 03 Apr 2013 06:52 PM PDT

I can set a (MyISAM) table's engine to InnoDB using the query:

alter table tablename engine=InnoDB  

which takes some time to complete for large tables. It appears, though, that if I run the same query again on the same table, it takes some time again (much less, but still considerable). I would expect the query to be instantaneous, since the engine is already set to InnoDB.

My questions are:

  • Is there a single query that conditionally sets the engine, for example

    alter table tablename engine=InnoDB <if not already innodb>

  • Why does the second query have such a delay? (out of curiosity mostly)
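There is no built-in conditional form of ALTER TABLE, but the engine can be checked first via information_schema; a sketch, with the database name as a placeholder:

```sql
-- Check the table's current engine before deciding to rebuild it.
SELECT ENGINE
FROM information_schema.TABLES
WHERE TABLE_SCHEMA = 'mydb'        -- placeholder database name
  AND TABLE_NAME   = 'tablename';

-- Only if the result is not 'InnoDB', run:
-- ALTER TABLE tablename ENGINE = InnoDB;
```

Since plain SQL cannot branch on the result, the check-then-alter logic would live in application code or a stored procedure.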

Ghost replication jobs in replication monitor

Posted: 03 Apr 2013 01:13 PM PDT

Has anyone seen this issue and does anyone know its cause? We have transactional replication with pull subscriptions. At 2am this morning it appeared that the job (which runs every 15 minutes) stalled. We stopped and restarted the job; transactions started flowing again and we assumed all was OK. We then noticed that the update rate was really slow, and upon investigation we found what is in this image: although the job from the pull subscription is definitely stopped, it shows as running in replication monitor. We ended up restarting the SQL Agent on the subscription server, and now the speed is back to normal, but we now show three running jobs. I'd like to understand this better and determine the cause. Restarting the agent got us running again, but there is still obviously something wrong.

Bizarre behavior

When exporting a table to Excel, decimal numbers are changed to dates [migrated]

Posted: 03 Apr 2013 12:06 PM PDT

I have a table column whose type is decimal, and it stores numbers like this:

2.3  2.0  3.3  4.2  

I want to export this table to Excel, so I export the table as a CSV file.

But when I open the CSV file in Excel, a value like 4.2 is displayed as 04.02.2013.

I need 4.2 to be displayed as 4.2 in Excel. How can I do this?
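One common workaround (a sketch, not the only option; the table and column names are assumptions) is to wrap the value in Excel's literal-text syntax when generating the CSV, so Excel does not reinterpret it as a date:

```sql
-- Export each decimal as ="4.2", which Excel treats as literal text.
SELECT CONCAT('="', my_decimal_column, '"') AS excel_safe_value
FROM my_table;
```

Alternatively, open the CSV via Excel's Text Import Wizard and explicitly mark the column as Text instead of General.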

SSIS Raw Data Import Into SQL Server 2008 R2 - Truncation and Condensed Data Type

Posted: 03 Apr 2013 11:00 AM PDT

[Intro] I have written a two-package SSIS solution to perform a scheduled hourly update of data from multiple tables in a single db (SRC) to a single table on another db (TRG).

I get the following errors and I understand them:

[ADO NET Destination [328]] Error: An exception has occurred during data insertion, the message returned from the provider is: * String or binary data would be truncated. * The statement has been terminated.

I know that the data from SRC in some columns is too large for the matching column in TRG. For instance, SRC.CompositeAddress is nvarchar(50) and TRG.Addr is char(6) (the same size as the Condensed Data Type), which does not make sense for an address.

[Question] My question revolves around not really understanding the condensed data type. Aside from data in SRC being too large for columns in TRG, I am wondering whether the condensed data type is affecting my SSIS import?

SSIS Data Flow Task Excel to SQL table NULL value will not work for small INT datatype

Posted: 03 Apr 2013 11:14 AM PDT

I have a package that imports Excel data into a SQL table. One of the columns in the table is of the "smallint" data type and it allows nulls. The Excel file column holding the data that is supposed to go into this column will always contain either a small int or, in some cases, "NULL".

I have noticed that when my source excel file has a NULL in this particular column I get the following error

There was an error with input column "Copy of Place of Service" (8582) on input "OLE DB Destination Input" (8459). The column status returned was: "The value could not be converted because of a potential loss of data.".

If my source Excel file only contains small int values, the package runs fine. I am not sure how I can fix this problem. I have tried changing data types in the physical table as well as using different data types (via the Data Conversion component in SSIS), and I still keep getting the same error message.

Does anyone have any ideas of how I can get around this issue? Even though the column in the SQL table is set to allow NULL, I suspect it is reading the NULL from the Excel source file as the text string "NULL" and not as an actual NULL. Any suggestions?
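One way to handle this (a sketch; the column name below is taken from the error message above, but verify it against your own package) is a Derived Column transformation placed before the destination that maps the literal string "NULL" to a real database NULL:

```
[Copy of Place of Service] == "NULL"
    ? NULL(DT_I2)
    : (DT_I2)[Copy of Place of Service]
```

DT_I2 is the SSIS type corresponding to smallint; the expression passes real values through a cast and replaces the "NULL" text with a typed NULL.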

Reset every Oracle sequence to the MAX value of its target table PK?

Posted: 03 Apr 2013 11:44 AM PDT

I finally got around to migrating from MySQL to Oracle and was pleasantly surprised by how good a job the SQL Developer Migration tool did. It even took my AUTO_INCREMENT fields and created sequences for my tables in their place. It also did a good job of migrating the data.

The only real loose end I see is that when it created the sequences, it did not take into account the existing data in the tables; instead, all of the sequences start with a NEXTVAL of 1.

I can imagine a simple PL/SQL script that sets the next value of a specific sequence based on the MAX value in the primary key column of a specific table, but then I would have to do this over a hundred times, and I just don't have the patience for that.

I wonder if there is some way I can write a PL/SQL script that uses metadata in the SYSTEM schema to do this dynamically for every table/sequence pair in an entire user space? Does anybody have any better ideas for how to handle this? I ran out of interns, BTW.
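A sketch of the kind of script described above, assuming each sequence is named after its table (e.g. <table>_SEQ) and each primary key column is named ID; both naming conventions are assumptions that would need adjusting to the real schema:

```sql
-- For every table owned by the current user, advance the matching
-- sequence past MAX(id).  Assumes <TABLE>_SEQ naming and an ID PK.
DECLARE
  v_max NUMBER;
  v_cur NUMBER;
BEGIN
  FOR t IN (SELECT table_name FROM user_tables) LOOP
    BEGIN
      EXECUTE IMMEDIATE
        'SELECT NVL(MAX(id), 0) FROM ' || t.table_name INTO v_max;
      -- Bump the sequence until it passes the current max key.
      -- (Burns a few values, which is harmless for surrogate keys.)
      LOOP
        EXECUTE IMMEDIATE
          'SELECT ' || t.table_name || '_SEQ.NEXTVAL FROM dual' INTO v_cur;
        EXIT WHEN v_cur >= v_max;
      END LOOP;
    EXCEPTION
      WHEN OTHERS THEN NULL;  -- skip tables without a matching sequence
    END;
  END LOOP;
END;
/
```

A faster variant alters each sequence's INCREMENT BY to the gap, takes one NEXTVAL, then resets the increment to 1, but the loop above is easier to follow.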

Determine if existing Trigger acts on field DB2

Posted: 03 Apr 2013 12:38 PM PDT

I will eventually be writing a trigger on a field named "user2". Currently most of the values in this field are blank, but there are some seemingly random rows with a value of '00' in the field.

What I was wondering is: is there any way to determine whether there is an existing trigger that is adding these values to the field? If not, what are some methods I can use to figure out how this data is getting there?

Thanks,

Andrew
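On DB2 for LUW, the existing triggers on a table can be listed from the catalog; a sketch, with the schema and table names as placeholders (catalog names differ on DB2 for i and z/OS):

```sql
-- List triggers defined on the table, with their full text, to check
-- whether any of them references the USER2 column.
SELECT trigname, trigtime, trigevent, text
FROM syscat.triggers
WHERE tabschema = 'MYSCHEMA'   -- placeholder schema
  AND tabname   = 'MYTABLE';   -- placeholder table
```

Searching the TEXT column for "user2" would show whether a trigger writes to that field.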

Research studies on electronic vs paper record keeping accuracy

Posted: 03 Apr 2013 11:17 AM PDT

Does anyone know of any research studies to test whether electronic or paper record keeping is more accurate than the other? Obviously electronic record keeping is more efficient, but efficient doesn't necessarily mean better accuracy in recording information.

I have a client that is holding on dearly to their old paper system because of perceptions about data accuracy. I'd love to be able to point to a research study that addresses which way of doing things is more accurate.

Database stuck in restoring and snapshot unavailable

Posted: 03 Apr 2013 11:00 AM PDT

I tried to restore my database from a snapshot. This usually took around a minute the last couple of times. When I did it today, it hadn't completed after around 30 minutes and the spid was in a suspended state. I stopped the query, and now my database is stuck in a restoring state and my snapshot is unavailable. Am I screwed?

USE master;
RESTORE DATABASE QA FROM
    DATABASE_SNAPSHOT = 'QA_Snap_Testing';
GO

Comparing two tables for a UUID change and fix it

Posted: 03 Apr 2013 01:26 PM PDT

I have two tables whose differences I'm trying to reconcile in PostgreSQL.

Table A is old and needs updating.

Table B is an updated, schema-identical version of Table A, whose data I have in a temporary table in Table A's database.

Unfortunately, some time after the two databases diverged, someone changed the UUIDs of records in Table B, and I need Table A to match Table B.

The schema for both tables is:

CREATE TABLE A (
    uuid VARCHAR(36) NOT NULL,
    template_folder_uuid_parent VARCHAR(36),
    heading VARCHAR(255) NOT NULL,
    image VARCHAR(100),
    downloaded BOOL NOT NULL,
    allow_in TEXT NOT NULL,
    template_group VARCHAR(255) NOT NULL,
    country_filter VARCHAR(1024) NOT NULL,
    user_id INT,
    date_created TIMESTAMP,
    date_modified TIMESTAMP,
    is_modified BOOL NOT NULL,
    image_path VARCHAR(255)
);

I need to search through Table A and Table B and match records based on template_folder_uuid_parent and heading, then set the UUID of the Table A record to the UUID from Table B. Once changed in Table A the UUID will cascade correctly.
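The matching step described above could be sketched in PostgreSQL like this (a sketch; it assumes the pair (template_folder_uuid_parent, heading) uniquely identifies a record in both tables, with "a" as the old table and "b" as the temporary copy of Table B):

```sql
-- Set A's uuid to B's uuid wherever the two records match on
-- parent folder and heading.  IS NOT DISTINCT FROM treats two
-- NULL parents (root-level folders) as equal.
UPDATE a
SET uuid = b.uuid
FROM b
WHERE a.template_folder_uuid_parent IS NOT DISTINCT FROM
      b.template_folder_uuid_parent
  AND a.heading = b.heading;
```

If the pair is not actually unique, a duplicate check on both tables first would avoid mismatched assignments.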

SQL Server 2008: How to send an email when a step in a SQL Server agent job fails, but overall job succeeds

Posted: 03 Apr 2013 02:34 PM PDT

I have a SQL Server job with 6 steps. Steps 5 and 6 must run regardless of any failures in the first four steps, so each of the first four steps is set to skip to step 5 on failure.

However, if steps 5 and 6 then succeed, the whole job is regarded as a success. I have an email notification set up for job failure, but I do not receive an email if any of the first four steps fail, because the overall job is considered a success. I would like to receive one.

It would not be ideal to split out the first four steps into a separate job, as they must be completed before steps 5 and 6 begin.

Please can anyone give me advice to solve this problem so that:

  1. Steps 5 and 6 run even when any of steps 1-4 fail.
  2. Steps 5 and 6 begin strictly not before steps 1-4 complete.
  3. When any of steps 1-4 fail, an email notification is sent indicating the step that failed.

Thanks very much in advance for your help.
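One possible approach (a sketch, not the only one) is a final job step that checks msdb job history for failed steps in today's runs and sends a notification via Database Mail; the job name, mail profile, and recipient are placeholders:

```sql
-- Final job step: email if any of steps 1-4 of this job failed today.
IF EXISTS (
    SELECT 1
    FROM msdb.dbo.sysjobhistory AS h
    JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
    WHERE j.name = 'MyJobName'          -- placeholder job name
      AND h.step_id BETWEEN 1 AND 4
      AND h.run_status = 0              -- 0 = failed
      AND h.run_date = CONVERT(INT, CONVERT(CHAR(8), GETDATE(), 112))
)
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'DBA Mail Profile',          -- placeholder
        @recipients   = 'dba@example.com',           -- placeholder
        @subject      = 'Step failure in MyJobName';
END
```

This keeps all six steps in one job (satisfying requirements 1 and 2) while still surfacing step-level failures (requirement 3).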

Why should an application not use the sa account

Posted: 03 Apr 2013 11:25 AM PDT

My first question ever, so please be gentle. I understand that the sa account enables complete control over a SQL Server instance and all its databases, users, permissions, etc.

I firmly believe that applications should not use the sa password without a well-articulated, business-focused reason. The answers to this question include much of my reasoning for an IT-focused discussion.

I am being forced into accepting a new service management system that WILL NOT work unless it uses the sa password. I never had time to work out why while setting up an evaluation, but the server team tried to install it under a fixed role I had set up incorporating dbcreator and the other permissions I thought it would require, which failed. I then let the server team install it with the sa account but run under an account in the db_owner role for its database, but that failed too. Grumpily, I tried to get it to run with an account in the sysadmin role, but even that failed, and not with error messages useful enough to let me work out what was going on in the time I had available. It will only work with the sa account, with the password stored in clear text in the config file.

When I queried this and the server team talked to the vendor, they got the worrying answer of 'What's the problem with that?' and then 'Well, we can look at scrambling the password.' Scrambling, FFS.

I know that there are ways and means to restrict access to the file, but in my opinion it is just another weakness in the security.

Anyway, my question is: could someone point me at some documentation that I can use to explain to the business why this is a bad thing and should be a big no-no? I work in a field that requires me to take security seriously. I have been struggling to make the business understand, and I may ultimately be out-ranked anyway, but I need to try.

Bandwidth comparison between log shipping and transactional replication

Posted: 03 Apr 2013 12:15 PM PDT

Which technique uses more network bandwidth:

  1. Log shipping
  2. Transactional Replication

Can someone share any benchmarks comparing the two?

What would be the memory and I/O impact on the primary server when using either technique?

Thanks,

Piyush Patel

Why upgrade SQL Server 2000?

Posted: 03 Apr 2013 02:23 PM PDT

I am responsible for a large legacy application, mostly written in classic ASP, running against SQL Server 2000. I have been considering upgrading to a newer version, but most of the information I've found online has been very general. Mostly, I've read that we'll benefit from performance gains, better security, and new features.

Can anyone lay out specific reasons why we should upgrade? I have reviewed the upgrade process and the lists of breaking changes, behavioral changes, etc., and do not anticipate any problems with upgrading other than the price. However, despite seeing many recommendations to upgrade, I've yet to see any concrete reasons other than performance (not an issue for us).

Database design - do I need another table?

Posted: 03 Apr 2013 05:42 PM PDT

I am trying to make a database that follows a form that the company uses. When a client walks in, the members of staff have to fill in a form and the data is recorded. The paper form is 10 pages long. The first time a client is seen, the entire form is filled in and the client gets a clientID.

I have split the form into sections that make sense, like accommodation and employment. I know I can link these tables together with the client's ID. That's the simple part.

Now when a client returns, the form comes out again, but this time only certain parts are filled in, whatever the client's needs are. The records for most parts don't need updating, but a new record needs inserting. What would be the best way to handle this?

So at the moment I have, for example, a table called client with an id and a name, another table called accommodation with clientid and address, and another table called employment with clientid and employer.

But how do I go about it when a client comes in to let us know he has a new employer? I can't update the current record, as that is still needed, but I would need to add a new record for the client.

Would this mean I would have to add a look-up table for all my current tables?
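One common way to model this (a sketch under assumed names, shown in MySQL syntax as an assumption) is to keep history rows in each section table with a timestamp, so nothing is ever updated in place:

```sql
-- Each visit inserts a new row; the latest row per client is current.
CREATE TABLE employment (
    employment_id INT AUTO_INCREMENT PRIMARY KEY,
    clientid      INT NOT NULL,        -- references client(id)
    employer      VARCHAR(255),
    recorded_at   TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Current employer for a given client:
SELECT employer
FROM employment
WHERE clientid = 42                    -- placeholder client id
ORDER BY recorded_at DESC
LIMIT 1;
```

With this shape, a new employer is just another INSERT, the old record stays available for history, and no separate archive or look-up table is needed.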

event scheduler not called

Posted: 03 Apr 2013 01:01 PM PDT

I had created an event scheduler event that looks like this:

mysql> show create event event1\G
*************************** 1. row ***************************
               Event: event1
            sql_mode:
           time_zone: SYSTEM
        Create Event: CREATE DEFINER=`root`@`localhost` EVENT `event1`
                      ON SCHEDULE EVERY 1 MONTH STARTS '2013-02-02 00:00:00'
                      ON COMPLETION NOT PRESERVE ENABLE
                      DO BEGIN update samp set col1 = col1 + 1; END
character_set_client: utf8
collation_connection: utf8_general_ci
  Database Collation: latin1_swedish_ci
1 row in set (0.00 sec)

This event was not called on the 1st of the month. So I ran show processlist to find out whether it is running or not; it showed the following output:

mysql> show processlist;
+-----+-----------------+-----------+------+---------+------+-----------------------------+------+
| Id  | User            | Host      | db   | Command | Time | State                       | Info |
+-----+-----------------+-----------+------+---------+------+-----------------------------+------+
| 136 | event_scheduler | localhost | NULL | Daemon  | 1855 | Waiting for next activation | NULL |
+-----+-----------------+-----------+------+---------+------+-----------------------------+------+

So does NULL in the db column mean that no DB is assigned to it?

Please help me solve this.
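For what it's worth, a NULL db for the event_scheduler daemon row is normal; the event's own status and last run can be checked from information_schema (a sketch):

```sql
-- Check whether the event is enabled and when it last ran.
SELECT event_schema, event_name, status, last_executed, starts
FROM information_schema.events
WHERE event_name = 'event1';
```

If STATUS is not ENABLED, or LAST_EXECUTED is NULL long after STARTS, the event definition or the scheduler setting is the place to look rather than the process list.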

For an InnoDB-only DB, which of these elements can be removed?

Posted: 03 Apr 2013 11:36 AM PDT

So, I'm trying to set up a Drupal 7 my.cnf file that combines best practices from various performance blogs. I'm realizing, though, that some of them are older than others, and many don't assume InnoDB. So, of this list of configs, which are irrelevant if you're building for InnoDB only?

[client]
port = 3306
socket = /var/run/mysqld/mysqld.sock

[mysqld_safe]
socket = /var/run/mysqld/mysqld.sock
nice = 0
open-files-limit = 4096

[mysqld]
port = 3306
user = mysql
default_storage_engine
default-storage-engine = InnoDB
socket = /var/run/mysqld/mysqld.sock
pid_file = /var/run/mysqld/mysqld.pid
basedir = /usr
tmpdir = /tmp
lc-messages-dir = /usr/share/mysql
local-infile = 0
automatic_sp_privileges = 0
safe-user-create = 1
secure-auth = 1
secure-file-priv = /tmp
symbolic-links = 0
key_buffer_size = 32M
myisam-recover = BACKUP,FORCE
concurrent_insert = 2
max_allowed_packet = 16M
max_connect_errors = 1000000
datadir = /var/lib/mysql
tmp_table_size = 64M
max_heap_table_size = 64M
query_cache_type = 1
query_cache_size = 0
query_cache_limit = 8M
query_cache_min_res_unit = 1K
default-storage-engine = InnoDB
thread_stack = 256K
thread_cache_size = 128
max_connections = 128
open_files_limit = 65535
skip-locking
skip-bdb
server-id = 1
log_bin = /var/log/mysql/mysql-bin.log
binlog_cache_size = 256K
sync_binlog = 256
expire_logs_days = 14
max_binlog_size = 1G
binlog_do_db = include_database_name
binlog_ignore_db = include_database_name
max_user_connections = 150
key_buffer = 16M
key_cache_block_size = 4K
bulk_insert_buffer_size = 8M
innodb_buffer_pool_size = 6G
myisam_sort_buffer_size = 64M
join_buffer_size = 8M
read_buffer_size = 2M
sort_buffer_size = 3M
read_rnd_buffer_size = 64M
table_cache = 4096
table_definition_cache = 4096
table_open_cache = 16384
optimizer_search_depth = 4
collation-server = utf8_general_ci
interactive_timeout = 400
wait_timeout = 300
connect_timeout = 10
thread_concurrency = 8
back_log = 2048
open-files = 10000
query_prealloc_size = 65536
query_alloc_block_size = 131072

Run Multiple Remote Jobs

Posted: 03 Apr 2013 04:01 PM PDT

I need to manually run a job on more than 150 SQL Server instances (SQL Server 2000, remote) from a SQL Server 2005 instance (the local server). The job is the same on all these instances. The job just calls a stored procedure without parameters, which is also the same across all the instances. These jobs are on a schedule, but now they want me to run the job manually for all instances, or for specified instances, upon request.

What is the best practice for this? I have tried OPENROWSET to call the remote stored procedure, but each run of the job takes a couple of minutes, so if I use a loop to run all these jobs, they will run one by one, which takes a long time. Ideally, it should be able to run the stored procedure on each instance without waiting for it to finish. Even better, it should be able to run the job on each instance without waiting for it to finish, so that it leaves a record in the job history on each instance.

And the stored procedure is from a third party so it can't be altered.

update:

since the 'people' want this to be initiated from an SSRS report, using SSRS to call some T-SQL/proc on the server would be most appropriate. The problem I have now is that when calling msdb.dbo.sp_start_job on remote SQL Server 2000 instances from the local server using OPENQUERY or OPENROWSET, I get the following error:

Cannot process the object "exec msdb.dbo.sp_start_job @job_name = 'xxx'". The OLE DB provider "SQLNCLI" for linked server "xxx" indicates that either the object has no columns or the current user does not have permissions on that object.

I guess this may be because sp_start_job doesn't return anything, since I can use OPENQUERY/OPENROWSET to call other remote procs without problems. Any workaround?

update:

I have found it is actually pretty simple in T-SQL:

EXEC [linkedServerName].msdb.dbo.sp_start_job @job_name = 'test2'

So I don't need to use OPENROWSET/OPENQUERY after all, since all the remote SQL Server 2000 instances are already added as linked servers.

FETCH API_CURSOR causing open transaction in tempDB

Posted: 03 Apr 2013 02:59 PM PDT

A SELECT statement run from Oracle 11gR1 against SQL Server 2005 using Gateway leaves an open transaction in tempdb. The user is a datareader in database XYZ. I am not sure why the open transaction is in tempdb when the statement run is a SELECT.

Has anyone seen this issue before?

Thanks in advance, Sekhar

Cannot find MySQL Server Config Wizard for MySQL 5.6 build

Posted: 03 Apr 2013 12:36 PM PDT

Not sure if this is the right Stack Exchange site for this but...

I am trying to reconfigure my MySQL instance but cannot find the config wizard. I looked here: http://dev.mysql.com/doc/refman/5.0/en/mysql-config-wizard-starting.html

As one user pointed out, the config .exe file is not included in versions above 5.5. Does anyone know how to reconfigure the MySQL server instance?

How to export a SQL Server 2008 diagram to PDF?

Posted: 03 Apr 2013 11:01 AM PDT

I want to export my database diagram to PDF or an image format. How can I do this?
I am working with SQL Server 2008 R2.

MySQL Benchmark on FreeBSD and Ubuntu

Posted: 03 Apr 2013 01:59 PM PDT

I'd like to benchmark two DB servers; one is running Ubuntu, the other FreeBSD. Unfortunately, we are unable to install SysBench on the FreeBSD OS. Could anyone recommend any alternatives? Or should I look into creating some intense queries based on the schema?

What permissions are needed to view dependencies?

Posted: 03 Apr 2013 11:02 AM PDT

My company is moving to a much more restrictive security model in the near future.

As part of that we are changing specific servers to the new model and testing all our existing processes. One of the processes that I use a lot utilizes the SMO framework to script out database objects in dependency order, so we can run those scripts on another server.

The dependency order piece is very important as we have a lot of nested objects (think views that reference other views).

With the new security model in place, this script stopped working correctly.

I tested in SSMS as well (SSMS 2012 against a 2008 R2 instance), and using View Dependencies on a view that references another view in the same DB doesn't show the referenced view under "Objects on which [this view] depends".

Even more troubling, if I run sp_depends on the view in question I do get an accurate list of dependencies.

I did a bit of research and couldn't find a definitive answer, so I'm hoping someone can help me out:

What specific permissions are needed for a user to accurately view dependencies in SQL Server 2008 R2 and/or SQL Server 2012 (we are upgrading soon)?
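For what it's worth, dependency queries via sys.sql_expression_dependencies generally require VIEW DEFINITION on the objects involved; a sketch of granting and checking it, with the database and principal names as placeholders:

```sql
-- Grant metadata visibility so dependency information is returned.
USE MyDatabase;                            -- placeholder database
GRANT VIEW DEFINITION TO [ScriptingUser];  -- placeholder principal

-- Then dependencies can be inspected, e.g.:
SELECT referencing_schema_name, referencing_entity_name
FROM sys.dm_sql_referencing_entities('dbo.MyView', 'OBJECT');
```

Whether this alone restores SMO's dependency ordering under the new security model would need testing, since SMO also reads other catalog views.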

Installing SQL server 2008 Express from windows application

Posted: 16 Jul 2012 09:44 PM PDT

Hi,

I am trying to install SQL Server 2008 Express from a Windows application. I am using this command:

[code="other"]
"/qs /ACTION=Install /SkipRules=VSShellInstalledRule RebootRequiredCheck /HIDECONSOLE /FEATURES=SQL /INSTANCENAME=" + _InstanceName + " /SECURITYMODE=\"SQL\" /SQLSVCACCOUNT=\"NT AUTHORITY\\SYSTEM\" /SAPWD=" + _SAPassword + " /SQLSYSADMINACCOUNTS=\"BUILTIN\\ADMINISTRATORS\" /ENABLERANU=1 /AGTSVCACCOUNT=\"NT AUTHORITY\\SYSTEM\" /TCPENABLED=1 /ERRORREPORTING=1";
[/code]

It gives this error:

[code="plain"]
Overall summary:
  Final result: SQL Server installation failed. To continue, investigate the reason for the failure, correct the problem, uninstall SQL Server, and then rerun SQL Server Setup.
  Exit code (Decimal): -2068643839
  Exit facility code: 1203
  Exit error code: 1
  Exit message: SQL Server installation failed. To continue, investigate the reason for the failure, correct the problem, uninstall SQL Server, and then rerun SQL Server Setup.
  Start time: 2012-07-17 15:24:44
  End time: 2012-07-17 15:31:36
  Requested action: Install
  Log with failure: C:\Program Files\Microsoft SQL Server\100\Setup Bootstrap\Log\20120717_152357\sql_engine_core_inst_Cpu32_1_3.log
  Exception help link: http://go.microsoft.com/fwlink?LinkId=20476&ProdName=Microsoft+SQL+Server&EvtSrc=setup.rll&EvtID=50000&ProdVer=10.0.4000.0
[/code]

This is the log:

[code="plain"]
=== Verbose logging started: 7/17/2012 15:31:31  Build type: SHIP UNICODE 5.00.7601.00  Calling process: C:\Program Files\Microsoft SQL Server\100\Setup Bootstrap\Release\x86\setup100.exe ===
MSI (c) (88:9C) [15:31:31:134]: Resetting cached policy values
MSI (c) (88:9C) [15:31:31:134]: Machine policy value 'Debug' is 0
MSI (c) (88:9C) [15:31:31:134]: ******* RunEngine:
           ******* Product: l:\ba31376b3b92fbc8f394b3\x86\setup\sql_engine_core_inst_msi\sql_engine_core_inst.msi
           ******* Action:
           ******* CommandLine: **********
MSI (c) (88:9C) [15:31:31:134]: Client-side and UI is none or basic: Running entire install on the server.
MSI (c) (88:9C) [15:31:34:134]: Failed to grab execution mutex. System error 258.
MSI (c) (88:9C) [15:31:34:135]: Cloaking enabled.
MSI (c) (88:9C) [15:31:34:135]: Attempting to enable all disabled privileges before calling Install on Server
MSI (c) (88:9C) [15:31:34:136]: Incrementing counter to disable shutdown. Counter after increment: 0
MSI (c) (88:9C) [15:31:34:137]: Decrementing counter to disable shutdown. If counter >= 0, shutdown will be denied. Counter after decrement: -1
MSI (c) (88:9C) [15:31:34:137]: MainEngineThread is returning 1618
=== Verbose logging stopped: 7/17/2012 15:31:34 ===
MSI (s) (48:80) [15:31:49:560]: User policy value 'DisableRollback' is 0
MSI (s) (48:80) [15:31:49:560]: Machine policy value 'DisableRollback' is 0
MSI (s) (48:80) [15:31:49:560]: Incrementing counter to disable shutdown. Counter after increment: 0
MSI (s) (48:80) [15:31:49:560]: Note: 1: 1402 2: HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\Installer\Rollback\Scripts 3: 2
MSI (s) (48:80) [15:31:49:564]: Note: 1: 1402 2: HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\Installer\Rollback\Scripts 3: 2
MSI (s) (48:80) [15:31:49:565]: Decrementing counter to disable shutdown. If counter >= 0, shutdown will be denied. Counter after decrement: -1
MSI (s) (48:80) [15:31:49:565]: Restoring environment variables
MSI (s) (48:80) [15:31:49:573]: Destroying RemoteAPI object.
MSI (s) (48:80) [15:31:49:573]: Custom Action Manager thread ending.
[/code]

Any info needed, let me know.

Thanks,
With regards,
DD

Logical CPU Count and Hyperthread Ratio ?

Posted: 02 Apr 2013 05:53 PM PDT

Hi, the query output is as below:

[code="sql"]SELECT cpu_count AS [Logical CPU Count], hyperthread_ratio AS [Hyperthread Ratio], cpu_count/hyperthread_ratio AS [Physical CPU Count], physical_memory_in_bytes/1048576 AS [Physical Memory (MB)], sqlserver_start_time FROM sys.dm_os_sys_info OPTION (RECOMPILE);[/code]

Logical CPU Count - 16
Hyperthread Ratio - 8
Physical CPU Count - 2
Physical Memory (MB) - 32756

Hyper-Threading presents two logical processors per physical core. So the server has 2 physical CPUs (sockets); with a hyperthread ratio of 8, 8 * 2 = 16 logical processors. I want to know: should the Logical CPU Count and the Hyperthread Ratio be the same number?

thanks
ananda

Entering Service Account Details During Install

Posted: 02 Apr 2013 09:16 AM PDT

I read an article posted by GilaMonster about remedies for some errors preventing SQL Server from starting. One section was on the service account being locked out. This reminded me of a time when the service account got locked out during a new SQL installation here: the incorrect password was entered for the respective services, and the service account got locked out during the authentication attempts (depending on the password policy, of course), so all the production SQL services now had a locked account and could not be restarted.

So when you perform new installs of SQL Server, do you enter the main production service account at that point in the installation, or do you just use the local system account to get SQL installed and then use SQL Server Configuration Manager to change the service accounts post-install? That way you can at least do one at a time, reducing the chances of a locked account!

Just curious. Thanks

Upgrade SSAS to a lower edition

Posted: 02 Apr 2013 06:26 AM PDT

I currently have SSAS 2008 (Enterprise). Is it possible to upgrade it to SSAS 2008 R2 Standard? I know from Version and Edition Upgrades (http://technet.microsoft.com/en-us/library/ms143393(v=sql.105).aspx) that the SQL Server 2008 engine's Enterprise upgrade path is only to Enterprise or Datacenter. Is this the same for Analysis Services?

[Articles] Data Darwinism


Data Darwinism

Posted: 02 Apr 2013 11:00 PM PDT

Data could be the way that more decisions are made, separating the competent from the incompetent in the future. Steve Jones isn't sure this is the best way to make decisions if we don't include a human element in the decision process.

[SQL 2012] SQL Server 2012 - SSIS issue


SQL Server 2012 - SSIS issue

Posted: 02 Apr 2013 12:02 PM PDT

Hi, we installed SQL Server 2012 SP1 on Windows 2008 R2 (VM) and have set up DNS aliases for our servers. I am an admin on the Windows server with SA access to the SQL Server.

[u][b]Current issues:[/b][/u]
I am able to connect to both the Database Engine and Integration Services using the server name. I am able to connect to the Database Engine, but not to Integration Services, using the alias. I have tried adding the user explicitly to "Microsoft SQL Server Integration Services 11.0" in DCOM Config and restarted SSIS. I also added DisableLoopbackCheck, set it to 1, and rebooted the server; still the error persists.

[u][b]Below is the error message:[/b][/u]
Cannot connect to <AliasName>.===================================Failed to retrieve data for this request. (Microsoft.SqlServer.Management.Sdk.Sfc)------------------------------For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft%20SQL%20Server&LinkId=20476------------------------------Program Location: at Microsoft.SqlServer.Management.Sdk.Sfc.Enumerator.Process(Object connectionInfo, Request request) at Microsoft.SqlServer.Management.SqlStudio.Explorer.ObjectExplorerService.ValidateConnection(UIConnectionInfo ci, IServerType server) at Microsoft.SqlServer.Management.UI.ConnectionDlg.Connector.ConnectionThreadUser()===================================Connecting to the Integration Services service on the computer "AliasName" failed with the following error: "Access is denied."By default, only administrators have access to the Integration Services service. On Windows Vista and later, the process must be running with administrative privileges in order to connect to the Integration Services service. See the help topic for information on how to configure access to the service.------------------------------For help, click: http://go.microsoft.com/fwlink/?LinkId=220763------------------------------Program Location: at Microsoft.SqlServer.Dts.Runtime.Application.GetServerInfo(String server, String& serverVersion) at Microsoft.SqlServer.Dts.SmoEnum.DTSEnum.GetData(EnumResult erParent) at Microsoft.SqlServer.Management.Sdk.Sfc.Environment.GetData() at Microsoft.SqlServer.Management.Sdk.Sfc.Environment.GetData(Request req, Object ci) at Microsoft.SqlServer.Management.Sdk.Sfc.Enumerator.GetData(Object connectionInfo, Request request) at Microsoft.SqlServer.Management.Sdk.Sfc.Enumerator.Process(Object connectionInfo, Request request)===================================Connecting to the Integration Services service on the computer "AliasName" failed with the following error: "Access is denied."By default, only administrators have access to the Integration Services service. On Windows Vista and later, the process must be running with administrative privileges in order to connect to the Integration Services service. See the help topic for information on how to configure access to the service.------------------------------Program Location: at Microsoft.SqlServer.Dts.Runtime.Wrapper.ApplicationClass.GetServerInfo(String bstrMachineName, String& serverVersion) at Microsoft.SqlServer.Dts.Runtime.Application.GetServerInfo(String server, String& serverVersion)

Any insight would be helpful.

Sql server 2012 performance benchmark

Posted: 02 Oct 2012 06:36 PM PDT

Hi all, I'm looking for information on SQL Server 2012's new performance features. I read this useful article (http://www.sqlservercentral.com/blogs/discussionofsqlserver/2012/04/19/sql-server-2012-performance-test-gap-detection/) and the What's New whitepaper, and I also found these test results: http://www.tpc.org/tpce/results/tpce_perf_results.asp. Does anyone have benchmark results comparing performance in SQL Server 2012 vs. 2008 R2?

Temp DB: SQL Server isn't seeing all data files

Posted: 02 Apr 2013 03:35 AM PDT

Our tempdb isn't seeing all of its data files: we have a total of 8, but SQL Server is only seeing 2 after a restart last night. Any help would be appreciated. Thanks
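A first diagnostic step for a question like this (a generic sketch; no names need changing since these are system views) is to compare the tempdb files registered in the server-wide catalog with the files tempdb is actually using after the restart:

```sql
-- Files registered for tempdb in the server-wide catalog
SELECT name, physical_name, type_desc, state_desc
FROM sys.master_files
WHERE database_id = DB_ID('tempdb');

-- Files tempdb is actually using right now
SELECT name, physical_name, state_desc
FROM tempdb.sys.database_files;
```

Because tempdb is rebuilt from its catalog definition at every restart, a mismatch here usually means the extra files were never added with ALTER DATABASE tempdb ADD FILE (or were added to a different database by mistake).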

abrupt SQL Server restarts

Posted: 26 Feb 2013 08:18 PM PST

I have recently installed a SQL Server 2012 instance and have deployed a few databases on it. However, the server is restarting abruptly, and nothing is clear from the SQL Server logs. I see some weird messages in the event viewer like the ones below:

Product: SQL Server 2012 Database Engine Services - Update '{402234FB3-7241-4B6D-83A4-92323BCS3DC8}' could not be installed. Error code 1642. Windows Installer can create logs to help troubleshoot issues with installing software packages. Use the following link for instructions on turning on logging support: http://go.microsoft.com/fwlink/?LinkId=23127

Windows detected your registry file is still in use by other applications or services. The file will be unloaded now. The applications or services that hold your registry file may not function properly afterwards. DETAIL - 2 user registry handles leaked from \Registry\User\S-1-5-80-1651743498233-604422593-37299121234-4324324703093-311232196992:Process 6056 (\Device\HarddiskVolume3\Windows\System32\conhost.exe) has opened key \REGISTRY\USER\S-1-5-80-16517234234233-12334442-3729906164-4251703093-3124596992\Control Panel\InternationalProcess 6044 (\Device\HarddiskVolume4\Program Files\Microsoft SQL Server\MSSQL11.DEV\MSSQL\Binn\fdhost.exe) has opened key \REGISTRY\USER\S-1-5-80-1651231233-6021312593-31232139906164-42556757573093-3124596992\Control Panel\International

What's the reason for the SQL restart? Would a Service Pack upgrade help?
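As a first diagnostic step for a restart like this (a generic sketch, not specific to this installation), the SQL Server error log can be searched for shutdown entries around the restart times, and the last start time confirmed:

```sql
-- Search the current error log (log 0; second argument 1 = SQL Server error log)
EXEC sys.sp_readerrorlog 0, 1, N'shutdown';

-- Confirm when the instance last started
SELECT sqlserver_start_time
FROM sys.dm_os_sys_info;
```

Error code 1642 is a Windows Installer error (the patch target was not found), so the event-log messages above appear to point at a failed update rather than the engine crashing on its own; checking which updates were recently applied to the box would be worthwhile alongside the error-log search.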

[T-SQL] Calculate number of days missed per term


Calculate number of days missed per term

Posted: 02 Apr 2013 04:38 AM PDT

I need to update a new column (DaysMissed), calculating the number of days missed per term (cumulative) based on the term end date. The Terms table contains the EndDate of each Term per LocID for each year. TestScores contains the test taken, the score, and the term in which it was taken. Attend contains the PeriodNBR and the date absent (DateABS). One day missed equals 8 periods, so the calculation is DaysMissed = COUNT(DateABS)/8; in other words, how many sets of 8 periods were missed by the term's EndDate. This table is empty at the beginning of a school year. Table creates, DDL, and sample data:

[code="sql"]CREATE TABLE TestScores( SchoolYR CHAR(9) not null, ID INT not null, LocID CHAR(4) not null, TestName VARCHAR(30) not null, Term TINYINT not null, DaysMissed TINYINT not null, Score decimal(6,2) not null )GOINSERT TestScores (SchoolYR, ID, LocID, TestName, Term, DaysMissed, Score) VALUES('2012-2013', 414, '355', 'CSA 1-2', 1, 0, 21.00),('2012-2013', 414, '355', 'CSA 2-2', 2, 0, 23.00),('2012-2013', 414, '355', 'CSA 3-1', 3, 0, 16.00),('2012-2013', 414, '355', 'CSA 4-1', 4, 0, 16.00),('2012-2013', 414, '355', 'CSA 4-2', 4, 0, 23.00),('2012-2013', 450, '355', 'CSA 1-1', 1, 0, 17.00),('2012-2013', 450, '355', 'CSA 2-3', 2, 0, 16.00),('2012-2013', 450, '355', 'CSA 3-2', 3, 0, 17.00),('2012-2013', 450, '355', 'CSA 3-3', 3, 0, 16.00),('2012-2013', 450, '355', 'CSA 4-1', 4, 0, 15.00),('2012-2013', 450, '355', 'CSA 4-2', 4, 0, 17.00),('2012-2013', 450, '355', 'CSA 4-3', 4, 0, 16.00),('2012-2013', 450, '355', 'CSA 4-4', 4, 0, 23.00),('2012-2013', 450, '355', 'CSA 3-1', 3, 0, 16.00),('2012-2013', 450, '355', 'CSA 1-2', 1, 0, 23.00),('2012-2013', 451, '355', 'CSA 1-1', 1, 0, 15.00),('2012-2013', 451, '355', 'CSA 2-2', 2, 0, 17.00),('2012-2013', 451, '355', 'CSA 2-3', 2, 0, 16.00),('2012-2013', 451, '355', 'CSA 2-4', 2, 0, 23.00),('2012-2013', 451, '355', 'CSA 3-1', 3, 0, 15.00),('2012-2013', 451, '355', 'CSA 3-2', 3, 0, 17.00),('2012-2013', 451, '355', 'CSA 3-3', 3, 0,
16.00),('2012-2013', 451, '355', 'CSA 3-4', 3, 0, 23.00),('2012-2013', 451, '355', 'CSA 4-1', 4, 0, 15.00),('2012-2013', 451, '355', 'CSA 4-2', 4, 0, 17.00),('2012-2013', 451, '355', 'CSA 4-3', 4, 0, 16.00),('2012-2013', 451, '355', 'CSA 4-4', 4, 0, 23.00),('2012-2013', 717, '344', 'CSA 4-1', 4, 0, 15.00),('2012-2013', 717, '344', 'CSA 4-2', 4, 0, 17.00),('2012-2013', 717, '344', 'CSA 4-3', 4, 0, 16.00),('2012-2013', 717, '344', 'CSA 4-4', 4, 0, 23.00),('2012-2013', 922, '344', 'CSA 4-1', 4, 0, 15.00),('2012-2013', 922, '344', 'CSA 4-2', 4, 0, 16.00),('2012-2013', 2735, '344', 'CSA 4-1', 4, 0, 15.00),('2012-2013', 2735, '344', 'CSA 4-2', 4, 0, 21.00),('2012-2013', 2735, '344', 'CSA 4-3', 4, 0, 15.00),('2012-2013', 4343, '355', 'CSA 4-2', 4, 0, 16.00),('2012-2013', 4343, '355', 'CSA 4-3', 4, 0, 23.00),('2012-2013', 6831, '344', 'CSA 4-3', 4, 0, 16.00),('2012-2013', 6831, '344', 'CSA 4-4', 4, 0, 23.00),('2012-2013', 8343, '355', 'CSA 4-1', 4, 0, 16.00),('2012-2013', 8343, '355', 'CSA 4-2', 4, 0, 23.00),('2012-2013', 9831, '344', 'CSA 4-2', 4, 0, 16.00),('2012-2013', 9831, '344', 'CSA 4-3', 4, 0, 23.00)GOCREATE TABLE Terms( SchoolYR CHAR(9) not null, LocID CHAR(4) not null, Term TINYINT not null, EndDate DATE not null)GOINSERT Terms (SchoolYR, Term, LocID, EndDate) VALUES('2012-2013', 1, '355', '2012-10-12'),('2012-2013', 2, '355', '2012-12-20'),('2012-2013', 3, '355', '2013-03-07'),('2012-2013', 4, '355', '2013-05-24'),('2012-2013', 1, '344', '2012-10-12'),('2012-2013', 2, '344', '2012-12-20'),('2012-2013', 3, '344', '2013-03-07'),('2012-2013', 4, '344', '2013-05-24')GOCREATE TABLE Attend( ID INT not null, PeriodNBR Tinyint not null, DateABS SMALLDATETIME not null)GOINSERT Attend(ID, PeriodNBR, DateABS)VALUES(410, 8, '2013-01-03 00:00:00'),(414, 8, '2013-01-03 00:00:00'),(414, 7, '2013-01-03 00:00:00'),(414, 6, '2013-01-03 00:00:00'),(414, 5, '2013-01-03 00:00:00'),(414, 4, '2013-01-03 00:00:00'),(414, 3, '2013-01-03 00:00:00'),(414, 2, '2013-01-03 00:00:00'),(414, 
1, '2013-01-03 00:00:00'),(414, 6, '2012-12-19 00:00:00'),(414, 5, '2012-12-19 00:00:00'),(414, 6, '2012-12-12 00:00:00'),(414, 5, '2012-12-12 00:00:00'),(414, 4, '2012-12-12 00:00:00'),(414, 3, '2012-12-12 00:00:00'),(414, 2, '2012-12-12 00:00:00'),(414, 7, '2012-11-14 00:00:00'),(414, 7, '2012-08-15 00:00:00'),(450, 8, '2013-03-21 00:00:00'),(450, 7, '2013-03-21 00:00:00'),(450, 1, '2013-03-15 00:00:00'),(450, 2, '2013-03-14 00:00:00'),(450, 1, '2013-03-14 00:00:00'),(450, 8, '2013-03-12 00:00:00'),(450, 7, '2013-03-12 00:00:00'),(450, 6, '2013-03-12 00:00:00'),(450, 5, '2013-03-12 00:00:00'),(450, 4, '2013-03-12 00:00:00'),(450, 3, '2013-03-12 00:00:00'),(450, 2, '2013-03-12 00:00:00'),(450, 1, '2013-03-12 00:00:00'),(450, 1, '2013-03-04 00:00:00'),(450, 1, '2013-02-19 00:00:00'),(450, 1, '2013-02-04 00:00:00'),(450, 1, '2013-01-31 00:00:00'),(450, 8, '2013-01-29 00:00:00'),(450, 7, '2013-01-29 00:00:00'),(450, 6, '2013-01-29 00:00:00'),(450, 5, '2013-01-29 00:00:00'),(450, 4, '2013-01-29 00:00:00'),(450, 3, '2013-01-29 00:00:00'),(450, 2, '2013-01-29 00:00:00'),(450, 1, '2013-01-29 00:00:00'),(450, 8, '2013-01-28 00:00:00'),(450, 7, '2013-01-28 00:00:00'),(450, 6, '2013-01-28 00:00:00'),(450, 5, '2013-01-28 00:00:00'),(450, 4, '2013-01-28 00:00:00'),(450, 3, '2013-01-28 00:00:00'),(450, 2, '2013-01-28 00:00:00'),(450, 1, '2013-01-28 00:00:00'),(450, 8, '2013-01-25 00:00:00'),(450, 7, '2013-01-25 00:00:00'),(450, 6, '2013-01-25 00:00:00'),(450, 5, '2013-01-25 00:00:00'),(450, 4, '2013-01-25 00:00:00'),(450, 3, '2013-01-25 00:00:00'),(450, 2, '2013-01-25 00:00:00'),(450, 1, '2013-01-25 00:00:00'),(450, 1, '2013-01-24 00:00:00'),(450, 1, '2013-01-22 00:00:00'),(450, 1, '2013-01-14 00:00:00'),(450, 1, '2012-12-18 00:00:00'),(450, 8, '2012-12-14 00:00:00'),(450, 7, '2012-12-14 00:00:00'),(450, 6, '2012-12-14 00:00:00'),(450, 5, '2012-12-14 00:00:00'),(450, 4, '2012-12-14 00:00:00'),(450, 3, '2012-12-14 00:00:00'),(450, 2, '2012-12-14 00:00:00'),(450, 1, '2012-12-14 
00:00:00'),(450, 8, '2012-11-30 00:00:00'),(450, 7, '2012-11-30 00:00:00'),(450, 6, '2012-11-30 00:00:00'),(450, 5, '2012-11-30 00:00:00'),(450, 2, '2012-11-16 00:00:00'),(450, 1, '2012-11-16 00:00:00'),(450, 8, '2012-11-09 00:00:00'),(450, 7, '2012-11-09 00:00:00'),(450, 6, '2012-11-09 00:00:00'),(450, 5, '2012-11-09 00:00:00'),(450, 4, '2012-11-09 00:00:00'),(450, 3, '2012-11-09 00:00:00'),(450, 2, '2012-11-09 00:00:00'),(450, 1, '2012-11-09 00:00:00'),(450, 1, '2012-11-01 00:00:00'),(450, 1, '2012-10-30 00:00:00'),(450, 7, '2012-10-24 00:00:00'),(450, 6, '2012-10-24 00:00:00'),(450, 5, '2012-10-24 00:00:00'),(450, 1, '2012-10-24 00:00:00'),(450, 8, '2012-10-16 00:00:00'),(450, 7, '2012-10-16 00:00:00'),(450, 6, '2012-10-16 00:00:00'),(450, 5, '2012-10-16 00:00:00'),(450, 4, '2012-10-16 00:00:00'),(450, 3, '2012-10-16 00:00:00'),(450, 2, '2012-10-16 00:00:00'),(450, 1, '2012-10-16 00:00:00'),(450, 7, '2012-10-10 00:00:00'),(450, 8, '2012-10-09 00:00:00'),(450, 7, '2012-10-09 00:00:00'),(450, 6, '2012-10-09 00:00:00'),(450, 5, '2012-10-09 00:00:00'),(450, 4, '2012-10-09 00:00:00'),(450, 3, '2012-10-09 00:00:00'),(450, 2, '2012-10-09 00:00:00'),(450, 1, '2012-10-09 00:00:00'),(450, 8, '2012-10-08 00:00:00'),(450, 7, '2012-10-08 00:00:00'),(450, 6, '2012-10-08 00:00:00'),(450, 5, '2012-10-08 00:00:00'),(450, 4, '2012-10-08 00:00:00'),(450, 3, '2012-10-08 00:00:00'),(450, 2, '2012-10-08 00:00:00'),(450, 1, '2012-10-08 00:00:00'),(450, 8, '2012-10-05 00:00:00'),(450, 7, '2012-10-05 00:00:00'),(450, 6, '2012-10-05 00:00:00'),(450, 5, '2012-10-05 00:00:00'),(450, 4, '2012-10-05 00:00:00'),(450, 3, '2012-10-05 00:00:00'),(450, 2, '2012-10-05 00:00:00'),(450, 1, '2012-10-05 00:00:00'),(450, 8, '2012-10-04 00:00:00'),(450, 7, '2012-10-04 00:00:00'),(450, 6, '2012-10-04 00:00:00'),(450, 5, '2012-10-04 00:00:00'),(450, 4, '2012-10-04 00:00:00'),(450, 3, '2012-10-04 00:00:00'),(450, 2, '2012-10-04 00:00:00'),(450, 1, '2012-10-04 00:00:00'),(450, 8, '2012-10-03 
00:00:00'),(450, 7, '2012-10-03 00:00:00'),(450, 6, '2012-10-03 00:00:00'),(450, 5, '2012-10-03 00:00:00'),(450, 4, '2012-10-03 00:00:00'),(450, 3, '2012-10-03 00:00:00'),(450, 2, '2012-10-03 00:00:00'),(450, 1, '2012-10-03 00:00:00'),(450, 8, '2012-10-02 00:00:00'),(450, 7, '2012-10-02 00:00:00'),(450, 6, '2012-10-02 00:00:00'),(450, 5, '2012-10-02 00:00:00'),(450, 4, '2012-10-02 00:00:00'),(450, 3, '2012-10-02 00:00:00'),(450, 2, '2012-10-02 00:00:00'),(450, 1, '2012-10-02 00:00:00'),(450, 8, '2012-10-01 00:00:00'),(450, 7, '2012-10-01 00:00:00'),(450, 6, '2012-10-01 00:00:00'),(450, 5, '2012-10-01 00:00:00'),(450, 4, '2012-10-01 00:00:00'),(450, 3, '2012-10-01 00:00:00'),(450, 2, '2012-10-01 00:00:00'),(450, 1, '2012-10-01 00:00:00'),(450, 1, '2012-09-18 00:00:00'),(450, 7, '2012-09-05 00:00:00'),(450, 6, '2012-09-05 00:00:00'),(450, 8, '2012-08-24 00:00:00'),(450, 7, '2012-08-24 00:00:00'),(451, 8, '2013-03-21 00:00:00'),(451, 7, '2013-03-21 00:00:00'),(451, 8, '2013-03-18 00:00:00'),(451, 7, '2013-03-18 00:00:00'),(451, 6, '2013-03-18 00:00:00'),(451, 5, '2013-03-18 00:00:00'),(451, 4, '2013-03-18 00:00:00'),(451, 3, '2013-03-18 00:00:00'),(451, 2, '2013-03-18 00:00:00'),(451, 1, '2013-03-18 00:00:00'),(451, 1, '2013-03-15 00:00:00'),(451, 2, '2013-03-14 00:00:00'),(451, 1, '2013-03-14 00:00:00'),(451, 1, '2013-02-19 00:00:00'),(451, 8, '2013-02-11 00:00:00'),(451, 7, '2013-02-11 00:00:00'),(451, 6, '2013-02-11 00:00:00'),(451, 5, '2013-02-11 00:00:00'),(451, 4, '2013-02-11 00:00:00'),(451, 3, '2013-02-11 00:00:00'),(451, 2, '2013-02-11 00:00:00'),(451, 1, '2013-02-11 00:00:00'),(451, 8, '2013-02-04 00:00:00'),(451, 7, '2013-02-04 00:00:00'),(451, 6, '2013-02-04 00:00:00'),(451, 5, '2013-02-04 00:00:00'),(451, 4, '2013-02-04 00:00:00'),(451, 3, '2013-02-04 00:00:00'),(451, 2, '2013-02-04 00:00:00'),(451, 1, '2013-02-04 00:00:00'),(451, 1, '2013-01-31 00:00:00'),(451, 2, '2013-01-28 00:00:00'),(451, 1, '2013-01-28 00:00:00'),(451, 1, '2013-01-24 
00:00:00'),(451, 1, '2013-01-22 00:00:00'),(451, 8, '2013-01-16 00:00:00'),(451, 7, '2013-01-16 00:00:00'),(451, 6, '2013-01-16 00:00:00'),(451, 5, '2013-01-16 00:00:00'),(451, 4, '2013-01-16 00:00:00'),(451, 3, '2013-01-16 00:00:00'),(451, 2, '2013-01-16 00:00:00'),(451, 1, '2013-01-16 00:00:00'),(451, 8, '2013-01-15 00:00:00'),(451, 7, '2013-01-15 00:00:00'),(451, 6, '2013-01-15 00:00:00'),(451, 5, '2013-01-15 00:00:00'),(451, 4, '2013-01-15 00:00:00'),(451, 3, '2013-01-15 00:00:00'),(451, 2, '2013-01-15 00:00:00'),(451, 1, '2013-01-15 00:00:00'),(451, 1, '2013-01-14 00:00:00'),(451, 8, '2012-12-14 00:00:00'),(451, 7, '2012-12-14 00:00:00'),(451, 6, '2012-12-14 00:00:00'),(451, 5, '2012-12-14 00:00:00'),(451, 4, '2012-12-14 00:00:00'),(451, 3, '2012-12-14 00:00:00'),(451, 2, '2012-12-14 00:00:00'),(451, 1, '2012-12-14 00:00:00'),(451, 1, '2012-12-04 00:00:00'),(451, 8, '2012-11-30 00:00:00'),(451, 7, '2012-11-30 00:00:00'),(451, 6, '2012-11-30 00:00:00'),(451, 5, '2012-11-30 00:00:00'),(451, 7, '2012-11-28 00:00:00'),(451, 6, '2012-11-28 00:00:00'),(451, 5, '2012-11-28 00:00:00'),(451, 7, '2012-11-27 00:00:00'),(451, 8, '2012-11-26 00:00:00'),(451, 7, '2012-11-26 00:00:00'),(451, 6, '2012-11-26 00:00:00'),(451, 5, '2012-11-26 00:00:00'),(451, 4, '2012-11-26 00:00:00'),(451, 3, '2012-11-26 00:00:00'),(451, 2, '2012-11-26 00:00:00'),(451, 1, '2012-11-26 00:00:00'),(451, 2, '2012-11-16 00:00:00'),(451, 1, '2012-11-16 00:00:00'),(451, 8, '2012-11-02 00:00:00'),(451, 7, '2012-11-02 00:00:00'),(451, 6, '2012-11-02 00:00:00'),(451, 5, '2012-11-02 00:00:00'),(451, 4, '2012-11-02 00:00:00'),(451, 3, '2012-11-02 00:00:00'),(451, 2, '2012-11-02 00:00:00'),(451, 1, '2012-11-02 00:00:00'),(451, 8, '2012-11-01 00:00:00'),(451, 7, '2012-11-01 00:00:00'),(451, 6, '2012-11-01 00:00:00'),(451, 1, '2012-11-01 00:00:00'),(451, 1, '2012-10-30 00:00:00'),(451, 7, '2012-10-26 00:00:00'),(451, 1, '2012-10-24 00:00:00'),(451, 7, '2012-10-10 00:00:00'),(451, 8, '2012-10-09 
00:00:00'),(451, 7, '2012-10-09 00:00:00'),(451, 6, '2012-10-09 00:00:00'),(451, 5, '2012-10-09 00:00:00'),(451, 4, '2012-10-09 00:00:00'),(451, 3, '2012-10-09 00:00:00'),(451, 2, '2012-10-09 00:00:00'),(451, 1, '2012-10-09 00:00:00'),(451, 8, '2012-10-08 00:00:00'),(451, 7, '2012-10-08 00:00:00'),(451, 6, '2012-10-08 00:00:00'),(451, 5, '2012-10-08 00:00:00'),(717, 8, '2013-01-03 00:00:00'),(717, 7, '2013-01-03 00:00:00'),(717, 6, '2013-01-03 00:00:00'),(717, 5, '2013-01-03 00:00:00'),(717, 4, '2013-01-03 00:00:00'), (717, 3, '2013-01-03 00:00:00'),(717, 2, '2013-01-03 00:00:00'),(717, 1, '2013-01-03 00:00:00'),(717, 8, '2012-12-19 00:00:00')GO[/code] Here is code I've been trying to combine without any luck:[code="sql"]select ID, DaysMissed = COUNT(DateABS)/8from Attend where DateABS <= '2013-05-24' --the Terms.EndDategroup by IDorder by ID[/code][code="sql"]select s.ID, s.TestName, s.Term, s.Daysmissed,t.EndDatefrom TestScores as sjoin Terms as ton s.SchoolYR = t.SchoolYRand s.Term = t.Termand s.LocID = t.LocIDwhere s.Term in (1, 2, 3, 4)[/code]The days missed is accumulative per the school year.The desired output should look like this:[code="plain"] ID TestName Term Daysmissed EndDate414 CSA 1-2 1 0 2012-10-12414 CSA 2-2 2 1 2012-12-20414 CSA 3-1 3 2 2013-03-07414 CSA 4-1 4 2 2013-05-24414 CSA 4-2 4 2 2013-05-24450 CSA 1-1 1 7 2012-10-12450 CSA 2-3 2 12 2012-12-20450 CSA 3-2 3 16 2013-03-07450 CSA 3-3 3 16 2013-03-07450 CSA 4-1 4 17 2013-05-24450 CSA 4-2 4 17 2013-05-24450 CSA 4-3 4 17 2013-05-24450 CSA 4-4 4 17 2013-05-24450 CSA 3-1 3 16 2013-03-07450 CSA 1-2 1 7 2012-10-12451 CSA 1-1 1 1 2012-10-12451 CSA 2-2 2 6 2012-12-20451 CSA 2-3 2 6 2012-12-20451 CSA 2-4 2 6 2012-12-20451 CSA 3-1 3 11 2013-03-07451 CSA 3-2 3 11 2013-03-07451 CSA 3-3 3 11 2013-03-07451 CSA 3-4 3 11 2013-03-07451 CSA 4-1 4 13 2013-05-24451 CSA 4-2 4 13 2013-05-24451 CSA 4-3 4 13 2013-05-24451 CSA 4-4 4 13 2013-05-24717 CSA 4-1 4 1 2013-05-24717 CSA 4-2 4 1 2013-05-24717 CSA 4-3 4 1 
2013-05-24717 CSA 4-4 4 1 2013-05-24922 CSA 4-1 4 0 2013-05-24922 CSA 4-2 4 0 2013-05-242735 CSA 4-1 4 0 2013-05-242735 CSA 4-2 4 0 2013-05-242735 CSA 4-3 4 0 2013-05-244343 CSA 4-2 4 0 2013-05-244343 CSA 4-3 4 0 2013-05-246831 CSA 4-3 4 0 2013-05-246831 CSA 4-4 4 0 2013-05-248343 CSA 4-1 4 0 2013-05-248343 CSA 4-2 4 0 2013-05-249831 CSA 4-2 4 0 2013-05-249831 CSA 4-3 4 0 2013-05-24[/code]Thanks,Sqlraider
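A sketch of one possible set-based approach (untested, using the table and column names posted above): for each test row, count the absence periods logged on or before that row's term end date, then integer-divide by 8:

```sql
SELECT s.ID,
       s.TestName,
       s.Term,
       DaysMissed = ISNULL(a.PeriodsMissed, 0) / 8,   -- 8 periods = 1 day
       t.EndDate
FROM TestScores AS s
JOIN Terms AS t
  ON t.SchoolYR = s.SchoolYR
 AND t.LocID    = s.LocID
 AND t.Term     = s.Term
OUTER APPLY
(   -- cumulative periods missed up to this term's end date
    SELECT COUNT(*) AS PeriodsMissed
    FROM Attend AS ab
    WHERE ab.ID = s.ID
      AND ab.DateABS <= t.EndDate
) AS a
ORDER BY s.ID, t.EndDate;
```

The same derived count should be able to feed an UPDATE of TestScores.DaysMissed by joining TestScores back to Terms on SchoolYR, LocID, and Term; against the sample data above, the counting rule (absences <= the term's EndDate, divided by 8) matches the desired output, but it would need verifying against real data.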

Insert into new table based on conditions

Posted: 02 Apr 2013 11:42 PM PDT

Hi all,I'm struggling with a query which runs forever and I can't figure out what the problem is.In attachment you will find a file to create and fill 3 tables to test.I will try to explain what I have to accomplish.I have to select data from 2 tables (Application_Useraccounts and Application_Useraccounts_Status) and insert it into a 3rd table (Tussen_Statussen_Adviseur). The column Useraccount_Status_Duration from the table Application_Useraccounts_Status has to be inserted into the correct column in the table Tussen_Statussen_Adviseur based on the ID in the column Useraccount_Status_ID from the table Application_Useraccounts_Status.For example:If the ID is 1, then the Useraccount_Status_Duration should go into column Duration_UserLogin.If the ID is 2, then the Useraccount_Status_Duration should go into column Duration_PauzeKort.If the ID is 3, then the Useraccount_Status_Duration should go into column Duration_PauzeLang.etc...An example as result can be found in the table Tussen_Statussen_Adviseur that will be created in the test file.At this moment I have the following query, but it runs forever and does not end.[code="sql"]--===Insert the result from the select into the new table.===INSERT INTO Tussen_Statussen_Adviseur (Useraccount_Name, Status_DateTime, Duration_UserLogin, Duration_PauzeKort, Duration_PauzeLang, Duration_Beschikbaar, Duration_Storing, Duration_Werkoverleg, Duration_NietBeschikbaar, Duration_Overdracht, Duration_Beheer) SELECT (COALESCE(dtSums.Useraccount_First_Name, '') + ' ' + COALESCE(dtSums.Useraccount_Last_Name, '')) AS Useraccount_Name, dtSums.Useraccount_Status_DateTime, dtSums.Duration_UserLogin, dtSums.Duration_PauzeKort, dtSums.Duration_PauzeLang, dtSums.Duration_Beschikbaar, dtSums.Duration_Storing, dtSums.Duration_Werkoverleg, dtSums.Duration_NietBeschikbaar, dtSums.Duration_Overdracht, dtSums.Duration_Beheer FROM ( --===Derived table finds the duration in the column Useraccount_Status_Duration and puts it in the desired column 
based on the ID in the column Useraccount_Status_ID.=== SELECT UA.Useraccount_First_Name, UA.Useraccount_Last_Name, US.Useraccount_Status_DateTime, US1.Useraccount_Status_Duration AS Duration_UserLogin, US2.Useraccount_Status_Duration AS Duration_PauzeKort, US3.Useraccount_Status_Duration AS Duration_PauzeLang, US4.Useraccount_Status_Duration AS Duration_Beschikbaar, US5.Useraccount_Status_Duration AS Duration_Storing, US6.Useraccount_Status_Duration AS Duration_Werkoverleg, US7.Useraccount_Status_Duration AS Duration_NietBeschikbaar, US8.Useraccount_Status_Duration AS Duration_Overdracht, US9.Useraccount_Status_Duration AS Duration_Beheer FROM Application_Useraccounts_Status AS US LEFT OUTER JOIN Application_Useraccounts AS UA ON US.Useraccount_ID = UA.Useraccount_ID LEFT OUTER JOIN Application_Useraccounts_Status AS US1 ON US1.Useraccount_ID = UA.Useraccount_ID AND US1.Useraccount_Status_ID = 1 LEFT OUTER JOIN Application_Useraccounts_Status AS US2 ON US1.Useraccount_ID = UA.Useraccount_ID AND US2.Useraccount_Status_ID = 2 LEFT OUTER JOIN Application_Useraccounts_Status AS US3 ON US1.Useraccount_ID = UA.Useraccount_ID AND US3.Useraccount_Status_ID = 3 LEFT OUTER JOIN Application_Useraccounts_Status AS US4 ON US1.Useraccount_ID = UA.Useraccount_ID AND US4.Useraccount_Status_ID = 4 LEFT OUTER JOIN Application_Useraccounts_Status AS US5 ON US1.Useraccount_ID = UA.Useraccount_ID AND US5.Useraccount_Status_ID = 9 LEFT OUTER JOIN Application_Useraccounts_Status AS US6 ON US1.Useraccount_ID = UA.Useraccount_ID AND US6.Useraccount_Status_ID = 10 LEFT OUTER JOIN Application_Useraccounts_Status AS US7 ON US1.Useraccount_ID = UA.Useraccount_ID AND US7.Useraccount_Status_ID = 11 LEFT OUTER JOIN Application_Useraccounts_Status AS US8 ON US1.Useraccount_ID = UA.Useraccount_ID AND US8.Useraccount_Status_ID = 12 LEFT OUTER JOIN Application_Useraccounts_Status AS US9 ON US1.Useraccount_ID = UA.Useraccount_ID AND US9.Useraccount_Status_ID = 13 ) AS dtSums[/code]Can someone tell 
me what I'm doing wrong and where I need to correct the query so that it runs properly and gives the desired result? Thank you very much for your help and assistance.

Greetz, Geert
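One thing worth noting in the posted query: the joins to US2 through US9 all repeat US1.Useraccount_ID = UA.Useraccount_ID instead of using their own aliases, which can produce an enormous cross product and would explain a query that never finishes. A simpler pattern that avoids the nine self-joins entirely is conditional aggregation. This is an untested sketch using the posted table and column names; adjust the GROUP BY to the grain you actually need:

```sql
INSERT INTO Tussen_Statussen_Adviseur
    (Useraccount_Name, Status_DateTime, Duration_UserLogin, Duration_PauzeKort,
     Duration_PauzeLang, Duration_Beschikbaar, Duration_Storing, Duration_Werkoverleg,
     Duration_NietBeschikbaar, Duration_Overdracht, Duration_Beheer)
SELECT COALESCE(UA.Useraccount_First_Name, '') + ' ' + COALESCE(UA.Useraccount_Last_Name, ''),
       US.Useraccount_Status_DateTime,
       -- each status ID lands in its own column; others contribute 0
       SUM(CASE WHEN US.Useraccount_Status_ID = 1  THEN US.Useraccount_Status_Duration ELSE 0 END),
       SUM(CASE WHEN US.Useraccount_Status_ID = 2  THEN US.Useraccount_Status_Duration ELSE 0 END),
       SUM(CASE WHEN US.Useraccount_Status_ID = 3  THEN US.Useraccount_Status_Duration ELSE 0 END),
       SUM(CASE WHEN US.Useraccount_Status_ID = 4  THEN US.Useraccount_Status_Duration ELSE 0 END),
       SUM(CASE WHEN US.Useraccount_Status_ID = 9  THEN US.Useraccount_Status_Duration ELSE 0 END),
       SUM(CASE WHEN US.Useraccount_Status_ID = 10 THEN US.Useraccount_Status_Duration ELSE 0 END),
       SUM(CASE WHEN US.Useraccount_Status_ID = 11 THEN US.Useraccount_Status_Duration ELSE 0 END),
       SUM(CASE WHEN US.Useraccount_Status_ID = 12 THEN US.Useraccount_Status_Duration ELSE 0 END),
       SUM(CASE WHEN US.Useraccount_Status_ID = 13 THEN US.Useraccount_Status_Duration ELSE 0 END)
FROM Application_Useraccounts_Status AS US
LEFT JOIN Application_Useraccounts AS UA
  ON UA.Useraccount_ID = US.Useraccount_ID
GROUP BY UA.Useraccount_First_Name, UA.Useraccount_Last_Name, US.Useraccount_Status_DateTime;
```

The single pass over Application_Useraccounts_Status replaces all nine outer joins, so the plan stays linear in the size of the status table.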

Difference between INNER JOIN and INTERSECT in SQL

Posted: 24 Mar 2013 07:56 PM PDT

I would like to know when to use INNER JOIN/OUTER JOIN versus INTERSECT/EXCEPT, as they seem to do the same job if I'm not mistaken. What is the difference when working with joins compared to using the INTERSECT/EXCEPT keywords?
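They are not quite interchangeable: a JOIN matches rows on a condition you write and can return columns from both sides, keeping duplicates, while INTERSECT/EXCEPT compare entire rows, remove duplicates, and treat NULLs as equal for the comparison. A small sketch illustrating the NULL and duplicate differences:

```sql
DECLARE @A TABLE (x INT NULL);
DECLARE @B TABLE (x INT NULL);
INSERT @A VALUES (1), (1), (NULL);
INSERT @B VALUES (1), (NULL);

-- INNER JOIN: NULL never matches NULL, and duplicates are preserved
SELECT a.x FROM @A AS a INNER JOIN @B AS b ON a.x = b.x;   -- two rows: 1, 1

-- INTERSECT: NULLs compare as equal, result is distinct rows
SELECT x FROM @A INTERSECT SELECT x FROM @B;               -- two rows: 1, NULL

-- EXCEPT: distinct rows in @A that are not in @B
SELECT x FROM @A EXCEPT SELECT x FROM @B;                  -- no rows
```

So INTERSECT/EXCEPT are handy for whole-row comparisons (e.g. diffing two tables with nullable columns), while joins are for relating rows on a key and pulling columns from both sides.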

how to rename columns that you are unpivoting on?

Posted: 02 Apr 2013 09:05 AM PDT

I've successfully unpivoted Months01 through Months02 with a query that looks like this:[code="plain"]/****** Script for SelectTopNRows command from SSMS ******/SELECT [FullAccount] ,[BusinessUnit] ,[ObjAcct] ,[Sub] ,[LT] ,[Century] ,[FY] ,Period ,sum(Qty) as ValueFROM (SELECT [FullAccount] ,[BusinessUnit] ,[ObjAcct] ,[Sub] ,[LT] ,[Century] ,[FY] ,[Month01] ,[Month02] ,[Month03] ,[Month04] ,[Month05] ,[Month06] ,[Month07] ,[Month08] ,[Month09] ,[Month10] ,[Month11] ,[Month12] ,[Month13] ,[Month14] ,[Month15] ,[Month16] ,[Month17] ,[Month18] ,[Month19] ,[Month20] ,[Month21] ,[Month22] ,[Month23] ,[Month24] ,[FYPlus2] ,[FYPlus3] ,[FYPlus4] ,[Source] ,[Memo1] ,[Memo2] ,[lastchgby] ,[lastchgdate] ,[FiscalYear] ,[ForecastType] ,[CurrentForecast] FROM [ForecastTemplate].[dbo].[Allocations]) T unpivot (Qty FOR Period IN ( [Month01] ,[Month02] ,[Month03] ,[Month04] ,[Month05] ,[Month06] ,[Month07] ,[Month08] ,[Month09] ,[Month10] ,[Month11] ,[Month12] ,[Month13] ,[Month14] ,[Month15] ,[Month16] ,[Month17] ,[Month18] ,[Month19] ,[Month20] ,[Month21] ,[Month22] ,[Month23] ,[Month24] ,[FYPlus2] ,[FYPlus3] ,[FYPlus4]) ) as u group by [FullAccount] ,[BusinessUnit] ,[ObjAcct] ,[Sub] ,[LT] ,[Century] ,[FY] ,Period order by period [/code]How can I rename the columns I am unpivoting on? I tried 'AS' statement in the unpivot portion but got an error....please see if you can offer suggestions without DDL. Otherwise, I will try to post DDL later. Thanks much
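UNPIVOT does not allow aliasing inside the IN list, so the renaming has to happen elsewhere. One alternative (an untested sketch against the posted table; only the first few month columns are shown, with the new labels being whatever you choose) is to replace UNPIVOT with CROSS APPLY (VALUES ...), which lets you emit any label you like per column:

```sql
SELECT t.[FullAccount], t.[BusinessUnit], t.[ObjAcct], t.[Sub],
       t.[LT], t.[Century], t.[FY],
       v.Period,
       SUM(v.Qty) AS Value
FROM [ForecastTemplate].[dbo].[Allocations] AS t
CROSS APPLY (VALUES ('Period 01', t.[Month01]),
                    ('Period 02', t.[Month02]),
                    ('Period 03', t.[Month03])   -- extend for the remaining month/FY columns
            ) AS v(Period, Qty)
WHERE v.Qty IS NOT NULL     -- UNPIVOT silently drops NULLs; this matches that behavior
GROUP BY t.[FullAccount], t.[BusinessUnit], t.[ObjAcct], t.[Sub],
         t.[LT], t.[Century], t.[FY], v.Period
ORDER BY v.Period;
```

The other option is to keep the UNPIVOT and wrap it in an outer SELECT with a CASE expression that maps each source column name in Period to the label you want.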

TSQL CTE Insert

Posted: 02 Apr 2013 11:35 AM PDT

Hello, I was sure I had asked this before, but I can't find that post, so my apologies for the repeat question. I need to do an insert into two tables X times, where X is determined from a control table; let's just say three times. So I need the same record inserted 3 times, except for one field that will be whatever X is. Tables A and B have a relationship. For example:

INSERT TableA (ColumnA, ColumnB, ColumnX)
VALUES (ValueA, ValueB, NULL)

INSERT TableB (ColumnA, ColumnB, ColumnC)
VALUES (SCOPE_IDENTITY, ValueB, ValueC)

Now I need to do this three times for our example, and I don't want RBAR. I believe you do this with a join or a CTE; I'm not sure what to brush up on exactly. This would get me my loop values:

DECLARE @Loop TABLE (RepeatCount INT)
INSERT @Loop SELECT NumRepeats FROM ControlTable WHERE ID = 123

So I would then use a CTE to join the inserts with @Loop, correct?

JB
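A set-based sketch along those lines (hypothetical table and column names taken from the example; TableA_ID is an assumed identity column): generate the N copies from a numbers source, capture the new identities with the OUTPUT clause, then insert the child rows in a second statement with no loop:

```sql
DECLARE @RepeatCount INT =
    (SELECT NumRepeats FROM ControlTable WHERE ID = 123);

DECLARE @NewIds TABLE (NewId INT, X INT);

-- Insert @RepeatCount copies into TableA; ColumnX carries the repeat number here
;WITH n AS
(
    SELECT TOP (@RepeatCount)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS rn
    FROM sys.all_objects        -- any sufficiently large row source works
)
INSERT TableA (ColumnA, ColumnB, ColumnX)
OUTPUT inserted.TableA_ID, inserted.ColumnX INTO @NewIds (NewId, X)
SELECT 'ValueA', 'ValueB', rn
FROM n;

-- One child row per new parent
INSERT TableB (ColumnA, ColumnB, ColumnC)
SELECT NewId, 'ValueB', 'ValueC'
FROM @NewIds;
```

The OUTPUT ... INTO clause is what replaces the per-row SCOPE_IDENTITY() calls: it captures every generated identity from the multi-row insert in one go.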

LTRIM,RTRIM

Posted: 21 Oct 2011 12:29 AM PDT

Hi, I have this code in a WHERE clause: RTRIM(LTRIM(ODER_ID)) = ' '. What does this do?
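LTRIM strips leading spaces and RTRIM strips trailing ones, so after trimming, a value made up only of spaces becomes an empty string, and the predicate keeps only rows where ODER_ID is blank. (In SQL Server, comparing to ' ' and to '' behaves the same, because trailing spaces are padded out before string comparison.) A small self-contained sketch:

```sql
DECLARE @t TABLE (ODER_ID VARCHAR(10));
INSERT @t VALUES ('A100'), ('   '), ('');

-- Returns only the blank rows (all spaces, or empty)
SELECT *
FROM @t
WHERE RTRIM(LTRIM(ODER_ID)) = ' ';
```

Here the second and third rows qualify; 'A100' does not, because trimming leaves a non-blank value.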

Stored Procedure Records

Posted: 02 Apr 2013 03:09 AM PDT

I have created a stored procedure that returns approved purchase orders when executed. However, if I have it kick off several times a day, I do not want it to return purchase order records that were already returned by previous executions. Any ideas on the best way to set this up? I thought about putting the data into a temp table to do the comparisons, or maybe having it look at timestamp info. This will actually be kicked off from a SharePoint workflow. Any help or suggestions will be greatly appreciated.
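One common pattern for "return each row exactly once" (a sketch with hypothetical table and column names, since the schema isn't posted) is to stamp rows as they are returned, in the same statement, using UPDATE ... OUTPUT. Subsequent executions then skip anything already stamped:

```sql
CREATE PROCEDURE dbo.GetNewApprovedPOs
AS
BEGIN
    SET NOCOUNT ON;

    -- Stamp and return in one atomic statement, so each approved PO
    -- is handed out exactly once even if executions overlap.
    UPDATE po
    SET    po.ReturnedOn = SYSDATETIME()
    OUTPUT inserted.PONumber, inserted.Vendor, inserted.Amount
    FROM   dbo.PurchaseOrders AS po
    WHERE  po.Status = 'Approved'
      AND  po.ReturnedOn IS NULL;
END;
```

Because the stamp and the result set come from a single UPDATE, there is no window where two overlapping executions could both return the same purchase order, which a separate SELECT-then-UPDATE approach would allow.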

bcp output of table variable

Posted: 02 Apr 2013 05:20 AM PDT

DECLARE @tempOutput TABLE
(vend_num NVARCHAR(7) NULL
,name NVARCHAR(60) NULL
,pay_type NCHAR(1) NULL
,check_num INT NULL
,date DATETIME NULL
,check_date DATETIME NULL
)

DECLARE @cmd VARCHAR(2048)
SET @cmd = 'bcp '
    + ' "SELECT * FROM @tempOutput WHERE @Severity = 0" '
    + ' queryout '
    + ' "c:\temp\query.txt" '
    + ' -T -c -t^|'

EXEC master..xp_cmdshell @cmd, NO_OUTPUT

This does not create a file. How do I get this to work?
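bcp runs as a separate process with its own connection, so it cannot see a table variable (or the @Severity variable, or a local #temp table) declared in the calling batch; the query it executes fails on that new connection, hence no file. A workaround sketch is to materialize the rows somewhere the new connection can see, for example a global temp table, applying the @Severity filter at population time:

```sql
-- Materialize into a global temp table; ## tables are visible to other
-- sessions, including the connection bcp opens.
SELECT vend_num, name, pay_type, check_num, date, check_date
INTO ##tempOutput
FROM @tempOutput;          -- or populate ##tempOutput directly instead of the variable

DECLARE @cmd VARCHAR(2048) =
    'bcp "SELECT * FROM ##tempOutput" queryout "c:\temp\query.txt" -T -c -t^|';

EXEC master..xp_cmdshell @cmd, NO_OUTPUT;

DROP TABLE ##tempOutput;
```

A permanent staging table works the same way and avoids the (small) risk of two concurrent runs colliding on the same global temp table name.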

Improve performance of query involving linked server

Posted: 02 Apr 2013 04:20 AM PDT

I am trying to make improvements to a job that does the following:

1) Load a table with about 40,000 rows (pretty straightforward, so there isn't much to change here).
2) Call a stored proc that joins this table's data with a big set of data from a linked server. This proc reads data from the above table (where one of the columns is flagged '0'), does a lot of gimmicks to process it, and sets the flag to '1'.
3) This stored proc is coded such that it takes a parameter for the number of rows to be processed. Currently the call is made with @rowsToProcess as 250. The code looks as below:

    WHILE EXISTS (SELECT 1 FROM MyTable WHERE MyFlag = 0)
    BEGIN
        EXEC dbo.MyProc
            @Server = 'MYREMOTESQLSERVER',
            @DBName = 'MYRemoteDBN',
            @rowsToProcess = 250
    END

Can anyone suggest a better way to run this more efficiently? For now, I only have room to play with how this proc is called to process all rows in MyTable. In other words, I can't influence how data is organized/arranged in the tables on the linked server.

Thanks in advance,
Rex
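One approach worth considering (a sketch; the remote table and column names are hypothetical): stage the needed remote data locally in a single pull before the batching loop, so each 250-row batch joins against a local, indexed table instead of incurring a linked-server round trip per iteration.

```sql
-- Pull the remote rows once, up front.
SELECT r.KeyCol, r.DataCol
INTO   #RemoteStage
FROM   [MYREMOTESQLSERVER].[MYRemoteDBN].dbo.RemoteTable AS r;

-- Index the staging table on the join key so each batch seeks locally.
CREATE CLUSTERED INDEX IX_RemoteStage ON #RemoteStage (KeyCol);
```

Whether this helps depends on how much of the remote table the 40,000 local rows actually touch; if it is a small, filterable slice, pushing a WHERE clause into the staging pull keeps the transfer cheap.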

retrieve records by joining history table

Posted: 02 Apr 2013 03:40 AM PDT

Hi, can you help me with this query? Thanks for any help. Below is the example scenario.

Master_table:

    pk  status_date  status       amount
    1   1/1/13       approved     100
    2   12/1/12      denied       0
    3   11/1/11      in progress  200
    4   10/1/12      approved     300

Status History table:

    pk  pk_master_table  status_date  status
    1   1                11/30/12     in progress
    3   1                12/10/2      under review
    5   1                1/1/13       approved
    2   2                10/1/12      in progress
    4   2                11/1/12      under review
    6   2                12/1/12      denied
    7   3                11/1/11      in progress
    8   4                9/1/12       in progress
    9   4                9/15/12      under review
    10  4                10/1/12      approved

The query will have an input: a status date, for example 11/15/12.

Output required: all the master_table records with their snapshot status as of the input date - that is, for each record, its most recent status with status_date <= the input date.

The below will be the output of the query:

APPROVED: 1 record - pk is 4
DENIED: 0 records
IN PROGRESS: 1 record - pk is 3 (note: pk 4 does not show up in this list, because its 'approved' date is later, even though its history has 'in progress' before the input date 11/15)
UNDER REVIEW: 1 record - pk is 2

Thanks,
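A sketch of the as-of query (column names follow the example; the history table is written here as `StatusHistory`): for each master record, take the single most recent history row on or before the input date.

```sql
DECLARE @AsOf DATE = '20121115';

WITH LatestStatus AS (
    SELECT h.pk_master_table,
           h.status,
           -- rank each record's history rows, newest first,
           -- considering only rows on or before the as-of date
           ROW_NUMBER() OVER (PARTITION BY h.pk_master_table
                              ORDER BY h.status_date DESC) AS rn
    FROM StatusHistory AS h
    WHERE h.status_date <= @AsOf
)
SELECT m.pk, ls.status
FROM Master_table AS m
JOIN LatestStatus AS ls
  ON ls.pk_master_table = m.pk
 AND ls.rn = 1;
```

Records with no history row on or before the input date (pk 1 in the example, whose earliest status is 11/30/12) drop out of the inner join, matching the expected output.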
