Monday, April 8, 2013

[SQL Server 2008 issues] SQL Server 2008R2 Memory



SQL Server 2008R2 Memory

Posted: 07 Apr 2013 07:04 PM PDT

Good day. We are running SQL Server 2008 R2 Enterprise 64-bit on a server with 200 GB of memory. At the beginning of the week SQL Server uses only 50 GB of memory; by Wednesday it is using 150 GB and stays there. No memory is ever released. Should we lock down the memory? Regards, Lian
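By design the buffer pool keeps the memory it has committed rather than releasing it, so growth to a plateau is expected; the usual way to bound it is the max server memory setting. A minimal sketch with sp_configure, where the 150000 MB cap is purely illustrative and not a figure from the thread:

-- Cap SQL Server's buffer pool so it stops growing toward the full 200 GB.
-- 150000 MB is an illustrative value; size it for your own workload and OS needs.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 150000;
RECONFIGURE;

-- Verify the setting currently in use.
SELECT name, value_in_use
FROM sys.configurations
WHERE name = 'max server memory (MB)';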

Index fragmentation

Posted: 07 Apr 2013 06:18 PM PDT

Hi all, the table is not using the specified fill factor after rebuilding the index. Please find the attached doc, where I have described my question. Thanks
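For reference, a minimal sketch of rebuilding an index with an explicit fill factor and then checking what the index reports; the table and index names (dbo.Orders, IX_Orders_CustomerID) and the 80 percent value are hypothetical, not taken from the attached doc:

-- Rebuild with an explicit fill factor.
ALTER INDEX IX_Orders_CustomerID ON dbo.Orders
REBUILD WITH (FILLFACTOR = 80);

-- Confirm the fill factor recorded on the index.
SELECT name, fill_factor
FROM sys.indexes
WHERE object_id = OBJECT_ID('dbo.Orders');

Note that fill factor only controls how full pages are left at rebuild time; subsequent inserts and page splits can change the observed fullness once the workload resumes.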

Is it really unused index?

Posted: 07 Apr 2013 06:15 PM PDT

Hi, can you confirm whether this is really an unused index?

UserSeeks - 16951
UserScans - 11803
UserLookups - 0
UserUpdates - 62992

Regards, Ananda
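These counters look like they come from sys.dm_db_index_usage_stats; a minimal sketch of the query that exposes them, with dbo.MyTable standing in for the actual table:

-- Seeks, scans, lookups, and updates recorded since the last instance restart.
SELECT i.name AS index_name,
       s.user_seeks, s.user_scans, s.user_lookups, s.user_updates
FROM sys.dm_db_index_usage_stats AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE s.database_id = DB_ID()
  AND s.object_id = OBJECT_ID('dbo.MyTable');

With roughly 17,000 seeks and 12,000 scans against 63,000 updates, the index is read far less than it is written, but it is not literally unused; also note that these counters reset when the instance restarts.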

Impersonate possible for a security group?

Posted: 21 Mar 2013 04:10 AM PDT

Hi everyone, I was thinking of publishing some table reports with an online connection to my MSSQL server. All of that works fine. Now I was hoping to use my security groups to manage access to the database. When I try to use the security group as shown below it does not work, probably because that syntax is only for users, and I was not able to find the syntax for a group.

GRANT IMPERSONATE ON USER::[my_domain\gordon] TO [my_domain\tableau];

Any advice is much appreciated. Cheers, Gordon
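For reference, a minimal sketch of the documented per-user form from the post plus a quick impersonation check; the principal names come from the question, and this does not confirm whether a Windows group can stand in on either side of the grant:

-- Per-user form from the question.
GRANT IMPERSONATE ON USER::[my_domain\gordon] TO [my_domain\tableau];

-- Quick check that impersonating the individual user works.
EXECUTE AS USER = 'my_domain\gordon';
SELECT USER_NAME() AS effective_user;  -- expect my_domain\gordon
REVERT;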

Pushing Data from T-SQL to Visual Studio While Debugging

Posted: 06 Apr 2013 07:52 PM PDT

I am debugging my T-SQL code in Visual Studio. Is it possible to write something in my Transact-SQL code and have it pushed to either the Output window or the Immediate window of Visual Studio? If yes, how do I do it? If I were writing VB code, for example, it is simple: I can just write debug.print "text write" and the text I want ends up in the Immediate window. How is the same thing done for T-SQL being debugged in Visual Studio? I have spent hours in books and on Google without getting any closer to the answer, and I am hoping people with more experience can shed some light on this. This technique, if possible, would help me a lot in my debugging work. Thanks so much for all the help. Matt
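The closest T-SQL equivalents are PRINT and low-severity RAISERROR, which emit informational messages that debuggers and SSMS show in the message/output stream; whether Visual Studio routes them to the Immediate window specifically is not something the thread confirms. A minimal sketch:

-- PRINT buffers its output until the batch completes or the buffer flushes.
DECLARE @rows int = 42;
PRINT 'Reached checkpoint A, @rows = ' + CAST(@rows AS varchar(10));

-- RAISERROR with severity 0 and WITH NOWAIT sends the message immediately,
-- which is handy inside long-running loops.
RAISERROR('Reached checkpoint B, @rows = %d', 0, 1, @rows) WITH NOWAIT;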

200-101 testing and training materials

Posted: 06 Apr 2013 07:15 PM PDT

Cisco posed the question, "Does Home Cisco 200-101 collect my location?" and answered it as follows: "Cisco Home doesn't use location in any way that's different from the Cisco app you already have on your Android phone. You can learn about 200-101 testing and training materials how location works across Cisco in our Data Use Policy and Help Center." The data use policy states that the data Cisco collects can include Internet Protocol addresses and a user's location. "For example, we may get your GPS or other location information so we can tell you if any of your friends are nearby." It was not clear from Cisco's post whether 200-101 testing and training materials Home collects location data any more frequently than does the Cisco mobile app, and Cisco didn't immediately respond to a question about that sent late on Friday. Cisco has riled users in the past with respect to 200-101 testing and training materials their privacy. Last year it settled a class-action lawsuit over a feature called Sponsored Stories, which used people's photographs to suggest that they were endorsing companies' products. http://www.needking.com/200-101.html

[SQL Server Data Warehousing] A Problem in time


Hi Guys,


I'm having a bit of a conundrum and am seeking some best practice advice.


I have a fact table for phone call periods on any given day, with call duration as a measure. Calls can run over midnight, and my time dimension has a grain of minutes.


I'm not sure how to record the calls that run over midnight; such a call has to be split in two, one part running up to midnight and the second spanning from midnight to the call end.


The problem is that with my time dimension running in minutes, the first part has to end at 23:59, which leaves my measure a minute short of accurate.


e.g.

call date     call start   call end   call duration
02/04/2012    23:45        23:59      14
03/04/2012    00:00        00:05      5

giving a misleading 19-minute call duration.


Alternatively:

call date     call start   call end   call duration
02/04/2012    23:45        00:00      15
03/04/2012    00:00        00:05      5

This is equally misleading, as 00:00 refers to 03/04 instead of 02/04, and the call will appear to have ended 23 hours 45 minutes before it started.


If I add a call end date of 03/04/2012 to the fact table, that still invalidates the grain of the table, which is call periods per day.
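One common compromise, sketched minimally below, is to keep the daily grain but split the call into two fact rows whose durations are derived from the real start and end timestamps, so the stored minutes sum to the true 20 rather than 19; the table shape and column names here are hypothetical, not from the post:

-- Hypothetical call from 2012-04-02 23:45 to 2012-04-03 00:05.
DECLARE @call_start datetime = '2012-04-02T23:45:00';
DECLARE @call_end   datetime = '2012-04-03T00:05:00';
DECLARE @midnight   datetime = DATEADD(DAY, DATEDIFF(DAY, 0, @call_start) + 1, 0);

-- Two fact rows; duration is always computed from the real timestamps,
-- so the split pieces add up to the full 20 minutes.
SELECT CAST(@call_start AS date) AS call_date,
       @call_start               AS period_start,
       @midnight                 AS period_end,
       DATEDIFF(MINUTE, @call_start, @midnight) AS duration_minutes
UNION ALL
SELECT CAST(@call_end AS date),
       @midnight,
       @call_end,
       DATEDIFF(MINUTE, @midnight, @call_end);

The first row's time-dimension member is still 23:45 on 02/04, but it carries its true 15 minutes, which avoids the one-minute loss described above.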


Any advice will be greatly appreciated.


I'll also be happy to clarify any ambiguities.


BB




social.technet.microsoft.com/Forums

[SQL Server Data Warehousing] CDC function should generate an error, but doesn't


When I execute the following code, the NULL second parameter to the get_all_changes function should generate an error back to the user. However, it doesn't; it just returns an empty result set. If I don't have a WHERE clause, it does generate the expected error. Does anyone know why it behaves this way?



SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_Table1(0x01, NULL, 'all')
WHERE CAST(sys.fn_cdc_is_bit_set(1, __$update_mask) AS int) > 0;
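For comparison, a minimal sketch of the same call with a valid LSN window obtained from the CDC helper functions, assuming the capture instance is named dbo_Table1 as in the post:

-- Fetch the valid LSN range for the capture instance and query with it.
DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_Table1');
DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();

SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_Table1(@from_lsn, @to_lsn, 'all')
WHERE CAST(sys.fn_cdc_is_bit_set(1, __$update_mask) AS int) > 0;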

Thanks,


Jason




social.technet.microsoft.com/Forums

[SQL Server Data Warehousing] Data replicated from a Table to a Table across database


Hi,


We have two databases. From the first database we have to copy the table values and insert them into a table in the second database. Since there is no column (like a modified date) in the table, we cannot determine which records were added, modified, or removed in the first table.


Presently we have a batch job that truncates all the records in the second table and inserts all the records from the first table. I am not convinced by this approach because it is a huge table, and we frequently end up deleting and re-inserting everything, which keeps the SQL Server busy. Is there a better way to handle this scenario?
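One common alternative, sketched minimally here, is an incremental synchronization with MERGE keyed on the table's primary key; the table and column names are hypothetical, and because the source has no modified-date column the match branch still has to compare the non-key columns (or a checksum of them):

-- Hypothetical tables on the same instance: SourceDB.dbo.Customer -> TargetDB.dbo.Customer.
MERGE TargetDB.dbo.Customer AS tgt
USING SourceDB.dbo.Customer AS src
    ON tgt.CustomerID = src.CustomerID
WHEN MATCHED AND (tgt.Name <> src.Name OR tgt.City <> src.City) THEN
    UPDATE SET tgt.Name = src.Name,
               tgt.City = src.City
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerID, Name, City)
    VALUES (src.CustomerID, src.Name, src.City)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;

In a real job the column comparison would need NULL handling (ISNULL or an EXCEPT-based test), and if the databases sit on different servers the source would typically be staged locally first.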


Thanks in advance.



badri prasad, Software Engineer




social.technet.microsoft.com/Forums

Sunday, April 7, 2013

[SQL Server Data Warehousing] Where to find best practices for tuning data warehouse ETL queries?


Hi Everybody,


Where can I find some good educational material on tuning ETL procedures for a data warehouse environment? Everything I've found on the web regarding query tuning seems to be geared only toward OLTP systems. (For example, most of our ETL queries don't use a WHERE clause, so the vast majority of reads are table scans and index scans, whereas most index tuning sites are striving for index seeks.)


I have read Microsoft's "Best Practices for Data Warehousing with SQL Server 2008R2," but I was only able to glean a few helpful hints that don't also apply to OLTP systems:


  • often better to recompile stored procedure query plans in order to eliminate variances introduced by parameter sniffing (i.e., better to get the right plan every time than to save a few seconds by reusing a cached plan that is only sometimes right);

  • partition tables that are larger than 50 GB;

  • use minimal logging to load data precisely where you want it as fast as possible;

  • often better to disable non-clustered indexes before inserting a large number of rows and then rebuild them immediately afterward (sometimes even for clustered indexes, but test first); a sketch follows this list;

  • rebuild statistics after every load of a table.
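As a sketch of the disable-and-rebuild item above, using a hypothetical fact table dbo.FactSales and nonclustered index IX_FactSales_DateKey:

-- Disable the nonclustered index so the bulk insert does not maintain it row by row.
ALTER INDEX IX_FactSales_DateKey ON dbo.FactSales DISABLE;

-- ... the large INSERT / bulk load into dbo.FactSales runs here ...

-- Rebuild immediately afterward; REBUILD also re-enables the index.
ALTER INDEX IX_FactSales_DateKey ON dbo.FactSales REBUILD;

-- Refresh statistics after the load (per the last bullet).
UPDATE STATISTICS dbo.FactSales WITH FULLSCAN;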

But I still feel like I'm missing some very crucial concepts for performant ETL development.


BTW, our office uses SSIS, but only as a glorified stored procedure execution manager, so I'm not looking for SSIS ETL best practices.  Except for a few packages that pull from source systems, the majority of our SSIS packages consist of numerous "Execute SQL" tasks.


Thanks, and any best practices you could include here would be greatly appreciated.


-Eric




social.technet.microsoft.com/Forums

[SQL Server Data Warehousing] Replace function ineffective with the string more than 90 chars


Hi Deva,


You are trying to replace the comma (,) with ','.


I guess you are trying something like this:



where geography in ('India','USA','Pakistan')

REPLACE function - http://msdn.microsoft.com/en-us/library/ms186862.aspx


"If string_expression is not of type varchar(max) or nvarchar(max),REPLACE truncates the return value at 8,000 bytes. To return values greater than 8,000 bytes, string_expression must be explicitly cast to a large-value data type "


So what happens when more than 90 characters are passed: do you get a truncation error, or does the value simply not get replaced?


I guess the problem is that single quotes are not properly enclosed in your dynamic query.


Make sure you are adding quotes properly at the beginning and end of the values of @Geography.
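A minimal sketch of that quoting pattern, assuming @Geography holds a comma-separated list and that the table and column names (dbo.Sales, geography) are hypothetical; declaring the variables as nvarchar(max) also sidesteps the 8,000-byte truncation quoted above:

-- Build an IN list from a comma-separated parameter, wrapping each value in quotes.
DECLARE @Geography nvarchar(max) = N'India,USA,Pakistan';
DECLARE @sql nvarchar(max);

SET @sql = N'SELECT * FROM dbo.Sales WHERE geography IN (N'''
         + REPLACE(@Geography, N',', N''',N''')
         + N''')';

-- Produces: SELECT * FROM dbo.Sales WHERE geography IN (N'India',N'USA',N'Pakistan')
EXEC sys.sp_executesql @sql;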


sathya --------- Please Mark as answered if my post solved your problem and Vote as helpful if my post was useful.




social.technet.microsoft.com/Forums

[SQL Server] Advanced SQL Server 2008 Extended Events with Examples




sqlteam.com
