Monday, August 19, 2013

[SQL 2012] SSRS 3.0 Parameters input issue

SSRS 3.0 Parameters input issue

Posted: 18 Aug 2013 09:32 PM PDT

Hi all,

I am building a report that compares figures for the last 3 months and the last 12 months. For this I will build 3 datasets with exactly the same structure; the only difference is that each one needs to return different data.

The way I am trying to do it is to go to a specific dataset and, on the Parameters tab, choose the parameter and set an expression like the one below:

=Parameters!Period.Value-3

The expression above would be for the dataset that needs to show the figures for the last quarter. The Period field is an integer and every month has a number attached, so I figured that if this specific dataset has that instruction for the parameter on the Parameters tab it should work... It doesn't. Does anyone have a suggestion for this?

Best regards,
Daniel
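A minimal sketch of one possible workaround, assuming the Period column is a sequential month number and the report parameter is passed through unchanged as @Period: do the offset arithmetic inside the dataset query instead of in the Parameters tab expression. All object names below (dbo.MonthlyFigures, PeriodNum, Amount) are hypothetical placeholders, since the real schema is not shown in the post.

-- Hypothetical dataset query for the "last quarter" dataset.
-- The Parameters tab maps the report parameter straight through
-- (=Parameters!Period.Value) and the -3 offset is applied here,
-- so no arithmetic is needed in the parameter-mapping expression.
-- Table and column names are placeholders.
SELECT
    PeriodNum,
    SUM(Amount) AS TotalAmount
FROM dbo.MonthlyFigures
WHERE PeriodNum >  @Period - 3   -- last 3 completed periods...
  AND PeriodNum <= @Period       -- ...up to and including the selected one
GROUP BY PeriodNum;

The 12-month dataset would use the same query with an offset of 12, keeping all three datasets structurally identical.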

async_network_io high on reading data from SQL 2012 instance

Posted: 19 Aug 2013 02:08 AM PDT

Hi,

After hearing some colleagues from development complain about a "slow database", I started investigating and noticed a consistently high level of NETWORK_IO waits over TCP/IP.

Server configuration: [SQL2012TEST]
2x 6-core CPUs, HT enabled, 96GB of RAM, 10GbE (MTU 1500), X-cable 1GbE (MTU 9000), RAID1, RAID5, JBOD and SSD drive configurations.
SQL 2012 SP1 11.0.3368

Data source table: [dbo].[TestData] (bigint, datetime2(7), bigint, tinyint, float, int, int)
106'847'023 rows @ ~42 bytes/row, 7 columns, total bytes: 4'705'978'766

Initial test: running bcp from my workstation (1GbE network):
bcp "dbo.TestData" out nul -n -T -SSQL2012TEST -dTESTDATA
Total duration: 330 sec
Avg MBytes / rows per sec: 12.9 / 324'000

Repeating the same test with a simple SqlDataReader implementation shows even worse results:
Total duration: 398 sec
Avg MBytes / rows per sec: 10.7 / 267'000

I can rule out disk issues, as bcp is writing to NUL and the SqlDataReader writes to BinaryWriter.Null. I can rule out network issues (so far); NTttcp reports a consistent throughput of 905 MBit with almost zero errors/retransmits. The XE session (sqlos.wait_info) tells me that there is only one wait type: (ASYNC_)NETWORK_IO. Accumulated total wait time for the SPID running the test:
bcp: 160 sec
SqlDataReader: 220 sec

So I am wondering, what is the issue here? What is causing the high wait time, deserialization of the TDS stream?
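One way to cross-check those numbers, sketched under the assumption that the test run is the only significant load on the instance: snapshot sys.dm_os_wait_stats before the bcp run and diff it afterwards. This is instance-wide rather than per-SPID, so it is only a rough sanity check on the per-session XE figures quoted above.

-- Rough cross-check of the XE numbers: instance-wide wait deltas around the test.
-- Assumes the bcp/SqlDataReader run is the only meaningful activity on the instance.
SELECT wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms
INTO #waits_before
FROM sys.dm_os_wait_stats;

-- ... run the bcp or SqlDataReader test here ...

SELECT a.wait_type,
       a.wait_time_ms        - b.wait_time_ms        AS wait_time_ms_delta,
       a.signal_wait_time_ms - b.signal_wait_time_ms AS signal_wait_ms_delta,
       a.waiting_tasks_count - b.waiting_tasks_count AS waits_delta
FROM sys.dm_os_wait_stats AS a
JOIN #waits_before AS b ON b.wait_type = a.wait_type
WHERE a.wait_time_ms - b.wait_time_ms > 0
ORDER BY wait_time_ms_delta DESC;

DROP TABLE #waits_before;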

Questions: SOS_SCHEDULER_YIELD, CPU Affinity, Core Licensing

Posted: 18 Aug 2013 08:32 PM PDT

Hello there. Some of this post might be a bit rambling, but bear with me; if anyone wants further information, let me know and I can dig it up.

One of our clients' servers is showing a reasonable amount of SOS_SCHEDULER_YIELD waits. It's the highest wait type present in the system, at around 22% of total waits, and the waits are clearly visible when I run an intensive query and take a delta of the wait stats. I've seen 17 seconds of SOS_SCHEDULER_YIELD waits reported for a 15-second period; correct me if I'm wrong, but this basically equates to at least an entire core being idle for the entire operation. We also see a small amount of ASYNC_NETWORK_IO, but this is expected with the particular revision of the software.

I'm trying to isolate why these waits are occurring, as it appears to be CPU contention slowing things down. There is plenty of headroom in the IO subsystem and we have about double the buffer pool that we realistically need; the vast majority of reads appear to be logical, not physical. We don't see continually high CPU usage: it averages around 10-20% across the system, which is an 8-vCPU VMware system. However, we've seen similar systems where increasing the number of cores on the VM provides a noticeable performance benefit. The problem I have here is that most of our clients run with 4 cores, and we then suggest an upgrade to 8 cores if they are experiencing performance issues. We simply shouldn't be seeing CPU contention on this system.

A DBA from one of our sites has emailed us the following advice:

"Just another quick thought - with VMs we have found setting the processor affinity in the MSSQL engine properties is a vital config for VMs and MSSQL. This means that MSSQL VMs will exclusively use the same processor for each thread rather than potentially bouncing it around like a yo-yo across the available CPU resources that VMware manages at a much lower level; it can result in all sorts of odd CPU bottleneck issues and load issues like high CPU context switching, interrupt queues and so forth, which are detrimental across the whole VM node."

Now I've read up on this and it seems like a very bad idea to be considering; however, in the context of SQL Server running on VMware, does that statement have any weight to it?

For some background: at the back end, the host is a dual 6-core system with hyperthreading enabled. The SQL Server has 8 cores assigned and 2 other servers have 4 cores each assigned (a total of 16); one of these servers (apart from the SQL Server) might be applying reasonable pressure to the host. I'm inclined to suggest that we subscribe the hosts at 1:1 to eliminate competition for CPU resources on the host.

I also have a query about core licensing. If a server is only licensed for 4 cores but has 8 vCPUs allocated, does SQL apply licensing restraints and only utilise 4 cores, or do all of them get used? On the system in question I'm seeing 8 active schedulers, which seems to indicate the latter. However, I'm not 100% positive they've gone for core licensing as opposed to Server+CAL.
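Two quick checks that bear on both the scheduler question and the contention question, sketched against the standard DMVs only (no assumptions about the client's workload or schema): counting the visible schedulers and their online state shows how many cores the instance is actually scheduling work on, and the signal-wait share gives a rough indication of CPU pressure inside the instance. Neither will show contention at the VMware host level; that needs the hypervisor's own counters.

-- Which schedulers does this instance expose, and are any offline?
-- VISIBLE OFFLINE schedulers would suggest a core-based licence or affinity cap.
SELECT scheduler_id, cpu_id, status, is_online,
       current_tasks_count, runnable_tasks_count
FROM sys.dm_os_schedulers
WHERE status IN ('VISIBLE ONLINE', 'VISIBLE OFFLINE');

-- Rough CPU-pressure indicator: share of total wait time that is signal wait,
-- i.e. time spent waiting for a scheduler after the resource was already available.
SELECT CAST(100.0 * SUM(signal_wait_time_ms) / NULLIF(SUM(wait_time_ms), 0)
            AS decimal(5,2)) AS signal_wait_pct
FROM sys.dm_os_wait_stats;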

Reusable component in SSIS/SSRS/SSAS

Posted: 18 Aug 2013 11:29 PM PDT

Hi Team,

I want to know whether we have any reusable components (like macros, etc.) for SSIS/SSRS/SSAS. If yes, please let me know; I'm interested to know their functionality. Thanks.

Cheers,
Jim

How do I use Analysis Services?

Posted: 18 Aug 2013 06:10 AM PDT

Hi,

I have an Operational Data Store (ODS) instance consisting of several relational databases that are populated daily from exports from the OLTP databases; I use SSIS 2012 for this. The databases in the ODS are replicas of the databases in the OLTP system. I also hold web analytics etc. in the ODS, and I will be combining data from all these sources for MI reporting and also for data analysis and visualization through a BI dashboard.

There won't be much data to start off with, but I want to build the system for future use when there will be more data. Do I build a data warehouse from the ODS databases, do I build data marts, or do I create logical views on top of the ODS databases that can be used by the MI reporting and BI tools? I think performance at this point in time will be met by simple logical views. Is this wise, or should I invest my time in designing a proper data warehouse from the ODS now?

Also, should I import data from the ODS into the data warehouse using Analysis Services? Is this the correct use of it?

Thanks for any help offered.
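For illustration only, a minimal sketch of the "logical views" option being weighed up here: a reporting view that joins ODS tables from two source systems. Every database, table and column name below (ODS_Sales, ODS_Web, dbo.Orders, dbo.PageViews and so on) is hypothetical, since the real schema is not described in the post.

-- Hypothetical reporting view over ODS tables; all names are placeholders.
-- The idea is to expose a stable, report-friendly shape over the replicated
-- OLTP data without building a full data warehouse yet.
CREATE VIEW dbo.vw_DailyOrdersWithWebVisits
AS
SELECT
    o.OrderDate,
    o.CustomerID,
    SUM(o.OrderTotal)               AS TotalOrderValue,
    COUNT_BIG(DISTINCT w.SessionID) AS WebSessions
FROM ODS_Sales.dbo.Orders       AS o
LEFT JOIN ODS_Web.dbo.PageViews AS w
       ON w.CustomerID = o.CustomerID
      AND CAST(w.ViewDate AS date) = o.OrderDate
GROUP BY o.OrderDate, o.CustomerID;

If a view like this later becomes too slow, the same shape could be materialised into a data mart table loaded by SSIS without changing the reports that reference it.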
