Thursday, June 13, 2013

[SQL server issues] Commented Issue: AdventureWorksDW2008R2 DimDate table has missing dates [13383]


pavanbarnala wrote Feb 23, 2012 at 10:47 AM


Hello,
Has this been taken care of? If so, please share the latest data for DimDate. I'm working on a POC that requires dates between 2008 and 2010 as well. If it hasn't been done already, can someone let me know whether there is any way I can generate the data for those missing dates in that table?



petermyers wrote Feb 26, 2012 at 5:21 AM


The attached script will fix the problem. Use the stored procedure to extend to future dates if needed.
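The attached script is not reproduced in this digest, but a gap-filling loop for DimDate can be sketched roughly as follows. This is only a sketch: the real AdventureWorksDW DimDate table has many more columns than shown here, so the column subset is illustrative and would need to be extended.

```sql
-- Rough sketch: insert one row per missing calendar date between 2008 and 2010.
-- The real DimDate has many more attribute columns; adapt the INSERT accordingly.
DECLARE @d date = '2008-01-01';
WHILE @d <= '2010-12-31'
BEGIN
    IF NOT EXISTS (SELECT 1 FROM dbo.DimDate WHERE FullDateAlternateKey = @d)
        INSERT INTO dbo.DimDate
            (DateKey, FullDateAlternateKey, CalendarYear, MonthNumberOfYear, DayNumberOfMonth)
        VALUES
            (CONVERT(int, CONVERT(char(8), @d, 112)),  -- yyyymmdd surrogate key
             @d, YEAR(@d), MONTH(@d), DAY(@d));
    SET @d = DATEADD(day, 1, @d);
END;
```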



.

sqlserversamples.codeplex.com

[SQL server issues] Created Issue: SQL Server 2008 Developer edition sample database install error [18174]




I'm trying to install the AdventureWorks sample database and I get an error.
I successfully downloaded the .mdf file; however, when I attach it from SSMS (Attach, Add, then OK), I get the error:
 
Msg 5172, Level 16, State 15, Line 1
The header for file 'C:\Program Files\Microsoft SQL Server\AdventureWorks2012_Data.mdf' is not a valid database file header. The FILE SIZE property is incorrect.







.

sqlserversamples.codeplex.com

[SQL server issues] Created Issue: AdventureWorksDW2012 Download file corrupt [19063]



There seems to be an issue with the AdventureWorksDW2012 data file download. When I try to run this SQL statement:

CREATE DATABASE AdventureWorksDW2012 ON (FILENAME = 'E:\MSSQL11.SQL02\MSSQL\DATA\AdventureWorksDW2012_Data.mdf') FOR ATTACH_REBUILD_LOG;

This is the error I get:

Msg 5172, Level 16, State 15, Line 1
The header for file 'E:\MSSQL11.SQL02\MSSQL\DATA\AdventureWorksDW2012_Data.mdf' is not a valid database file header. The FILE SIZE property is incorrect.

Thanks, Dan


.

sqlserversamples.codeplex.com

[SQL Server Data Warehousing] In an SSIS package, how can we call a macro?




Hi All


I have a scenario that requires calling an Excel macro from SSIS. Can anyone help me with the steps for adding the PIA library and reference in a Script Task? I have done some work, but I am getting the following error:


Error: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.InvalidCastException: Unable to cast COM object of type 'System.__ComObject' to class type 'Microsoft.Office.Interop.Excel.ApplicationClass'. COM components that enter the CLR and do not support IProvideClassInfo or that do not have any interop assembly registered will be wrapped in the __ComObject type. Instances of this type cannot be cast to any other class; however they can be cast to interfaces as long as the underlying COM component supports QueryInterface calls for the IID of the interface.


   at ST_9f928eae9e0d4fb3b1e14b45d2f15b3c.vbproj.ScriptMain.Main()


   --- End of inner exception stack trace ---


   at System.RuntimeMethodHandle._InvokeMethodFast(Object target, Object[] arguments, SignatureStruct& sig, MethodAttributes methodAttributes, RuntimeTypeHandle typeOwner)


   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture, Boolean skipVisibilityChecks)


   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)


   at System.RuntimeType.InvokeMember(String name, BindingFlags bindingFlags, Binder binder, Object target, Object[] providedArgs, ParameterModifier[] modifiers, CultureInfo culture, String[] namedParams)


   at Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTATaskScriptingEngine.ExecuteScript()


Thanks



.

social.technet.microsoft.com/Forums

[SQL Server Data Warehousing] SSAS Cube not taking the data




Please let us know:


1. Are you getting any errors?


2. How frequently is the data updated in your view, and how long have you been using the cube?


3. Have you scheduled the cube to be processed regularly?



.

social.technet.microsoft.com/Forums

[SQL Server Data Warehousing] Collecting data from multiple sites




Your question has two different parts:


1. Archiving the existing databases


2. Integrating, cleansing, de-duplicating (maybe), data warehousing, ETL, cubes


For the first part, there are many published articles on archiving SQL Server databases. For example:


http://www.mssqltips.com/sqlservertip/1121/archiving-data-in-sql-server/


I am going to answer the second part of your question, which is more related to data warehousing and BI.


First of all, you need to integrate the data from those several hundred locations into one integrated database (this can be called the integration or staging area). SSIS is a good data transfer tool that helps a lot in moving data from those sources into the integrated database. You can use different methods for transferring data in SSIS: FTP, exported CSV files, direct connections to the SQL Server databases (if a live connection exists), web services, and so on.


You will also need to design the data warehouse. Kimball is one of the most common methodologies for designing a data warehouse based on business requirements; you can use Kimball's book, The Data Warehouse Toolkit, to learn about dimensional modeling.


After designing the data warehouse, you will need an ETL (Extract, Transform, Load) tool to get data from the integrated database (or staging area) into the data warehouse. SSIS, again, is a very powerful ETL tool; it provides a massive number of data transformation components for use in the ETL process.


For data cleansing, I recommend Data Quality Services (DQS), which is one of the SQL Server 2012 services. With this service you can create knowledge bases for different business domains, and somebody from the business plays the role of data steward to maintain each knowledge base. In SSIS there is a DQS Cleansing component that talks to DQS and applies the data quality rules from the knowledge bases to the incoming data stream in the ETL process; as a result, you will have cleaner data.


For data consistency and governance, the best way to keep data consistent is a master data management (MDM) tool; Microsoft's MDM tool is Master Data Services (MDS). This tool is part of the SQL Server installation as well. You create data models with entities, relationships between entities, and business rules. MDS provides ONE SINGLE VERSION OF TRUTH for your data: for example, only a single version of customer information, or only a single version of product information. This helps with data consistency.


Finally, for working with OLAP cubes, you can use Analysis Services (SSAS) to build OLAP cubes on top of your data warehouse.


And if you want to create reports and dashboards from your cube, there are many data visualization tools in the Microsoft stack, such as SSRS, PerformancePoint, Power View, Excel, and Excel Services.



http://www.rad.pasfu.com



.

social.technet.microsoft.com/Forums

[MS SQL Server] Fragmented pages in tables not changing


Fragmented pages in tables not changing

Posted: 13 Jun 2013 05:24 AM PDT

I have several tables in a database that show a high percentage of fragmentation. I created a maintenance plan and it runs successfully; however, the fragmentation is not changing. For example, the table below has 52% fragmentation and a page count of 109,208, but the rebuild index task doesn't change it:

objectname  objectid   indexid  partitionnum  frag              pagecount
pricevalue  832955731  1        1             52.0331600907694  109288

I'm just wondering if there's anything I need to do or that I'm doing wrong. Thank you.
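As a way to check the numbers outside the maintenance plan, the fragmentation can be measured and a rebuild issued by hand. A minimal sketch, assuming the dbo schema for the pricevalue table mentioned above:

```sql
-- Measure fragmentation for one table's indexes (LIMITED mode is the cheapest scan).
SELECT i.name, ps.avg_fragmentation_in_percent, ps.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.pricevalue'),
                                    NULL, NULL, 'LIMITED') AS ps
JOIN sys.indexes AS i
  ON i.object_id = ps.object_id AND i.index_id = ps.index_id;

-- Rebuild every index on the table, then re-run the query above to compare.
ALTER INDEX ALL ON dbo.pricevalue REBUILD;
```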

SSIS Job - Intermittent Login Timeout Failures

Posted: 30 Nov 2010 11:48 PM PST

I have been running into a weird problem. I have SSIS jobs scheduled to run on a regular basis. Most of the time they run without any problem, but lately I have been getting more and more sporadic job failures like the message below. The failure does not persist for long, as the same job will run fine later without any intervention. Everything else on the server seems fine, and I don't see any info in the SQL error log. Any ideas? Thanks!

Microsoft (R) SQL Server Execute Package Utility Version 10.0.2531.0 for 64-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 9:23:26 AM Could not load package "xxxxxxx" because of error 0xC0014062. Description: The LoadFromSQLServer method has encountered OLE DB error code 0x80004005 (Login timeout expired). The SQL statement that was issued has failed. Source: Started: 9:23:26 AM Finished: 9:23:42 AM Elapsed: 15.039 seconds. The package could not be loaded. The step failed.

Importing SSIS Package fails with Invalid characters

Posted: 30 Aug 2011 07:37 AM PDT

I've recently installed SQL Server 2008 R2 with SP2 and SSIS. When we try to import SSIS packages from SQL 2005, we get the following error: "Object name "䎹녞숋㤧 " is not valid. Name must contain only alphanumeric characters or underscores "_"."

Nothing unusual about the install; the collation of SQL is Latin1_General_CI_AS. The package is going to the file system, the package name is Package1, and the protection level is "Keep protection level of the original package".

Has anyone seen anything like this? Any help would be appreciated.

Cheers,
Leo

Help with Plan Cache Query

Posted: 12 Jun 2013 12:29 PM PDT

I have had times when I would like to get the query plan from the plan cache. I have Query 1 below (from MSDN), which gives me the query text and last execution times but not the query plan, and Query 2 below (also from MSDN), which gives me the query plans but not the last execution times or other info like read/write stats. I am having to run a separate query [SELECT * FROM sys.dm_exec_query_plan(plan_handle)] using the returned plan handle to get this. Is there any way to get everything in one query? I tried to cross apply sys.dm_exec_query_stats to Query 2 to get the execution times, but that was a guess and not right. Thanks.

Query 1:

SELECT dbid, sql_handle,
       (SELECT TOP 1 SUBSTRING(s2.text, statement_start_offset / 2 + 1,
           ((CASE WHEN statement_end_offset = -1
                  THEN (LEN(CONVERT(nvarchar(max), s2.text)) * 2)
                  ELSE statement_end_offset END) - statement_start_offset) / 2 + 1)) AS sql_statement,
       execution_count, plan_generation_num, last_execution_time,
       total_worker_time, last_worker_time, min_worker_time, max_worker_time,
       total_physical_reads, last_physical_reads, min_physical_reads, max_physical_reads,
       total_logical_writes, last_logical_writes, min_logical_writes, max_logical_writes,
       plan_handle
FROM sys.dm_exec_query_stats AS s1
CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS s2
WHERE s2.objectid IS NULL
ORDER BY s1.last_execution_time DESC

Query 2:

SELECT UseCounts, Cacheobjtype, Objtype, TEXT, query_plan, *
FROM sys.dm_exec_cached_plans
CROSS APPLY sys.dm_exec_sql_text(plan_handle)
CROSS APPLY sys.dm_exec_query_plan(plan_handle)
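Since sys.dm_exec_query_stats exposes both sql_handle and plan_handle, one plausible way to get everything at once is a single statement with two CROSS APPLYs, one for the text and one for the plan. A sketch:

```sql
-- Combine execution stats, query text, and query plan in one result set:
-- dm_exec_query_stats supplies both handles needed by the two DMFs.
SELECT
    qs.last_execution_time,
    qs.execution_count,
    qs.total_physical_reads,
    qs.total_logical_writes,
    st.text       AS sql_text,
    qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle)    AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
ORDER BY qs.last_execution_time DESC;
```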

Backup failure with fulltext catalog

Posted: 12 Jun 2013 09:18 PM PDT

Hi,

The full backup job failed with the error below:

Message
Executed as user: NT AUTHORITY\SYSTEM. The backup of the file or filegroup "XXXXXXX_event" is not permitted because it is not online. You can use the FILEGROUP or FILE clauses of BACKUP to restrict the selection to include only online data. [SQLSTATE 42000] (Error 3007) BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 3013). The step failed.

I checked, and the full-text catalog file is online. Please let me know how to resolve this issue.

Thanks,
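As a first diagnostic step, the state of each file in the database can be checked directly, since any file that is not ONLINE will make a plain BACKUP DATABASE fail with error 3007. A sketch (the database and path names are placeholders, not the poster's):

```sql
-- Run in the affected database: anything whose state_desc is not ONLINE
-- will block a full-database backup.
SELECT name, state_desc
FROM sys.database_files;

-- Workaround suggested by the error text itself: restrict the backup to
-- online filegroups, e.g. just the primary filegroup.
BACKUP DATABASE MyDb
    FILEGROUP = 'PRIMARY'
    TO DISK = 'C:\Backups\MyDb_primary.bak';
```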

Enabled LPIM and disk's latency went down to zero?

Posted: 12 Jun 2013 06:21 AM PDT

Hi,

I wanted to start a short discussion about Lock Pages in Memory (LPIM) and any possible (positive) impact on disk latency. Recently I contacted MS to get some help with a cluster issue. Long story short, the case is still open, but it looks like it has been resolved. However, they did recommend the use of LPIM on my SQL box.

Now, I am running Standard edition of SQL 2012 on a two-node cluster. The cluster does not lack RAM (96 GB per node, with 64 GB assigned to SQL, dedicated box). But just today, a few days after that change, I noticed in my Red Gate monitoring tool that read and write latency on my data LUN went down to zero! Could it be that this change put most of my data into RAM, improving latency in such a dramatic way? Or should I look for a Red Gate patch / issue? It's just hard to believe that a simple change made such a huge improvement.

I know how LPIM works, and that LPIM will prevent the OS from trimming any memory from SQL Server, etc., but I find these results really outstanding and dramatic, especially on a dedicated box.
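One quick way to confirm that LPIM is actually in effect on the instance is to check the locked-page counter in the process memory DMV; a sketch:

```sql
-- locked_page_allocations_kb is greater than zero when SQL Server
-- is actually using locked pages (i.e. LPIM is in effect).
SELECT locked_page_allocations_kb
FROM sys.dm_os_process_memory;
```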
