Showing posts with label SQL Server Data Warehousing. Show all posts

Friday, June 14, 2013

[SQL Server Data Warehousing] Query Statistics History Report Duration and Execution Time taken


Query Statistics History Report Duration and Execution Time taken



Hi everyone


I have about 4 jobs calling the same stored procedure, but each run processes a different set of IDs. When Job1 runs it may take 45 minutes to complete; Job2 can take 1 hour 20 minutes. They basically overlap with each other but touch different IDs that need to be processed.


Question 1


The stored procedure creates a number of #temp tables, and I believe that each time a job runs it goes through the same code and creates its own separate #temp tables, e.g. Job1 runs and creates #temp_1, Job2 runs and creates #temp_2, Job3 runs and creates #temp_3 in memory.


Can you please confirm whether my understanding is right?
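For reference, here is a small sketch (table name hypothetical) showing how SQL Server scopes local #temp tables per session, which is what determines whether the jobs get separate copies:

```sql
-- Each session/connection that runs this gets its own private copy,
-- so two jobs executing the same procedure do not share #work.
CREATE TABLE #work (Id int NOT NULL);

-- Internally SQL Server appends a unique suffix to the name, visible
-- in tempdb's catalog (the tables live in tempdb, spilling to disk
-- as needed, rather than being purely in-memory):
SELECT name
FROM tempdb.sys.objects
WHERE name LIKE '#work%';
```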


Question 2


When I check the Query Statistics History report, I see several queries from the stored procedure showing as running for hours, but the jobs complete in less than an hour. I say hours on the assumption that the time shown there is in seconds. First of all, is the total duration / execution time shown in milliseconds or seconds?


So how is it possible that a query shows as running for 6 hours when none of the jobs runs for 6 hours straight?
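For reference, the plan-level DMV that such reports are typically built on, sys.dm_exec_query_stats, records elapsed time in microseconds and accumulates it across every execution of a plan, so a query's total duration can legitimately exceed the length of any single job run. A sketch of how to read it:

```sql
-- total_elapsed_time is in microseconds and is cumulative across
-- all executions of the cached plan, so divide by execution_count
-- to get a per-execution average.
SELECT TOP (10)
    qs.execution_count,
    qs.total_elapsed_time / 1000000.0 AS total_elapsed_sec,
    qs.total_elapsed_time / qs.execution_count / 1000000.0
        AS avg_elapsed_sec_per_execution,
    SUBSTRING(st.text, 1, 200) AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_elapsed_time DESC;
```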


Thanks much


-Sarah





social.technet.microsoft.com/Forums

[SQL Server Data Warehousing] Best way to change the datatype of Column of Table


Best way to change the datatype of Column of Table



Are you allowed any sort of outage on this table?  If so, consider creating a new table with the correct data type, bulk loading the data into it, then dropping the old table and renaming the new one.  I don't think 5 million records is a large enough volume to warrant a more complex solution.


You can also run an ALTER TABLE ... ALTER COLUMN statement, as long as the data types you are converting from and to are compatible and the existing data is compatible with the new type.  That assumes the column is not part of the primary key, an index, or a constraint.
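Both approaches might look like this (table and column names are hypothetical):

```sql
-- Option 1: in-place change. Works when the old and new types are
-- compatible and no index/constraint depends on the column.
ALTER TABLE dbo.Orders
    ALTER COLUMN Amount decimal(18, 2) NOT NULL;

-- Option 2: rebuild and swap, if a short outage is acceptable.
CREATE TABLE dbo.Orders_New
(
    OrderID int NOT NULL PRIMARY KEY,
    Amount  decimal(18, 2) NOT NULL
);

INSERT INTO dbo.Orders_New (OrderID, Amount)
SELECT OrderID, CAST(Amount AS decimal(18, 2))
FROM dbo.Orders;

DROP TABLE dbo.Orders;
EXEC sp_rename 'dbo.Orders_New', 'Orders';
```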




social.technet.microsoft.com/Forums

Thursday, June 13, 2013

[SQL Server Data Warehousing] How can we call a macro from an SSIS package


How can we call a macro from an SSIS package



Hi All


I have a scenario where I need to call an Excel macro from SSIS. Can anyone help me with the steps to add the PIA library and reference it in a Script Task? I have done some work, but I am getting the following error:


Error: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.InvalidCastException: Unable to cast COM object of type 'System.__ComObject' to class type 'Microsoft.Office.Interop.Excel.ApplicationClass'. COM components that enter the CLR and do not support IProvideClassInfo or that do not have any interop assembly registered will be wrapped in the __ComObject type. Instances of this type cannot be cast to any other class; however they can be cast to interfaces as long as the underlying COM component supports QueryInterface calls for the IID of the interface.


   at ST_9f928eae9e0d4fb3b1e14b45d2f15b3c.vbproj.ScriptMain.Main()


   --- End of inner exception stack trace ---


   at System.RuntimeMethodHandle._InvokeMethodFast(Object target, Object[] arguments, SignatureStruct& sig, MethodAttributes methodAttributes, RuntimeTypeHandle typeOwner)


   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture, Boolean skipVisibilityChecks)


   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)


   at System.RuntimeType.InvokeMember(String name, BindingFlags bindingFlags, Binder binder, Object target, Object[] providedArgs, ParameterModifier[] modifiers, CultureInfo culture, String[] namedParams)


   at Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTATaskScriptingEngine.ExecuteScript()


Thanks




social.technet.microsoft.com/Forums

[SQL Server Data Warehousing] SSAS Cube not taking the data


SSAS Cube not taking the data



Please let us know:


1. Are you getting any errors?


2. How frequently is the data updated in your view, and for how long have you been using the cube?


3. Have you scheduled the cube to be processed regularly?




social.technet.microsoft.com/Forums

[SQL Server Data Warehousing] Collecting data from multiple sites


Collecting data from multiple sites



Your question has two different parts:


1. Archiving the existing databases


2. Integrating, cleansing, de-duplicating (maybe), data warehousing, ETL, and cubes


For the first part, there are many published articles on archiving SQL Server databases, for example:


http://www.mssqltips.com/sqlservertip/1121/archiving-data-in-sql-server/


I am going to answer the second part of your question, which is more related to data warehousing and BI.


First of all, you need to integrate the data from those several hundred locations into a single integrated database (often called an integration or staging area). SSIS is a good data transfer tool that helps a lot in moving data from those sources into the integrated database. You can use different methods for transferring data in SSIS: FTP, exported CSV files, direct connections to the source SQL Server databases (if live connections exist), web services, and so on.


You will also need to design the data warehouse. Kimball's is one of the most common methodologies for designing a data warehouse based on business requirements; you can use Kimball's book The Data Warehouse Toolkit to learn about dimensional modeling.


After designing the data warehouse, you will need an ETL (Extract, Transform, Load) tool to move data from the integrated (staging) database into the data warehouse. Again, SSIS is a very powerful ETL tool; it provides a large set of data transformation components for use in the ETL process.


For data cleansing, I recommend Data Quality Services (DQS), which is one of the SQL Server 2012 services. With DQS you create knowledge bases for different business domains, and somebody from the business plays the role of data steward to maintain each knowledge base. SSIS includes a DQS Cleansing component that talks to DQS and applies the data quality rules of a knowledge base to the incoming data stream in the ETL process, so the result is cleaner data.


For data consistency and governance, the best way to keep data consistent is a master data management (MDM) tool; Microsoft's MDM tool is Master Data Services (MDS), which is also part of the SQL Server installation. In MDS you create data models with entities, relationships between entities, and business rules. MDS provides a single version of the truth for your master data, meaning only one version of each customer's or product's information, which helps with data consistency.


Finally, for OLAP cubes, you can use Analysis Services (SSAS) to build cubes on top of your data warehouse.


And if you want to create reports and dashboards from your cube, there are many data visualization tools in the Microsoft stack, such as SSRS, PerformancePoint, Power View, Excel, and Excel Services.



http://www.rad.pasfu.com




social.technet.microsoft.com/Forums

[SQL Server Data Warehousing] Change Data Capture - Pros/Cons/Gotchas


Change Data Capture - Pros/Cons/Gotchas



My company is thinking of implementing Change Data Capture. I can find many online articles covering the basics of Change Data Capture, but no real-life examples of using it: the pros and cons, limitations, or "gotchas" encountered during actual implementation and usage. I could use such information as we determine whether Change Data Capture is the right solution for our needs.


Some details about our scenario:


1. Source data: We currently have about 200 entities that use one of our applications. The entities are all functioning units of a larger, single business entity, but they are spread out geographically. Each entity's copy of the application stores its data in a separate database, although all copies of the database have identical structures and tables. The entities post summary data on a routine basis to a couple of main servers; however, this data is still stored in a separate database per entity on those servers - thus, still about 200 databases.
2. What we would like to do: Bring this summary data together into a single relational data store, and from there into a dimensional database. We do not want to, nor can we, eliminate the source databases, on either the geographically distributed sites or the main servers, because we don't "own" them.


Is this a feasible scenario for Change Data Capture? We would need to include a column (or columns) identifying the data source, because the data, although placed in a single database, must still be identifiable by source, and data from one source must never overwrite data from a different source. As I understand it, Change Data Capture would need to be enabled on the 200+ databases on the main servers, from which we would then likely use SSIS to process the data and post it to the new database we would develop.


I would appreciate any advice, based on understanding and experience, about important things to consider, things to avoid, "gotchas", best practices, and so on. We have worked quite a bit with SSIS and with developing dimensional data structures, so although any major comments there are welcome, our primary "need to know" is how (and whether) Change Data Capture can be used in our scenario, as we have no experience with it.
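For reference, CDC is enabled per database and then per table with two system stored procedures, so in this scenario the setup would have to be repeated across the 200+ databases (database and table names below are hypothetical):

```sql
USE MainServerDb001;   -- one of the ~200 databases (hypothetical name)
GO
-- Enable CDC at the database level (requires sysadmin):
EXEC sys.sp_cdc_enable_db;
GO
-- Enable CDC for one table (requires db_owner):
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'SummaryData',   -- hypothetical table name
    @role_name     = NULL;             -- no gating role
GO
```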

Thanks for your assistance,
Randy




social.technet.microsoft.com/Forums


[SQL Server Data Warehousing] How to delete part of the data from a partitioned table


How to delete part of the data from a partitioned table



We want to delete data older than 2006 from a partitioned table by switching partitions out. There is 2003, 2004, and 2005 data that we want to switch out; this comes to about 1.5 billion rows.


While working on switching partitions, I found that up until 20071229 all the data is in one partition:


select $PARTITION.TF_TBL_Fact_Store_Satles_trans(20071229) returns 1, so switching that partition out would remove data all the way up to that point, and I need to retain the 2006 and 2007 data.


What is the best approach to this?


The 2006 and 2007 data is about 1.05 billion rows, and the data older than 2006 is about 1.4 billion rows (we need to get rid of the 1.4 billion rows and keep 2006-2007).


How best can I handle this situation? Any input is much appreciated.
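One common pattern for this situation, sketched with hypothetical object names (since SWITCH can only move whole partitions, the mixed pre-2006/2006-2007 partition has to be switched out and the rows to keep re-inserted):

```sql
-- 1. Create a staging table with the same schema, indexes, and
--    filegroup as partition 1 of the fact table (not shown).

-- 2. Switch the whole partition out; this is a metadata-only
--    operation and is effectively instant:
ALTER TABLE dbo.FactStoreSales
    SWITCH PARTITION 1 TO dbo.FactStoreSales_Staging;

-- 3. Re-insert only the rows to retain (2006 and later):
INSERT INTO dbo.FactStoreSales
SELECT *
FROM dbo.FactStoreSales_Staging
WHERE DateKey >= 20060101;

-- 4. Drop the staging table, deallocating the pre-2006 rows
--    without logging a billion individual deletes:
DROP TABLE dbo.FactStoreSales_Staging;
```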


Thanks,


Suresh.



Suresh Channamraju




social.technet.microsoft.com/Forums

Monday, June 10, 2013

[SQL Server Data Warehousing] T-SQL SCD2


T-SQL SCD2



Here is the query I am using, in case it helps.


To give you a little more context on my request: our data warehouse takes a very traditional approach to ETL; we have UPDATE statements that do the lookups and update the foreign keys.


My first UPDATE statement finds an employee based on the data entry date:



UPDATE dbo.SFWP
SET EmployeeID = we2.EmployeeSK
FROM dbo.SFWP a
INNER JOIN WMDM.dbo.WMDM AT
    ON a.AssignedTaskID = AT.AssignedTaskID
INNER JOIN WMDM.dbo.Employee we1
    ON AT.EmployeeID = we1.SK
INNER JOIN WMDM.dbo.Employee we2
    ON we1.NK = we2.NK
    AND Src_EntryDate >= we2.EffectiveBeginDTM
    AND Src_EntryDate <= we2.EffectiveEndDTM

Apparently some employees cannot be matched, because our employee table has missing records and so this condition is not met:


and  Src_EntryDate >= we2.EffectiveBeginDTM and
Src_EntryDate <= we2.EffectiveEndDTM


So in my new query I am trying to take the previous or next record, find the employee ID, and update it.



WITH CustCTE AS
(
    SELECT
        we2.SK,
        we2.NK,
        we2.EffectiveBeginDTM,
        we2.EffectiveEndDTM,
        ROW_NUMBER() OVER (PARTITION BY we2.NK
                           ORDER BY we2.EffectiveBeginDTM) AS RowNum
    FROM dbo.SFWP a
    INNER JOIN WMDM.dbo.WMDM AT
        ON a.AssignedTaskID = AT.AssignedTaskID
    INNER JOIN WMDM.dbo.Employee we1
        ON AT.EmployeeID = we1.SK
    INNER JOIN WMDM.dbo.Employee we2
        ON we1.NK = we2.NK
    WHERE
        Src_EntryDate < we2.EffectiveBeginDTM OR
        Src_EntryDate > we2.EffectiveEndDTM
        AND a.EmployeeID IS NULL
),
CustCTE1 AS
(
    SELECT
        a.SK,
        a.NK,
        a.EffectiveBeginDTM,
        a.EffectiveEndDTM,
        a.RowNum
    FROM CustCTE a

    UNION ALL

    SELECT
        Curr.SK,
        Curr.NK,
        Curr.EffectiveBeginDTM,
        Curr.EffectiveEndDTM,
        Curr.RowNum
    FROM CustCTE Curr
    INNER JOIN CustCTE1 Prev
        ON Curr.NK = Prev.NK
        AND Curr.RowNum = Prev.RowNum - 1
)
SELECT * FROM CustCTE1

But I am not getting the result I expect.
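One detail worth checking in the WHERE clause of the CTE: in T-SQL, AND binds more tightly than OR, so without parentheses the filter may not mean what was intended:

```sql
-- As written, the filter is evaluated as:
--   Src_EntryDate < we2.EffectiveBeginDTM
--   OR (Src_EntryDate > we2.EffectiveEndDTM AND a.EmployeeID IS NULL)

-- If the intent is "outside the effective range AND still unmatched",
-- explicit parentheses are needed:
WHERE (Src_EntryDate < we2.EffectiveBeginDTM
       OR Src_EntryDate > we2.EffectiveEndDTM)
  AND a.EmployeeID IS NULL
```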


Any help in this matter will be highly appreciated.


Thank you in advance.




social.technet.microsoft.com/Forums


Sunday, June 9, 2013

[SQL Server Data Warehousing] DW Newbie Question - Updating fact table when have type 2 dimension


DW Newbie Question - Updating fact table when have type 2 dimension



I'd suggest that instead of updating the fact table and throwing away the natural partitioning you get with the historical surrogate key, you add the dimension's durable key (its non-changing natural key) to the fact table alongside the existing surrogate key.  With this design you can slice the facts both ways: by the dimension's historical values (join to the dimension on the surrogate key) and by its current values (join to a current view of the dimension on the durable key).


There's a good Kimball Design Tip on this topic here:


http://www.kimballgroup.com/2012/07/10/design-tip-147-durable-super-natural-keys/


Let me know if that helps.
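A minimal sketch of the two-key fact design described above (all names hypothetical):

```sql
-- Fact carries both the historical surrogate key and the durable key.
CREATE TABLE dbo.FactSales
(
    CustomerSK  int   NOT NULL,  -- type 2 surrogate key (historical)
    CustomerDK  int   NOT NULL,  -- durable key (never changes)
    SalesAmount money NOT NULL
);

-- Slice by historical attributes (as of the fact's date):
SELECT d.Region, SUM(f.SalesAmount) AS Sales
FROM dbo.FactSales f
JOIN dbo.DimCustomer d ON d.CustomerSK = f.CustomerSK
GROUP BY d.Region;

-- Slice by current attributes (join a "current rows" view on the DK):
SELECT d.Region, SUM(f.SalesAmount) AS Sales
FROM dbo.FactSales f
JOIN dbo.DimCustomerCurrent d ON d.CustomerDK = f.CustomerDK
GROUP BY d.Region;
```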



Brent Greenwood, MS, MCITP, CBIP // Please mark correct answers and helpful posts // http://brentgreenwood.blogspot.com




social.technet.microsoft.com/Forums



