Thursday, April 4, 2013

[SQL Server 2008 issues] How to change the report server log directory



How to change the report server log directory

Posted: 02 Apr 2013 10:10 PM PDT

Hi, I made the change in the config file found at C:\Program Files\Microsoft SQL Server\MSRS10_50.XXX\Reporting Services\ReportServer\bin. By default, the logs are created at C:\Program Files\Microsoft SQL Server\MSRS10_50.EDDEV_DW\Reporting Services\Logfiles.

1. Go to the RStrace section and add the custom entry <add name="Directory" value="F:\Logs\" />.
2. Provide the service account with full permission to the folder F:\Logs.
3. Restart the Reporting Services service.

But there is no change; the logs are still in the default location. Please let me know what the issue is.
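For reference, the trace settings live in ReportingServicesService.exe.config in that bin folder. A sketch of what the RStrace section might look like with the custom entry added (the Directory key is undocumented, and the default entries shown here vary by version), noting that the entry must be well-formed, self-closing XML:

[code="xml"]
<RStrace>
  <add name="FileName" value="ReportServerService_" />
  <add name="FileSizeLimitMb" value="32" />
  <add name="KeepFilesForDays" value="14" />
  <add name="TraceListeners" value="debugwindow, file" />
  <add name="Components" value="all:3" />
  <add name="Directory" value="F:\Logs\" />
</RStrace>
[/code]

If the entry in your file is missing its closing " />", the XML is invalid and Reporting Services may ignore the setting and fall back to the default LogFiles directory.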

SQL migration

Posted: 05 Mar 2013 11:01 PM PST

What's the best way to copy logins and users, including SQL logins with their passwords and their exact enabled/disabled status, from a source SQL Server 2008 instance to a destination 2008 server?
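One common approach (a sketch, not verified against your environment) is Microsoft's sp_help_revlogin script (KB 918992), which scripts logins with their password hashes and SIDs intact. The same can be done by hand from sys.sql_logins; the login name 'app_login' below is an assumption:

[code="sql"]
-- Sketch: generate a CREATE LOGIN statement for one SQL login, preserving the
-- password hash, the SID (so database users map without becoming orphaned),
-- and the disabled status.
SELECT 'CREATE LOGIN ' + QUOTENAME(name)
     + ' WITH PASSWORD = '
     + CONVERT(varchar(max), CAST(LOGINPROPERTY(name, 'PasswordHash') AS varbinary(256)), 1)
     + ' HASHED, SID = ' + CONVERT(varchar(max), sid, 1) + ';'
     + CASE WHEN is_disabled = 1
            THEN ' ALTER LOGIN ' + QUOTENAME(name) + ' DISABLE;'
            ELSE '' END
FROM sys.sql_logins
WHERE name = 'app_login';
[/code]

Run the generated script on the destination server; Windows logins need only a plain CREATE LOGIN ... FROM WINDOWS, since their SIDs come from the domain.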

Including null complex type elements in XML output

Posted: 02 Apr 2013 10:02 PM PDT

Given two entities, Mortgage and Customer, each in a table, and a third table linking customers to mortgages, I need to output the data relating to each Mortgage as XML.

[code="sql"]
create table Mortgage ( AccountNo int )
create table Customer ( CustomerNo int )
create table MortCustLink ( AccountNo int, CustomerNo int )

insert into Mortgage values ( 123456 )
insert into Mortgage values ( 234567 )
insert into Mortgage values ( 345678 )
insert into Customer values ( 123 )
insert into Customer values ( 124 )
insert into Customer values ( 125 )
insert into MortCustLink values ( 123456, 123 )
insert into MortCustLink values ( 234567, 124 )
insert into MortCustLink values ( 234567, 125 )
[/code]

A mortgage may have 0, 1 or more customers associated with it. The XML output I am trying to produce is

[code="xml"]
<Mortgages>
  <Mortgage>
    <AccountNo>123456</AccountNo>
    <Customers>
      <Customer>
        <CustomerNo>123</CustomerNo>
      </Customer>
    </Customers>
  </Mortgage>
  <Mortgage>
    <AccountNo>234567</AccountNo>
    <Customers>
      <Customer>
        <CustomerNo>124</CustomerNo>
      </Customer>
      <Customer>
        <CustomerNo>125</CustomerNo>
      </Customer>
    </Customers>
  </Mortgage>
  <Mortgage>
    <AccountNo>345678</AccountNo>
    <Customers>
    </Customers>
  </Mortgage>
</Mortgages>
[/code]

where the 'Customers' element appears even when there are no customers linked to a mortgage. This is what I haven't managed to do, and I'm running short on new ideas. The code I am using is

[code="sql"]
select Mortgage.AccountNo
      ,( select Customer.CustomerNo
         from MortCustLink
         join Customer on MortCustLink.CustomerNo = Customer.CustomerNo
         where MortCustLink.AccountNo = Mortgage.AccountNo
         for xml auto, root('Customers'), elements, type )
from Mortgage
for xml auto, root('Mortgages'), elements, type
[/code]

and the XML produced is

[code="xml"]
<Mortgages>
  <Mortgage>
    <AccountNo>123456</AccountNo>
    <Customers>
      <Customer>
        <CustomerNo>123</CustomerNo>
      </Customer>
    </Customers>
  </Mortgage>
  <Mortgage>
    <AccountNo>234567</AccountNo>
    <Customers>
      <Customer>
        <CustomerNo>124</CustomerNo>
      </Customer>
      <Customer>
        <CustomerNo>125</CustomerNo>
      </Customer>
    </Customers>
  </Mortgage>
  <Mortgage>
    <AccountNo>345678</AccountNo>
  </Mortgage>
</Mortgages>
[/code]

where the last Mortgage (AccountNo = 345678) has no Customers element. Your suggestions are much appreciated; thanks for your help.
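One possible way to force the empty element (a sketch, untested here): wrap the customer list in an extra FOR XML PATH('Customers') subquery. The outer subquery always returns one row, so a <Customers /> element should be emitted even when no linked customers exist:

[code="sql"]
select Mortgage.AccountNo
      ,( select ( select Customer.CustomerNo
                  from MortCustLink
                  join Customer on MortCustLink.CustomerNo = Customer.CustomerNo
                  where MortCustLink.AccountNo = Mortgage.AccountNo
                  for xml path('Customer'), type )
         for xml path('Customers'), type )
from Mortgage
for xml path('Mortgage'), root('Mortgages'), type
[/code]

An empty <Customers /> and an open/close <Customers></Customers> pair are equivalent XML, so this should satisfy most consumers of the feed.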

Getting a specific column name

Posted: 03 Apr 2013 08:20 AM PDT

I am trying to get the name of a column in a table from a specific value. For example, table1:

[code="plain"]
row    C1  C2  C3
row1    5   6   4
row2    4   2   3
[/code]

First I find the max number from a specific row (which would be 6 in row1), but I need the column name, not the column value. So in the end it would return C2.

Query to find the max number from a row (only returns the column value):

[code="sql"]
select ( select Max(maxNum)
         from (VALUES (C1), (C2), (C3)) As value(maxNum) ) as MaxNumber
from table1
where row = 'row1'
[/code]

Is this going in the right direction, or is there an easier method that could be done in fewer steps? Thanks
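A sketch of one way to get the column name as well (table and column names taken from the post): unpivot the three columns with CROSS APPLY and take the top value per row.

[code="sql"]
-- Returns the name and value of the largest column per row.
-- Note: if two columns tie for the max, which name wins is arbitrary
-- unless you add a tiebreaker to the ORDER BY.
select t.row, x.ColName as MaxColumn, x.Val as MaxNumber
from table1 t
cross apply ( select top 1 v.ColName, v.Val
              from (VALUES ('C1', t.C1), ('C2', t.C2), ('C3', t.C3)) v(ColName, Val)
              order by v.Val desc ) x
where t.row = 'row1';
[/code]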

Issue: Index creation is not running in parallel

Posted: 15 May 2012 09:16 PM PDT

I have a customer with a SQL Server 2008 R2 Enterprise Edition clustered instance. The instance is configured with max degree of parallelism = 1 and cost threshold for parallelism = 5. An index has to be recreated to include two more columns, and I issue a statement like:

[code]
CREATE NONCLUSTERED INDEX [index] ON [table] ( [column] ASC )
INCLUDE ( [included_column_1], [included_column_2] )
WHERE ( [included_column_1] IS NULL )
WITH ( PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF,
       DROP_EXISTING = ON, ONLINE = ON, ALLOW_ROW_LOCKS = ON,
       ALLOW_PAGE_LOCKS = ON, MAXDOP = 4 )
ON [PRIMARY]
[/code]

The table has a few billion records. For some reason, even though I specify MAXDOP, the index creation is not running in parallel. Does anyone have an idea why?

MS SQL + MS Visual Studio VS Wherescape RED

Posted: 24 Dec 2010 07:53 PM PST

Have you ever heard of, or used, WhereScape RED for data warehouse development? The fact is it is very popular in New Zealand, and has arguably taken over the whole NZ BI industry. However, after being forced to use this product for MS SQL DW and database projects for months, I found it can't compete with MS SQL + MS Visual Studio themselves. As an ETL tool, SSIS is far more flexible and scalable. As for version control, SourceSafe or Team Foundation is far more sophisticated. In terms of documentation, BI Documenter is certainly much more advanced. I also found that WhereScape is not able to keep up with new SQL Server functionality. The auto-generated SQL script is good from a standardization point of view, but it largely diminishes the developer's SQL skills. Indeed, I am totally lost as to why we need it in a Microsoft SQL environment. Could anybody give me your thoughts to clear my doubts?

Implementing CDC on All tables of the database

Posted: 03 Apr 2013 06:56 AM PDT

Hi, could anyone help me out with a script that enables CDC on all the tables of a database? Thanks
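A sketch of one way to do this (the database name is an assumption; requires sysadmin to enable CDC at the database level): enable CDC on the database, then loop over every user table not already tracked.

[code="sql"]
USE [YourDatabase];   -- assumed name
EXEC sys.sp_cdc_enable_db;

DECLARE @schema sysname, @table sysname;

DECLARE tables_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT s.name, t.name
    FROM sys.tables t
    JOIN sys.schemas s ON s.schema_id = t.schema_id
    WHERE t.is_tracked_by_cdc = 0
      AND t.is_ms_shipped = 0;

OPEN tables_cur;
FETCH NEXT FROM tables_cur INTO @schema, @table;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- @role_name = NULL means no gating role controls access to the change data.
    EXEC sys.sp_cdc_enable_table
         @source_schema = @schema,
         @source_name   = @table,
         @role_name     = NULL;
    FETCH NEXT FROM tables_cur INTO @schema, @table;
END
CLOSE tables_cur;
DEALLOCATE tables_cur;
[/code]

Note that SQL Server Agent must be running for the capture and cleanup jobs CDC creates.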

From Websphere on AIX to MSSQL?

Posted: 03 Apr 2013 07:25 AM PDT

Hello, I have a new app that will run on WebSphere Process Server on AIX and connect to a SQL 2008 R2 DB using the MS JDBC driver. I am getting conflicting info on whether that setup can use Windows authentication, which I would prefer. Anyone out there have experience with that? Is it a requirement that the AIX server be a member of the AD domain? Thanks

Join on 2 tables where values are different

Posted: 02 Apr 2013 10:18 PM PDT

Hi everyone, I have 2 tables in SQL that I would like to join together; however, the fields I wish to join on hold slightly different information. Table1 has the field Region and Table2 has the field Region, but Table2's Region field contains a * on either side of the string. So a value that appears in Table1 as NORTH appears in Table2 as *NORTH*. Is there a way I can join the 2 tables when they differ in this way? Thanks in advance. BO
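A sketch of two equivalent joins (table and column names taken from the post): either strip the asterisks from Table2's value, or build the starred form from Table1's.

[code="sql"]
-- Option 1: strip the asterisks on the Table2 side.
select *
from Table1 t1
join Table2 t2 on replace(t2.Region, '*', '') = t1.Region;

-- Option 2: add the asterisks on the Table1 side.
select *
from Table1 t1
join Table2 t2 on t2.Region = '*' + t1.Region + '*';
[/code]

Be aware that wrapping a column in a function (option 1) prevents an index seek on that column; option 2 keeps t2.Region sargable if it is indexed. For a permanent fix, consider cleaning the data or adding a computed, indexed column holding the stripped value.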

Slow Cascade Stored Procedure & Hang

Posted: 14 Mar 2013 06:11 AM PDT

Table image: http://tinypic.com/r/1075g6v/6

So I have this query which works id by id and calculates the cost accordingly, but it is very slow, and I would like to understand how I could convert it into a set-based operation. Depending on our condition we calculate our modeled costs differently. When a user updates a driver we can run an update on the entire driver column based on where it has changed. But when it comes to calculating the modeled cost, we do it row by row, as the fixed cost differs, and then divide across the months. I have pasted the code below. Is there still a way to do this as a set-based operation?

First we update the drivers in the same table depending on what value has changed, using an update, and then update the modeled cost row by row (which is really slow).

Code:

[code]
SELECT @rowCounter = 1, @totalrows = @@ROWCOUNT

WHILE @rowCounter <= @totalrows
BEGIN
    SELECT @currentId = tempId FROM @temp WHERE row = @rowCounter

    SELECT @newModeledCost =
               case when not exists (select 1 from dbo.DIMSTD_SCENARIO
                                     where SCENARIO0_Name = SCENARIO
                                       and SCENARIO2_Name = 'Model')
                    then ISNULL(DriverValue1,0)*ISNULL(DriverValue2,0)*ISNULL(UnitA,0)*ISNULL(UnitB,0)+ISNULL(FixedCost,0) -- normal allocation for all scenarios
                    else (ISNULL(unita,0) * (ISNULL(DriverValue1,0)/ISNULL(NULLIF(DriverValue2,0),1)) * ISNULL(UnitB,0))+ISNULL(FixedCost,0)
                    --(ISNULL(unita,0) * (ISNULL(DriverValue1,0)/ISNULL(DriverValue2,0))*ISNULL(UnitB,0))+ISNULL(FixedCost,0) -- allocation for model scenarios
               end
         , @oldModeledCost = ISNULL(ModeledCost,0)
         , @newOct = (ISNULL(@newModeledCost,0) * (ISNULL(Oct, 0) / ISNULL(NULLIF(@oldModeledCost,0),1)))
         , @newNov = (ISNULL(@newModeledCost,0) * (ISNULL(Nov, 0) / ISNULL(NULLIF(@oldModeledCost,0),1)))
         , @newDec = (ISNULL(@newModeledCost,0) * (ISNULL(Dec, 0) / ISNULL(NULLIF(@oldModeledCost,0),1)))
         , @newJan = (ISNULL(@newModeledCost,0) * (ISNULL(Jan, 0) / ISNULL(NULLIF(@oldModeledCost,0),1)))
         , @newFeb = (ISNULL(@newModeledCost,0) * (ISNULL(Feb, 0) / ISNULL(NULLIF(@oldModeledCost,0),1)))
         , @newMar = (ISNULL(@newModeledCost,0) * (ISNULL(Mar, 0) / ISNULL(NULLIF(@oldModeledCost,0),1)))
         , @newApr = (ISNULL(@newModeledCost,0) * (ISNULL(Apr, 0) / ISNULL(NULLIF(@oldModeledCost,0),1)))
         , @newMay = (ISNULL(@newModeledCost,0) * (ISNULL(May, 0) / ISNULL(NULLIF(@oldModeledCost,0),1)))
         , @newJun = (ISNULL(@newModeledCost,0) * (ISNULL(Jun, 0) / ISNULL(NULLIF(@oldModeledCost,0),1)))
         , @newJul = (ISNULL(@newModeledCost,0) * (ISNULL(Jul, 0) / ISNULL(NULLIF(@oldModeledCost,0),1)))
         , @newAug = (ISNULL(@newModeledCost,0) * (ISNULL(Aug, 0) / ISNULL(NULLIF(@oldModeledCost,0),1)))
         , @newSep = (ISNULL(@newModeledCost,0) * (ISNULL(Sep, 0) / ISNULL(NULLIF(@oldModeledCost,0),1)))
    FROM dbo.TBF_BUDGETExpenses
    WHERE BudgetId = @currentId
    --and not exists (select 1 from dbo.DIMSTD_SCENARIO where SCENARIO0_Name = SCENARIO and SCENARIO2_Name = 'Model')

    UPDATE dbo.TBF_BUDGETExpenses
    SET ModeledCost = @newModeledCost,
        Oct = @newOct, Nov = @newNov, Dec = @newDec, Jan = @newJan,
        Feb = @newFeb, Mar = @newMar, Apr = @newApr, May = @newMay,
        Jun = @newJun, Jul = @newJul, Aug = @newAug, Sep = @newSep,
        Username = 'Cascade',
        lastmodified = getdate()
    WHERE BudgetId = @currentId AND @oldModeledCost <> 0

    Print 'Record Update ' + CAST(@currentId AS VARCHAR(15))

    SET @rowCounter = @rowCounter + 1
END
[/code]
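A sketch of a set-based rewrite (untested; it assumes @temp's key column tempId matches BudgetId and that SCENARIO is a column of TBF_BUDGETExpenses, as the loop implies): compute the new cost once per row with CROSS APPLY, then scale every month in a single UPDATE instead of looping.

[code="sql"]
UPDATE e
SET ModeledCost = x.NewCost,
    Oct = x.NewCost * (ISNULL(e.Oct,0) / x.OldCostSafe),
    Nov = x.NewCost * (ISNULL(e.Nov,0) / x.OldCostSafe),
    Dec = x.NewCost * (ISNULL(e.Dec,0) / x.OldCostSafe),
    Jan = x.NewCost * (ISNULL(e.Jan,0) / x.OldCostSafe),
    Feb = x.NewCost * (ISNULL(e.Feb,0) / x.OldCostSafe),
    Mar = x.NewCost * (ISNULL(e.Mar,0) / x.OldCostSafe),
    Apr = x.NewCost * (ISNULL(e.Apr,0) / x.OldCostSafe),
    May = x.NewCost * (ISNULL(e.May,0) / x.OldCostSafe),
    Jun = x.NewCost * (ISNULL(e.Jun,0) / x.OldCostSafe),
    Jul = x.NewCost * (ISNULL(e.Jul,0) / x.OldCostSafe),
    Aug = x.NewCost * (ISNULL(e.Aug,0) / x.OldCostSafe),
    Sep = x.NewCost * (ISNULL(e.Sep,0) / x.OldCostSafe),
    Username = 'Cascade',
    lastmodified = getdate()
FROM dbo.TBF_BUDGETExpenses e
JOIN @temp t ON t.tempId = e.BudgetId
CROSS APPLY ( SELECT
        -- same branching logic as the loop, computed once per row
        NewCost = case when not exists (select 1 from dbo.DIMSTD_SCENARIO
                                        where SCENARIO0_Name = e.SCENARIO
                                          and SCENARIO2_Name = 'Model')
                       then ISNULL(e.DriverValue1,0)*ISNULL(e.DriverValue2,0)*ISNULL(e.UnitA,0)*ISNULL(e.UnitB,0)+ISNULL(e.FixedCost,0)
                       else (ISNULL(e.UnitA,0) * (ISNULL(e.DriverValue1,0)/ISNULL(NULLIF(e.DriverValue2,0),1)) * ISNULL(e.UnitB,0))+ISNULL(e.FixedCost,0)
                  end,
        -- old cost with divide-by-zero protection, matching ISNULL(NULLIF(...),1)
        OldCostSafe = ISNULL(NULLIF(ISNULL(e.ModeledCost,0),0),1) ) x
WHERE ISNULL(e.ModeledCost,0) <> 0;   -- mirrors the loop's @oldModeledCost <> 0 guard
[/code]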

Initial (MB) vs Autogrowth

Posted: 03 Apr 2013 12:09 AM PDT

I have a database that was set with an initial size of close to 10 GB. I am down to 2% free space left, with autogrowth set to 50 MB and unrestricted. The database grows approximately 50 MB per day. I know that a database can become fragmented the more autogrowth occurs, so I'm wondering if it's better to just change the Initial (MB) size to something larger (say 11 GB). I'm really just trying to understand the difference between changing the Initial (MB) setting for the database file size vs allowing autogrowth to happen. Does changing Initial (MB) move the entire database file to contiguous physical space if it has already been fragmented by autogrowth? Does autogrowth add contiguous space if it's available? How can I see if my database file is physically fragmented? Thank you in advance.

SQL Server Agent account for job

Posted: 03 Apr 2013 03:34 AM PDT

Hello - I have a developer who wants to set up a job to run an ActiveX script. They can't see anything under "Run As". According to http://msdn.microsoft.com/en-us/library/ms187668.aspx: "Set the proxy account for the job step. Members of the sysadmin fixed server role may also specify the SQL Agent Service Account." Adding a proxy means I need to go through the Windows security team, which may not get approved. Is my only choice to make this login sysadmin? I don't want to do that either, for security reasons. Am I missing a possibility? Does the account need sysadmin after the job is created? Thanks, Dave

*MAY* need to switch DBs from Full Recovery to Simple, due to backup space issues...

Posted: 03 Apr 2013 02:52 AM PDT

First and foremost, I'm working with our backup admin to find a better solution for the problem, one that won't require doing this. Second, adding capacity to the backup system would be nice, but by the time the request made it through the bureaucracy here, we'd be migrating to SQL3010... :hehe:

So the scoop is, the backup system here ran out of room for the SharePoint SQL DBs, and now myself and the backup admin are trying to get it working again. She's limited in what she can do by the retention policies; I'm limited by the fact that it's SharePoint, so no going in and dropping rows from big tables... (The SharePoint admin is on vacation, to boot...)

I think what set the problem off was the SP admin, after realizing that the previous admin had enabled auditing of EVERYTHING and had not set up the "trim" option to keep only a month or so of entries, trying to delete out the old records. With the DBs in Full recovery, the TLogs got rather full. Then the backup ran out of space because of this (I think the retention is 30 days, then it goes to tape).

So my worst-case solution to reduce the size of data in the TLogs is to do the following:
1. After hours, switch all DBs from Full recovery to Simple, sacrificing point-in-time recovery
2. After SQL has "truncated" the logs, switch back to Full recovery
3. Let the early-AM full DB backup run, and get back on our bi-hourly TLog backups

Yes, we lose PiT recovery, but right now we have NO recovery since yesterday... Yes, I should have caught this sooner.

So, a question: about how long might it be before SQL auto-truncates the logs? Can I "force" it to happen sooner by issuing a CHECKPOINT on the DBs, and could it take more than one checkpoint? I'm not currently planning to shrink the physical files, so that SQL doesn't need to "grow" the files later.

Thanks,
Jason

(As an example, one of the log files is showing 18655.19MB used in the log... I had been monitoring the sizes of the DBs and TLogs for a couple weeks, looking to free up some disk, and this DB averaged 402MB...)
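A sketch of the worst-case plan described above (the database name and backup path are assumptions): in Simple recovery the log is truncated at checkpoint, and a checkpoint can be forced immediately rather than waiting for an automatic one.

[code="sql"]
ALTER DATABASE [WSS_Content] SET RECOVERY SIMPLE;
USE [WSS_Content];
CHECKPOINT;          -- forces log truncation while in SIMPLE recovery
ALTER DATABASE [WSS_Content] SET RECOVERY FULL;

-- A full (or differential) backup is required afterwards to restart the
-- log backup chain; log backups will fail until one is taken.
BACKUP DATABASE [WSS_Content]
    TO DISK = N'X:\Backups\WSS_Content.bak';   -- path assumed
[/code]

Truncation marks the log space as reusable; it does not shrink the physical file, which matches the stated plan of leaving the file sizes alone.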

Output Parameter in Stored PRocedure

Posted: 02 Apr 2013 08:40 PM PDT

Hi all, I have created an SP in which I want to extract the last identity value after doing the insertion, but it's giving me an error that the @new_identity parameter was not supplied.

[code]
Create Procedure [dbo].[BS_StoreAllocation_AddSTNDetails]
    @StoreCode    int = 1,
    @CourierName  varchar(30) = 'BLUE Dart',
    @CourierNo    int = 98765732,
    @new_identity int OUTPUT
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @STN varchar(20)

    INSERT INTO GV_STNDetails
    VALUES ( REPLACE(STR(@StoreCode,4),' ','0'),
             @CourierName,
             @CourierNo,
             GETDATE(),
             CONVERT(VARCHAR,LEFT(@CourierName,3)) + '-' + CONVERT(VARCHAR,@CourierNo)
                 + CONVERT(VARCHAR(10),GETDATE(),112) )

    SELECT @new_identity = SCOPE_IDENTITY()
    RETURN
END
[/code]
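The "parameter not supplied" error usually comes from the caller, not the procedure: an OUTPUT parameter with no default must be passed explicitly, and the OUTPUT keyword is required on the caller's side as well. A sketch of the call:

[code="sql"]
DECLARE @id int;

EXEC dbo.BS_StoreAllocation_AddSTNDetails
     @StoreCode    = 1,
     @CourierName  = 'BLUE Dart',
     @CourierNo    = 98765732,
     @new_identity = @id OUTPUT;   -- OUTPUT must be repeated here

SELECT @id AS NewIdentity;
[/code]

Alternatively, declare the parameter as @new_identity int = NULL OUTPUT so that callers may omit it.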

Linked server

Posted: 03 Apr 2013 03:35 AM PDT

I created a linked server on ServerA pointing to ServerB, but another user created a linked server on ServerB pointing back to ServerA. Does this cause a performance issue?

Database Restore Errors

Posted: 03 Apr 2013 03:53 AM PDT

Hello, I am running the restore script below and getting errors. Please advise what I need to do to correct it.

[code]
RESTORE DATABASE [TestDB1]
FROM DISK = N'S:\TestDB1\TestDB1_Backup_20120625.bak'
WITH FILE = 1,
     MOVE N'NG_System_Data' TO N'E:\TestDB1\TestDB1.mdf',
     MOVE N'NG_Core_Data_1' TO N'E:\TestDB1\TestDB1.mdf',
     MOVE N'NG_Index_1' TO N'E:\TestDB1\TestDB1.mdf',
     MOVE N'NG_Log' TO N'E:\TestDB1\TestDB1_1.ldf',
     NOUNLOAD, REPLACE, STATS = 10
GO
[/code]

I get the following errors:

[code]
Msg 3176, Level 16, State 1, Line 1
File 'E:\TestDB1\TestDB1.mdf' is claimed by 'NG_Core_Data_1'(4) and 'NG_System_Data'(1). The WITH MOVE clause can be used to relocate one or more files.
Msg 3176, Level 16, State 1, Line 1
File 'E:\TestDB1\TestDB1.mdf' is claimed by 'NG_Index_1'(3) and 'NG_System_Data'(1). The WITH MOVE clause can be used to relocate one or more files.
Msg 3013, Level 16, State 1, Line 1
RESTORE DATABASE is terminating abnormally.
[/code]
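As the Msg 3176 errors indicate, each logical file must be MOVEd to a distinct physical file name; three logical files cannot share one .mdf. A sketch of a corrected script (the target file names here are assumptions):

[code="sql"]
RESTORE DATABASE [TestDB1]
FROM DISK = N'S:\TestDB1\TestDB1_Backup_20120625.bak'
WITH FILE = 1,
     MOVE N'NG_System_Data' TO N'E:\TestDB1\TestDB1_System.mdf',
     MOVE N'NG_Core_Data_1' TO N'E:\TestDB1\TestDB1_Core_1.ndf',
     MOVE N'NG_Index_1'     TO N'E:\TestDB1\TestDB1_Index_1.ndf',
     MOVE N'NG_Log'         TO N'E:\TestDB1\TestDB1_1.ldf',
     NOUNLOAD, REPLACE, STATS = 10;
[/code]

RESTORE FILELISTONLY FROM DISK = N'...' lists the logical names and original paths inside the backup, which helps when writing the MOVE clauses.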

T-SQL help

Posted: 03 Apr 2013 03:28 AM PDT

Hello, I need one piece of help. From a list of dates I need to display all dates with SchoolIDs. The list of dates is

[code="sql"]
create table #temp
( date datetime, IsWeekDay int, SchoolDay int )

insert into #temp values('2011-08-06 00:00:00.000',0,0)
insert into #temp values('2011-08-07 00:00:00.000',0,0)
insert into #temp values('2011-08-08 00:00:00.000',0,0)
insert into #temp values('2011-08-09 00:00:00.000',0,0)
insert into #temp values('2011-08-10 00:00:00.000',0,0)
insert into #temp values('2011-08-11 00:00:00.000',0,0)
insert into #temp values('2011-08-12 00:00:00.000',0,0)
insert into #temp values('2011-08-13 00:00:00.000',0,0)
insert into #temp values('2011-08-14 00:00:00.000',0,0)
insert into #temp values('2011-08-15 00:00:00.000',0,0)
insert into #temp values('2011-08-16 00:00:00.000',0,0)
insert into #temp values('2011-08-17 00:00:00.000',0,0)
insert into #temp values('2011-08-18 00:00:00.000',0,0)
insert into #temp values('2011-08-19 00:00:00.000',0,0)
insert into #temp values('2011-08-20 00:00:00.000',0,0)
insert into #temp values('2011-08-21 00:00:00.000',0,0)

create table #temp2 ( SchoolID int )
insert into #temp2 values (101)
insert into #temp2 values (102)
[/code]

The desired output is as below:

[code="plain"]
date                     SchoolID
2011-08-06 00:00:00.000  101
2011-08-07 00:00:00.000  101
2011-08-08 00:00:00.000  101
2011-08-09 00:00:00.000  101
2011-08-10 00:00:00.000  101
2011-08-11 00:00:00.000  101
2011-08-12 00:00:00.000  101
2011-08-13 00:00:00.000  101
2011-08-14 00:00:00.000  101
2011-08-15 00:00:00.000  101
2011-08-16 00:00:00.000  101
2011-08-17 00:00:00.000  101
2011-08-18 00:00:00.000  101
2011-08-19 00:00:00.000  101
2011-08-20 00:00:00.000  101
2011-08-21 00:00:00.000  101
2011-08-06 00:00:00.000  102
2011-08-07 00:00:00.000  102
2011-08-08 00:00:00.000  102
2011-08-09 00:00:00.000  102
2011-08-10 00:00:00.000  102
2011-08-11 00:00:00.000  102
2011-08-12 00:00:00.000  102
2011-08-13 00:00:00.000  102
2011-08-14 00:00:00.000  102
2011-08-15 00:00:00.000  102
2011-08-16 00:00:00.000  102
2011-08-17 00:00:00.000  102
2011-08-18 00:00:00.000  102
2011-08-19 00:00:00.000  102
2011-08-20 00:00:00.000  102
2011-08-21 00:00:00.000  102
[/code]

Please help me to do this.
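The desired output is simply every date paired with every SchoolID, i.e. the Cartesian product of the two tables; a sketch using the sample tables from the post:

[code="sql"]
select t.date, s.SchoolID
from #temp t
cross join #temp2 s
order by s.SchoolID, t.date;
[/code]

With 16 dates and 2 schools this returns the 32 rows shown in the desired output.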

Transaction Log Backup Always Generates Job Failed Message

Posted: 03 Apr 2013 01:43 AM PDT

Hi everyone, I am having problems getting my daily transaction log backups to work in a maintenance plan. The plan is set up to create a .trn transaction log backup and notify the operator if the job errors and fails to run. For some reason, the job executes with no errors, but it still sends me a "job failed" email, and it does create the .trn backup files. This defeats the purpose of notification, since I only want to be notified if the job really did fail. Since there is no real error, I can't find anything in the log history to tell me why I got the failure message. Has anyone ever encountered this problem?

ANSI settings

Posted: 03 Apr 2013 01:25 AM PDT

Hi all, there is an application that uses a DSN; the connection type is ODBC. In the DSN the ANSI settings are set to ON, while at the database level they are all OFF ([url]http://msdn.microsoft.com/en-us/library/ms175088(v=sql.105).aspx[/url]). I'm aware that changing them can cause trouble for the application. The application is a "black box" and does not use stored procedures, i.e. it issues only in-code queries. Do these differences in the settings cause less reuse of the query plans? BR, IgorMi

Replication - missing records and large Msrepl_commands table

Posted: 02 Apr 2013 10:50 PM PDT

Hi, we have run into a problem with missing data in our subscriber when using replication, even though replication manager says it's up to date. I noticed we did have a message in our subscriber saying "Query timeout exceeded", and some searching led me to look at the MSrepl_commands table in the distributor, which has around 30 million records in it. Since typing this the data has replicated, but has anyone seen this before? Also, is this number of records in MSrepl_commands normal? Thanks

Restore Change Data Capture database without KEEP_CDC option

Posted: 16 May 2012 04:19 AM PDT

Hello, suppose you have a database that you need to restore on another server. This database has SQL Server CDC enabled, but you don't need the change data on the restored copy. Running a restore command without the KEEP_CDC flag yields the result you seek. However, what we've noticed is that, when the restore is almost complete, we see a series of SELECT * from dbo_Table1_CT into dbo_Table1_CT commands in the DMVs for running queries. It is as if SQL Server is reindexing or performing some other operation on the capture-instance tables, even though it will ultimately drop those tables. Does anyone have any experience with this to help us understand what the restore process is doing with the _CT tables? It takes quite some time, and in the end the tables aren't even there. One option we haven't explored yet is a piecemeal restore that skips the filegroup containing our CDC data, but we thought the simple restore statement without KEEP_CDC would do that for us, without taking what can be a long time to manipulate, and then drop, the _CT tables. Thanks!

Snapshot, Replication...something to get OpenEdge into SQL 2008 R2

Posted: 03 Apr 2013 12:48 AM PDT

I would love to set up transactional replication between OpenEdge 11 and SQL 2008 R2... if possible. I have OpenEdge set up as a linked server and can access the tables via the Progress ODBC connection. I tried to create an SSIS package to at least get a nightly snapshot, but there is a problem with it seeing datatype 23... I then tried to add a convert piece to the SSIS package... still failed. I then tried to add a category listing so that 23 = date in the mapping XML files... still failed. There are several tables, and I find it strange that for a single table I can run Select * Into <sqltable> from <linkedserver...table>... and this works AND pulls the date field with no error like the one above. I obviously do not want to do this for each table. Goal 1: get a snapshot of the linked-server tables into SQL nightly. Goal 2: try to set up transactional replication. Any help is greatly appreciated.

Query Performance

Posted: 03 Apr 2013 12:03 AM PDT

Hi, I have written a script that handles several scenarios. First I validate whether a voucher is already in use; if it is not, I insert into a table, and after inserting I have to change the status of those vouchers in another table. Is this the correct way to write the script, or will it have performance issues later? Please suggest.

[code]
DECLARE @storecode int, @STNNo varchar(20), @vochertype int, @Isbooklet int,
        @denomination int, @quantity int,
        @FirstVoucherID varchar(10), @lastvoucherID varchar(10)

IF EXISTS ( SELECT * FROM GV_Voucher gv
            JOIN GV_VoucherStatus gvs ON gv.VoucherStatusId = gvs.VoucherStatusId
            WHERE gv.VoucherNo = @FirstVoucherID AND gv.VoucherNo = @lastvoucherID
              AND gvs.VoucherStatus <> 'NEW' )
BEGIN
    PRINT 'Voucher No already in use'
END
ELSE
BEGIN
    INSERT INTO GV_StoreAllocation
        (StoreId, VoucherId, STNNo, GRNNo, VoucherStatusId, DispatchDate, ReceivedDate)
    VALUES (@storecode, @STNNo, @FirstVoucherID, @lastvoucherID, @quantity)
END

DECLARE @voucherlist TABLE (VoucherNo varchar(10))

INSERT INTO @voucherlist
SELECT gv.VoucherNo
FROM GV_Voucher gv
WHERE gv.VoucherNo BETWEEN @FirstVoucherID AND @lastvoucherID

UPDATE GV
SET GV.VoucherStatusId = gvs.VoucherStatusId
FROM GV_Voucher GV
JOIN @voucherlist vc ON vc.VoucherNo = GV.VoucherNo
JOIN GV_VoucherStatus gvs ON gvs.VoucherStatusId = GV.VoucherStatusId
WHERE gvs.VoucherStatus = 'Dispatched'
[/code]

Mapping Different Logins to Same User in Database

Posted: 02 Apr 2013 10:54 PM PDT

Hi, my question is more about simplifying things for my environment, so I'm hoping I can clearly propose the idea I have been pondering. When we refresh a dev or QA database from prod, a lot of prod users come along to these environments, and when we extract permissions before the refresh, the script also contains these unwanted users. Let's say I have 20 logins on the DEV instance. Rather than doing the conventional thing after a refresh, can we simply drop all the users from a database and map these 20 logins to a single database user with sufficient permissions? For example, if I have logins called domain\LoginA and domain\LoginB, can I have a single user called dev_db in a database and map these two logins to that single user, so that while extracting permissions only this one user is extracted, and after each refresh we clean up the unwanted users and recreate this single one only? Is this a possibility? Sorry if my question is not too clear; let me know and I will try to explain it another way. Regards, Chandan Jha

SSIS SQL Server and File Server Setup

Posted: 03 Apr 2013 12:30 AM PDT

Good morning. Wondering if I could get some opinions on a setup for an ETL environment with hardware that has been provided. The client has set up an ETL box and a DB box. He would like the ETL box to handle most of the processing: the packages are installed there, the SSIS packages run through SQL Server Agent from the ETL box, and the DB server is just the recipient of the data. All files would be processed off a share. Does anyone have any insight, websites, or suggestions on best practices and things to look out for regarding a two-server ETL setup?

Need Create View Script for all tables

Posted: 21 Aug 2012 06:37 PM PDT

I manage a SQL 2008 R2 production environment for an application. We have a new requirement to create views (simple: select * from tablename) for all existing tables (182 tables) in the database. We will then grant access to these views for Reporting Services users to create their reports, so as not to affect database performance for app users. [b]Can someone provide me a script which generates 'create view' scripts for all existing user tables in the database?[/b] Thanks in advance.
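A sketch of a generator query (the vw_ prefix is an assumption; review the output before executing it):

[code="sql"]
-- Emits one CREATE VIEW statement per user table.
SELECT 'CREATE VIEW ' + QUOTENAME(s.name) + '.' + QUOTENAME('vw_' + t.name)
     + ' AS SELECT * FROM ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + ';'
FROM sys.tables t
JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE t.is_ms_shipped = 0
ORDER BY s.name, t.name;
[/code]

One caveat: a view defined with SELECT * freezes its column list at creation time and does not pick up later table changes until sp_refreshview is run against it.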

Using AOG spanning FCI.

Posted: 03 Apr 2013 12:06 AM PDT

I have read that all disk sizes and drive letters need to be the same when using availability groups (AOG). Is this just for the drives which hold the MDF and LDF files for DBs in the group? There's not much on the net about this, but I remember someone stating it last year some time.

Migration detail for SQL SERVER 2000 TO SQL SERVER 2012

Posted: 02 Apr 2013 09:12 PM PDT

Hi, I need help migrating code from SQL Server 2000 to SQL Server 2012. What steps should be taken from the initial stage, and is there any basic configuration that needs to be set up before migration? Please advise or guide me through the step-by-step actions to the final stage. With regards, sankaripr

Script out Agent job to run DBCC CheckDB or Maintenance Plan Check DB Integrity Task?

Posted: 02 Apr 2013 10:40 PM PDT

Or is this a "six of one, half-a-dozen of the other" sort of question, which really boils down to personal preference? From reading the MSDN description of the "Check DB Integrity" task, it sounds like it does what DBCC CHECKDB does, although the reporting of problems might not be as full-featured as from a custom script. Any thoughts? Beyond the obvious "why aren't you already checking your DBs?" (Answer: I inherited the current situation, and am working on resolving stuff like this.) Thanks, Jason
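For the scripted route, the core of the Agent job step can stay very simple; a sketch (database name assumed, and the all-databases variant uses sp_MSforeachdb, which is undocumented but widely used):

[code="sql"]
-- Check one database, reporting every error with no informational noise:
DBCC CHECKDB (N'YourDatabase') WITH NO_INFOMSGS, ALL_ERRORMSGS;

-- Or all databases on the instance:
EXEC sp_MSforeachdb N'DBCC CHECKDB (N''?'') WITH NO_INFOMSGS, ALL_ERRORMSGS;';
[/code]

One advantage of the scripted job over the maintenance-plan task is that failures surface directly in the job step output and history, and the script can be extended with custom logging.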

TDE

Posted: 02 Apr 2013 09:18 PM PDT

Hi, can anyone help me with a script to enable TDE (Transparent Data Encryption) on all user databases? Please help. Thanks
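A sketch of the TDE steps for a single database (the certificate name, database name, and password are assumptions); the key-and-certificate setup is done once per instance, and the last two statements are then repeated for each user database:

[code="sql"]
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'StrongPassword!123';   -- once per instance
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';
-- Back up the certificate and private key immediately: without them,
-- TDE-encrypted databases cannot be restored on another server.

USE [YourDatabase];
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
ALTER DATABASE [YourDatabase] SET ENCRYPTION ON;
[/code]

To cover all user databases, the last two statements can be generated or looped over sys.databases (database_id > 4, is_encrypted = 0); note that enabling TDE on any database also encrypts tempdb.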
