Testing with Profiler Custom Events and Database Snapshots
Weekly SQL Server newsletter with articles, forum posts, and blog posts via email. Subscribers receive our white paper with performance tips for developers.
sqlteam.com
[MS SQL Server] Copy data between 2 networks
Posted: 15 Oct 2013 12:40 AM PDT Hi friends, I have around 100 tables to copy from one server to another. The problem is that these two servers are not on the same network. Is there any way I can achieve this? Maybe copy the data to a disc and then copy it to the other server. Could you please suggest a few ways of doing this? Thanks
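One hedged approach, assuming you can move files on physical media: generate `bcp` native-format export commands from the source server's catalog, run them, carry the files across, and import them on the destination. The staging path and server names below are illustrative, not part of the original question.

```sql
-- Generate one "bcp out" command per user table; run the resulting
-- lines in cmd.exe on the source server. D:\export\ is an assumed path.
SELECT 'bcp ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + '.' + QUOTENAME(t.name)
     + ' out D:\export\' + t.name + '.dat -n -T -S SourceServer'
FROM sys.tables AS t;

-- On the destination, after copying the .dat files over:
-- bcp [dbo].[TableName] in D:\export\TableName.dat -n -T -S DestServer
```

Native format (-n) avoids collation/format-file headaches when both sides are SQL Server; schema must already exist on the destination.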
Intangible updates at the publisher database Posted: 14 Oct 2013 11:32 PM PDT Good day, colleagues. I ran into a problem and need advice. SQL Server 2008 R2. I have merge replication with a publisher and N subscribers. Periodically an update conflict occurs in replication. That alone is not a problem; it's fine if both sides updated the record, but in my case the record was updated on only one side. Why am I so sure? First, every user stored procedure that updates a record logs the date and time of the change in special columns. Second, the application layer forbids users from updating records they did not create. Third, today I created triggers on the table at the publisher that write all DML operations to a log table. In the log table I see the creation of a new record: it was created at the subscriber and arrived at the publisher via replication. Fine. Then a user updates the record at the subscriber and I see an update conflict for it, but I do not see any update of that record in the publisher's log tables. The conflict viewer tells me that both the publisher's and the subscriber's data changed, yet I see no change time in the special columns, and all other values are identical. The question is: how can the publisher update records in the table without the triggers firing? Or which process, and why, generates updates to the system tables ("genhistory" or "contents") that lead to the conflict? Sorry for my poor English; I have not had enough practice lately.
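One possible explanation, offered as an assumption rather than a diagnosis: triggers created with the NOT FOR REPLICATION option are skipped when the merge agent itself applies changes, so agent-driven updates would leave no trace in the log table. The table and column names below are illustrative:

```sql
-- Sketch: log user UPDATEs, but replication-agent activity will NOT fire
-- this trigger because of the NOT FOR REPLICATION clause.
-- dbo.MyTable and dbo.AuditLog are assumed names.
CREATE TRIGGER trg_MyTable_Audit
ON dbo.MyTable
AFTER UPDATE
NOT FOR REPLICATION
AS
BEGIN
    INSERT INTO dbo.AuditLog (KeyValue, ChangedAt)
    SELECT i.KeyValue, GETDATE()
    FROM inserted AS i;
END;
```

If the publisher-side triggers were created this way (replication setup sometimes adds the clause), that alone could explain the "invisible" updates.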
Posted: 14 Oct 2013 12:59 PM PDT Hi, I have a simple table, nothing too complicated:
CREATE TABLE [dbo].[RHH] (
    [UNIQUE_ID]  varchar(50)  NULL,
    [First_Name] varchar(50)  NULL,
    [Last_Name]  nvarchar(50) NULL,
    [email]      nvarchar(50) NULL
) ON [PRIMARY]
All the data in this table is inserted by an application, except UNIQUE_ID, which I want to increment as the data is inserted. It also has to be in a format like RHH100100 + 1. I'm thinking of an ON INSERT trigger to update the column, but I'm not sure how to get the last value and increment it. Any ideas?
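Rather than computing "last value + 1" in a trigger (which is race-prone under concurrent inserts), one common pattern is an IDENTITY column plus a persisted computed column. A sketch, assuming the table can be redefined:

```sql
-- Sketch: the seed 100100 is chosen so the first row yields RHH100100.
CREATE TABLE dbo.RHH (
    RowNum     int IDENTITY(100100, 1) NOT NULL,
    UNIQUE_ID  AS ('RHH' + CAST(RowNum AS varchar(10))) PERSISTED,  -- RHH100100, RHH100101, ...
    First_Name varchar(50)  NULL,
    Last_Name  nvarchar(50) NULL,
    email      nvarchar(50) NULL
);
```

The application can keep inserting only the name/email columns; SQL Server fills RowNum and UNIQUE_ID automatically.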
You are subscribed to email updates from SQLServerCentral / SQL Server 2008 / SQL Server 2008 Administration To stop receiving these emails, you may unsubscribe now. | Email delivery powered by Google |
Google Inc., 20 West Kinzie, Chicago IL USA 60610 |
[Articles] The Cloud of Destiny
Posted: 14 Oct 2013 11:00 PM PDT Is SQL Server going to move to the "cloud?" Steve Jones thinks it is and has a few thoughts as he starts off with "cloud week" here at SQLServerCentral. |
You are subscribed to email updates from SQLServerCentral.com Articles tagged Editorial To stop receiving these emails, you may unsubscribe now. | Email delivery powered by Google |
[SQL 2012] copying data tables between database and keeping indexes and identity columns
copying data tables between databases and keeping indexes and identity columns Posted: 15 Oct 2013 03:26 AM PDT Is there a way to copy tables between databases (SSIS?) and keep the identity columns and indexes, other than T-SQL scripts? Wondering if SSIS ever got any better at this.
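For context, SELECT ... INTO copies the data and (in most cases) the IDENTITY property, but never the indexes, so a pure T-SQL route still needs the index definitions scripted separately. A hedged sketch with illustrative object names:

```sql
-- 1. Copy data into the target database; SELECT INTO generally preserves
--    the IDENTITY property of the source column.
SELECT *
INTO   TargetDb.dbo.MyTable
FROM   SourceDb.dbo.MyTable;

-- 2. Re-create the indexes, which SELECT INTO does not copy.
CREATE UNIQUE CLUSTERED INDEX PK_MyTable
    ON TargetDb.dbo.MyTable (Id);
```

In SSIS, the Transfer SQL Server Objects task is the usual alternative; it exposes options for copying indexes and data together, though whether it fits depends on the version and edition.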
Can a server be added to two different clusters? Posted: 14 Oct 2013 10:04 AM PDT As part of a redesign of our disaster recovery and high availability environment we are considering several designs. One design would mean setting up a server in one location as part of a local two-node cluster, so that a SQL instance would stay up through a hardware failure (the instance would fail over to the other node). Let's say that location is New York. Say we also have a datacenter in Los Angeles, which likewise has a SQL instance clustered on its own two nodes. So the instance SqlLA\InstanceOne would be a clustered instance on nodes LA01 and LA02 in LA, and SqlNY\InstanceTwo would be a clustered instance on nodes NY01 and NY02 in NY. Then I would want to set up an HA group between SqlLA\InstanceOne and SqlNY\InstanceTwo. But that would mean the nodes would need to be resources in two different Microsoft clusters: I would have to add NY01 and NY02 as nodes to the LA cluster even though they are already part of the NY cluster. The idea is that if there were a real disaster event in, say, LA, the nodes in NY would take over the availability group; and if there were then a hardware failure on one of the NY nodes, the SQL instance in NY would still be up. My gut reaction is that this seems like a really bad idea technically, as the two clusters would not coordinate well. But I'm not at all sure; maybe it would work. Does anybody know?
Posted: 14 Oct 2013 07:57 AM PDT I have a tab-separated-value (TSV) input file for my SSIS package. When I point to the file to create the Flat File Source, SSIS uses a string data type (OK, I can deal with that...) and a length of 50. Well, some of the columns are bigger than 50 characters. In my data conversion step, I ignore truncation, but I'd rather the input file allow up to 255 characters. I went to the advanced edit for the flat file source component of the Data Flow step and changed the properties of one field I know violates the 50-character limit to 255 characters. It worked, but I got a warning that the metadata didn't match the properties sheet. Did I do this wrong? I don't want silly warnings.... |
Posted: 14 Oct 2013 08:12 AM PDT I'm having trouble accessing TFS using the 2012 SQL Server Data Tools. What's odd is that in Visual Studio 2012 under the Source Control Options, TFS is listed, but in SSDT in the same location there is only a None option. SSDT is basically a shell of VS 2012. I don't get it! Anyone successfully use TFS with SSDT for 2012? |
You are subscribed to email updates from SQLServerCentral / SQL Server 2012 / SQL 2012 - General To stop receiving these emails, you may unsubscribe now. | Email delivery powered by Google |
[T-SQL] Query Help
Posted: 14 Oct 2013 09:19 AM PDT
drop table #TypeCode
create table #TypeCode (TypeCodeId int identity(1,1), TypeCode bigint, Name varchar(50))
insert into #TypeCode values (3001, 'Loan Type'), (3004, 'Transaction Type'), (3009, 'Event Type'), (3015, 'Participant Type')
-- select * from #TypeCode
drop table #LD
create table #LD (TDID bigint identity(1,1), TranID bigint, LoanID bigint, [Description] varchar(500))
insert into #LD values (1234, 5467, 'sdasdas asd asdas'), (1232, 2357, 'nbvnvb vbjjytuytu yt'), (1344, 7989, 'yuimhjkhj'), (1734, 2456, 'retret ghjghjghj'), (1854, 8883, 'ewrgghjg hhgj'), (1784, 2357, 'ytuyu ghjghjghj'), (1284, 9734, 'werewrew ewrwerw')
-- select * from #LD
drop table #PD
create table #PD (PDID bigint, TranID bigint, [Description] varchar(500))
insert into #PD values (3456, 1234, 'fghfg fghf hfgh'), (2135, 1234, 'sdjhjk hjkhj khjk'), (7564, 1854, 'wewqg ghjghj hgj')
-- select * from #PD
drop table #EventLog
create table #EventLog (EID int, SourceID bigint, SourceType bigint, Description varchar(500))
insert into #EventLog values
 (1, 1234, 3004, 'sadasdadadasfsdfsd fsdfsdf'), -- 3004 means SourceID is a TranID from #LD
 (2, 1234, 3004, 'erewrwerwerw '),
 (3, 1234, 3004, 'erewsfsdf sdfsf sdf '),
 (4, 5467, 3001, 'ertre rett sdfsf sdf '), -- 3001 means SourceID is a LoanID from #LD
 (5, 5467, 3001, 'kljkljkl rett sdfsf sdf '),
 (6, 3456, 3015, 'utyutyu tyu yutyutuy '), -- 3015 means SourceID is a PDID from #PD
 (7, 1784, 3004, 'sadsadasdasd asdasdsad ')
Now, in the event log, the same transaction ID has rows under different source types. I have a stored procedure to which I pass only a #LD TranID, and I want it to return all event-log rows, across all source types, for that transaction. For example, if I pass TranID 1234 the procedure should return EID 1 to 6.
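One way to satisfy the requirement, sketched against the temp tables above: resolve every related SourceID for the transaction, then filter #EventLog by (SourceType, SourceID) pairs.

```sql
DECLARE @TranID bigint = 1234;

SELECT e.*
FROM   #EventLog AS e
WHERE (e.SourceType = 3004 AND e.SourceID = @TranID)   -- rows keyed by the transaction itself
   OR (e.SourceType = 3001 AND e.SourceID IN
        (SELECT LoanID FROM #LD WHERE TranID = @TranID))   -- rows keyed by its loan(s)
   OR (e.SourceType = 3015 AND e.SourceID IN
        (SELECT PDID FROM #PD WHERE TranID = @TranID));    -- rows keyed by its participants
-- For @TranID = 1234 this returns EID 1 through 6.
```

Wrapped in a stored procedure taking @TranID as its parameter, this covers all three source types in a single statement.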
please correct this query, where am I going wrong Posted: 14 Oct 2013 11:14 PM PDT I am trying to add this query to a dataset. Basically we are using cascading parameters:
SELECT DISTINCT C.COMPANY_SK, C.COMPANY_DESC
FROM DIVISIONMAPPING D
INNER JOIN VW_DIM_COMPANY C ON C.COMPANYID = D.COMPANYID
WHERE D.ACTIVE IN ('YES','NO')
AND (D.BUSINESSDIVISION in @Division OR @Division in ('All'))
AND D.EXCLUDEFLAG = 0
ORDER BY COMPANY_DESC ASC
Please correct me regarding this issue.
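Assuming @Division is an SSRS multi-value parameter (the post mentions cascading parameters), the usual pitfalls are the IN syntax and the direction of the 'All' test. A hedged rewrite:

```sql
-- Sketch: SSRS expands a multi-value @Division into a value list, so IN needs
-- parentheses, and the "all divisions" check should ask whether 'All' is
-- among the selected values, not the reverse.
SELECT DISTINCT C.COMPANY_SK, C.COMPANY_DESC
FROM   DIVISIONMAPPING D
       INNER JOIN VW_DIM_COMPANY C ON C.COMPANYID = D.COMPANYID
WHERE  D.ACTIVE IN ('YES', 'NO')
  AND  (D.BUSINESSDIVISION IN (@Division) OR 'All' IN (@Division))
  AND  D.EXCLUDEFLAG = 0
ORDER BY C.COMPANY_DESC ASC;
```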
display 12 month list through query Posted: 14 Oct 2013 04:34 PM PDT Hi All, I have a column in a table which stores a date, say for example birthDate. I want to display a count month-wise. But if no birthday falls in March, I still need to display the column header as 3 and under it a count of 0 (zero). Can you please suggest any idea? Thanks, Abhas.
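One common pattern: build a fixed 1..12 month list and LEFT JOIN the data to it, so empty months still appear with a zero count. The table and column names below are assumed:

```sql
-- Sketch: dbo.People(birthDate) is an illustrative table/column.
SELECT m.MonthNum,
       COUNT(p.birthDate) AS BirthdayCount  -- COUNT(col) ignores the NULLs produced by the outer join
FROM   (VALUES (1),(2),(3),(4),(5),(6),(7),(8),(9),(10),(11),(12)) AS m(MonthNum)
       LEFT JOIN dbo.People AS p ON MONTH(p.birthDate) = m.MonthNum
GROUP BY m.MonthNum
ORDER BY m.MonthNum;
```

A month with no birthdays (March in the example) comes back as MonthNum 3 with BirthdayCount 0.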
sorting by month with current month at top Posted: 14 Oct 2013 07:39 AM PDT Hi, in one of my queries I need data sorted starting with the current month. Here is a sample script:
create table OrderTest (OrderDate datetime, orderCount int)
GO
insert into OrderTest values ('2013/10/15', 30)
insert into OrderTest values ('2013/10/17', 40)
insert into OrderTest values ('2013/09/13', 60)
insert into OrderTest values ('2013/08/21', 45)
insert into OrderTest values ('2013/07/13', 38)
insert into OrderTest values ('2013/06/15', 23)
insert into OrderTest values ('2013/05/15', 56)
insert into OrderTest values ('2013/04/15', 267)
insert into OrderTest values ('2013/03/19', 67)
insert into OrderTest values ('2013/02/15', 45)
insert into OrderTest values ('2013/01/15', 90)
insert into OrderTest values ('2012/11/15', 22)
insert into OrderTest values ('2012/12/15', 76)
GO
select datepart(mm, OrderDate) MonthNum, AVG(orderCount) AvgOrderCount
FROM OrderTest
GROUP BY datepart(mm, OrderDate)
Order BY datepart(mm, OrderDate)
I need data like:
10 30
11 22
12 76
1 90
2 45
3 67
4 267
5 56
6 23
7 38
8 45
9 60
thanks
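One way to get that order is to rotate the sort key so the current month maps to zero. A sketch against the sample table (at the time of the post, October, this yields 10, 11, 12, 1, ... 9):

```sql
SELECT  DATEPART(mm, OrderDate) AS MonthNum,
        AVG(orderCount)         AS AvgOrderCount
FROM    OrderTest
GROUP BY DATEPART(mm, OrderDate)
-- (month + 12 - current_month) % 12 is 0 for the current month,
-- 1 for the next, and so on, wrapping around the year boundary.
ORDER BY (DATEPART(mm, OrderDate) + 12 - MONTH(GETDATE())) % 12;
```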
You are subscribed to email updates from SQLServerCentral / SQL Server 2008 / T-SQL (SS2K8) To stop receiving these emails, you may unsubscribe now. | Email delivery powered by Google |
[SQL Server] How to update these records?
Posted: 13 Oct 2013 10:46 PM PDT Hi All! I have this query:
select *
from dbo.dutyrostershift A
join dbo.timeaccountmovement B ON B.ownerid = A.id
where (A.fromtime <> B.starttime or A.totime <> B.endtime)
and B.timeaccountid = 12
AND A.std = 1
AND B.ownertype = 1
Basically, it finds records with mismatched start and end times; they should be identical in dbo.dutyrostershift and dbo.timeaccountmovement for the pairs where the ownerid in dbo.timeaccountmovement equals the ID in dbo.dutyrostershift. The query finds the records correctly. From here it should be simple to update dbo.timeaccountmovement with the correct values from dbo.dutyrostershift, but I must admit I have not found out how yet. Can anyone help? Best regards, Edvard Korsbæk
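T-SQL's UPDATE ... FROM form can reuse the join and filter from the SELECT almost verbatim. A sketch (run the SELECT first to confirm the row set, as always before a mass update):

```sql
-- Copy the shift times onto the mismatched movement rows.
UPDATE B
SET    B.starttime = A.fromtime,
       B.endtime   = A.totime
FROM   dbo.timeaccountmovement AS B
       JOIN dbo.dutyrostershift AS A ON B.ownerid = A.id
WHERE  (A.fromtime <> B.starttime OR A.totime <> B.endtime)
  AND  B.timeaccountid = 12
  AND  A.std = 1
  AND  B.ownertype = 1;
```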
You are subscribed to email updates from SQLServerCentral / SQL Server 2008 / SQL Server Newbies To stop receiving these emails, you may unsubscribe now. | Email delivery powered by Google |
[how to] Index spool after created PK and index?
Index spool after created PK and index? Posted: 14 Oct 2013 08:37 PM PDT I have the following query. It runs slowly but gets some value when there is not a PK on I added a PK on How to avoid the "Index spool" on the hugeTable?
Error when creating an index in SQL 2012 Posted: 14 Oct 2013 08:35 PM PDT I'm trying to create the following index on our test environment (SQL 2012): on our transaction table: I get the following error: If I run the same create index statement on our live database (SQL 2005) then it works without any errors. Why can't I create this index on SQL 2012? UPDATE: Results of dbcc checktable: |
Can MySQL Workbench generate data? Posted: 14 Oct 2013 07:10 PM PDT Is there a way to generate mass rows of dummy data in a table for testing in workbench? If not, are there any free tools out there that is able to do that? |
Where EVENTS history are logged/recorded in MySql? Posted: 14 Oct 2013 08:51 PM PDT I just setup some stored procedures to run in the event scheduler using CREATE EVENT. I'm trying to find where (if somewhere) the history of it running is stored. I looked at the docs but couldn't find anything. Is there some table or log where I can see that my scheduled events successfully ran? Thanks! |
Posted: 14 Oct 2013 06:12 PM PDT I am a newbie to SQL and I have a question here. I am working in PostgreSQL. My tables look like change action field I would want to write a query to get the following view Basically I want to flatten (rows into columns) the 'change' table and join it with the 'action' table. Please note that the 'change' table references the 'action' and 'field' tables Schema diagram looks like below Any help is appreciated, thanks
Improve Speed of MySQL Database Import Posted: 14 Oct 2013 03:36 PM PDT I'm importing around 44 GB (14 million rows) of information into an InnoDB MySQL database, and I'd like to do so in a reasonable timeframe. Since I'm doing this on a personal computer, I only have 8 GB of RAM available to me (and I'm using 1.5 GB for the MySQL database). Right now, I'm using LOAD DATA INFILE. I've tweaked a lot of the InnoDB settings, including increasing the buffer pool size and buffer log size. However, the speed of this method seems to drop rapidly as the amount of information in the database increases - the first "chunk" of data (100 MB) loaded in 79 seconds, but by the time I reached the fourth chunk of data, it took 1004 seconds to load the same amount. Unfortunately, this precipitous drop in transfer rate makes it nearly impossible to load all of the data onto the MySQL server in a reasonable timeframe. I had several questions about the scenario. Why does the transfer rate drop so drastically? I've heard that MyISAM is more efficient for loading data, so should I try to use a MyISAM engine to load this data instead? (I'll be accessing the data only locally) Are there any other techniques I can use to expedite the data loading process? |
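The precipitous slowdown described above is a classic symptom of secondary-index maintenance and out-of-order primary-key inserts: once the working set outgrows the buffer pool, each new row touches cold pages. Common (hedged) mitigations around LOAD DATA INFILE, with an illustrative file and table name:

```sql
-- Trade safety checks for import speed; re-enable them afterwards.
SET unique_checks = 0;
SET foreign_key_checks = 0;
SET sql_log_bin = 0;   -- only if binary logging is on and no replica needs this load

LOAD DATA INFILE '/tmp/chunk1.csv' INTO TABLE mytable
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';

SET unique_checks = 1;
SET foreign_key_checks = 1;
SET sql_log_bin = 1;
```

Sorting the input files by primary key before loading, and adding secondary indexes only after all data is in, are the other two levers that usually matter most.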
What db permissions are needed for pgAgent user? Posted: 14 Oct 2013 12:45 PM PDT I have successfully set up pgAgent daemon (running on the same server as Postgres 9.3). I would like to restrict permissions on pgAgent. Created 'pgagent' login role, and granted it (via group role) all permissions on postgres.pgagent schema: However pgAgent refuses to execute any jobs, and just sits there idly. No error messages in logs. If I start pgagent as 'postgres' user it runs fine. Or, if I 'grant postgres to scheduler' it also runs fine. What permissions am I missing here? |
Configuring MySQL for Power Failure Posted: 14 Oct 2013 12:25 PM PDT I have absolutely no experience with databases and MySql. Now the problem is I have an embedded device running a MySQL database with a web based application. The problem is when I shutdown my embedded device it just cut off the power, and I can not have a controlled shutdown. Given this situation how can I configure MySql to prevent it from failures and in case of a failure, I should have maximum support to recover my database. While searching this, I came across InnoDB Engine as well as some configuration options to set like sync_binlog=1 & innodb_flush_log_at_trx_commit=1. I have noticed my default Engine is InnoDB and binary logs are also enabled. What are other configurations to make for best possible failure & recovery support. Regards, Farrukh Arshad |
"Set Password" with a limited user Posted: 14 Oct 2013 04:20 PM PDT I'm using MySQL 5.6 and I want to have a limited service-user which is able to create/modify the users of my databases. Now my problem is that I can quite figure out what privileges my service-user needs to perform all the user administration tasks (create user, grant/revoke db privileges, set password) a) If I give him the global "create user" and "reload" privileges, he can't use "set password" [= MySQL Error: 1044 (Access denied)] b) If I give him "select", "insert", "update" on 'mysql'.'user' and 'mysql'.'db', he can't use "set password" [= MySQL Error: 1044 (Access denied)] c) If I give him "select", "insert", "update" on 'mysql'.*, he CAN use "set password" I'd like to understand why this is happening, and how to achieve my approach a) or b). I don't want to use approach c). Can somebody help me out? Thank you! Benjamin |
Not sure why code is failing when CONCAT_NULL_YIELDS_NULL is set to OFF Posted: 14 Oct 2013 12:18 PM PDT In my question over at Pull every other row, starting a 0, and concatenate them together user @AaronBertrand provided me with a chunk of code that worked fine in test but as soon as I put the procedure using the code into place and tested from the calling app I get an error.
In testing if I set |
Posted: 14 Oct 2013 11:46 AM PDT I have the following table: I want to order it by Items. I used ORDER BY items ASC, but it returns: I want to return: I think this might be a silly question, but I really don't have any ideas about how to solve this. Thanks. |
how to trigger a statistics update Posted: 14 Oct 2013 12:25 PM PDT I have a certain table in my OLTP database, which is bulk-updated by several users. The problem is, that the update does not cause the statistics to be automatically updated, My question is how should i trigger a statistics update?
Of course the solution has to be server side, application changes are not welcome :) One more thing is that I might need the same or similiar for more than just this one table so the solution has to be generic (use a modified-sp_updatestats with parameters for... everything?). |
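A hedged server-side sketch of the generic approach: walk the tables, check how many rows have been modified since the last statistics update, and rebuild only where the counter is high. sys.dm_db_stats_properties is available from SQL Server 2008 R2 SP2 / 2012 SP1 onward; the threshold below is illustrative.

```sql
-- Refresh statistics for every table whose modification counter exceeds
-- a threshold; schedule this (e.g. as an Agent job) after the bulk updates.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql += N'UPDATE STATISTICS '
            + QUOTENAME(SCHEMA_NAME(t.schema_id)) + N'.' + QUOTENAME(t.name)
            + N';' + CHAR(10)
FROM sys.tables AS t
CROSS APPLY (SELECT MAX(sp.modification_counter) AS mods
             FROM sys.stats AS s
             CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
             WHERE s.object_id = t.object_id) AS m
WHERE m.mods > 1000;   -- illustrative threshold; tune per table size

EXEC sys.sp_executesql @sql;
```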
SQL Server 2012 Restore backup to new DB Name Posted: 14 Oct 2013 09:30 AM PDT I seem to remember that, in 2008, you could restore a backup to a new copy of a DB, by changing the name in the "Destination Database" field from the restore-wizard. It would create a brand new DB, which is a copy of the original DB restored to the point in time that you wanted. I have not for the life of me figured out how to make SQL 2012 do this. Now, I understand (thanks to Aaron Bertrand) that this didn't really change, and that 2012 is actually making it more obvious to me that this strategy was a bad idea in the first place! So, what I need to do is this: Create a new DB, 'MyDB_Copy', from an existing DB, 'MyDB', by using its backup files. We have nightly full-backups (.bak) and every-15-minute TLogs (.trn). I don't want the existing 'MyDB' to be affected/touched at all, because it's "live". After the MyDB_Copy is created from the main full-backup file, I then need to restore a few dozen TLog backups to get it to a certain point in time. |
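In T-SQL the steps described above look roughly like the following; paths, logical file names, and times are illustrative (RESTORE FILELISTONLY against the .bak reveals the real logical names). Restoring to a new database name this way never touches the live MyDB.

```sql
-- 1. Restore the nightly full backup as a new database, leaving it
--    ready to accept log restores (NORECOVERY).
RESTORE DATABASE MyDB_Copy
FROM DISK = N'D:\Backups\MyDB_Full.bak'
WITH MOVE N'MyDB'     TO N'D:\Data\MyDB_Copy.mdf',
     MOVE N'MyDB_log' TO N'D:\Data\MyDB_Copy_log.ldf',
     NORECOVERY;

-- 2. Apply each 15-minute .trn in sequence; on the last one, stop at the
--    desired point in time and bring the copy online.
RESTORE LOG MyDB_Copy
FROM DISK = N'D:\Backups\MyDB_201310140915.trn'
WITH STOPAT = N'2013-10-14T09:10:00', RECOVERY;
```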
Average of a row or record across three columns Posted: 14 Oct 2013 09:55 AM PDT The
. Similarly is there any query to calculate avg(col1, col2, col3.....) ? |
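Since AVG() aggregates down a column rather than across a row, one sketch is to unpivot the columns per row with a VALUES constructor. Table and column names here are assumed, as the original table definition is not shown:

```sql
-- AVG over the unpivoted values ignores NULLs, unlike
-- (col1 + col2 + col3) / 3.0, which is NULL if any column is NULL.
SELECT t.id,
       a.RowAvg
FROM   dbo.MyTable AS t
CROSS APPLY (SELECT AVG(v) AS RowAvg
             FROM (VALUES (t.col1), (t.col2), (t.col3)) AS x(v)) AS a;
```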
Posted: 14 Oct 2013 11:50 AM PDT Using MS SQL Server 2012 in Microsoft Windows Server 2008. I am somewhat confused by the New Account Types Available with Windows 7 and Windows Server 2008 R2 virtual Windows accounts Is it viable of daring to give permissions to, for example, [NT Service\SQLSERVERAGENT]' to access a shared resource (or local file or directory)? While browsing (or pressing button Find< or adding group and users to a file share) the available in domain and/or local/remote computer, there is no such accounts available and entering it manually give an error:
Related (though different) question that provoked this one: How to copy bak files to remote share (without AD/domain accounts involvement)? |
Posted: 14 Oct 2013 06:43 PM PDT MS SQL Server 2012... with nightly database backups to the same/local machine as the SQL Server... I am trying to add another SQL Server Agent job to copy the .bak files to a remote (non-Windows, i.e. Linux) share with non-Windows (non-AD) user/password credentials. I do not have any access to configuring or changing that access, which is under the control of other, quite remote people. For this (copying) I created a local user with the same user name and password, and gave it permissions to the (source, or local) backup folders, upon which everything works perfectly from the command line ( but fails to run as a SQL Server Agent job (type of step is " Can anybody explain why it is failing and how to make it run correctly (taking into account that I do not have access to domain user configuration)?
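The usual cause is that Agent job steps run under the Agent service account, not the local user that was granted access. A hedged fix is a credential plus a CmdExec proxy, so the copy step runs as that local user; all names and the password below are placeholders:

```sql
-- 1. Store the local user's credentials in SQL Server.
CREATE CREDENTIAL BackupCopyCred
    WITH IDENTITY = N'backupcopyuser', SECRET = N'***';

-- 2. Wrap the credential in an Agent proxy and allow CmdExec steps to use it.
EXEC msdb.dbo.sp_add_proxy
     @proxy_name = N'BackupCopyProxy',
     @credential_name = N'BackupCopyCred';

EXEC msdb.dbo.sp_grant_proxy_to_subsystem
     @proxy_name = N'BackupCopyProxy',
     @subsystem_name = N'CmdExec';
```

In the job step's properties, "Run as" can then be switched from the Agent service account to BackupCopyProxy.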
Doing a point in time restore with CDC enabled; Possible? Posted: 14 Oct 2013 09:54 AM PDT I discovered this week, the hard way, that the database restore options Doing research on MSDN etc. I cannot find any documentation on restoring a database with I was able to find one attempt that did the full and subsequent logs without the If the intent is to |
Posted: 14 Oct 2013 02:17 PM PDT On my CentOS 6.3 server I have a MySQL 5.5.33 database, and some cron queries that run every 3 hours. They don't use any index, they run very slowly, and every query takes nearly 1500-2000 ms. I don't plan to add new indexes for them, because I would have to add many indexes and those queries run rarely. When I restart my database server, swap is normally zero. After some time, swap usage grows gradually; after 13 days, MySQL is using 650 MB of swap. I want to find what causes this swapping and try to reduce the swap without degrading performance. I want to be sure whether the cron queries or something else causes this swap size. My Edit: I restarted the mysql server 2 days ago, so swap is low now. But as time passes it will grow again. When I make When I make My
Modeling a database for easy counting / reporting Posted: 14 Oct 2013 08:26 PM PDT I have an app where user is known (user_id) and he can do several actions (action_id). Every time he makes an action I need to save the fact that he made it for reports/analytics. I guess it is similar to other analytic solutions and their db design. Once I have the data, provided with a time window (minutes resolution) I need to count for each user (all or some) the number of times he did actions and which actions he did. (sum all data grouped by action_id). Some assumptions:
I'm considering SQL, NoSQL, and RRD to save the data. I put RRD here because it's easy to implement the inserts via statsd + graphite. My concern with this approach is that querying (although provided by graphite) will not be indexed and will probably have to count all the data whenever I ask for a window/user (no indexing). Another problem is that when querying all the data, all users' info will be needed, resulting in reading all the files concurrently, which I'm not sure is a good thing. SQL - Very easy implementation for inserting and querying the data. Easy to index, order, and group by. However, I'm not sure it holds up if I'm anticipating high traffic. Also, I'm not sure how efficient SQL's COUNT() is after a GROUP BY (I haven't used SQL in the last few years). Can it offer parallel computation? NoSQL - Is there a solution out there that is the right fit for this type of scenario (perhaps a Map/Reduce algorithm for fast generation of counts in a time window)? Thanks for helping me model
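For the relational route, a sketch of the minimal schema the assumptions above imply: one narrow row per action, clustered for time-window scans per user. Names and the index choice are assumptions, not a definitive design:

```sql
CREATE TABLE user_actions (
    user_id     int      NOT NULL,
    action_id   int      NOT NULL,
    occurred_at datetime NOT NULL
    -- a clustered index on (user_id, occurred_at) makes the query below
    -- a single range scan; a PK would need an extra tie-breaker column
);
CREATE CLUSTERED INDEX ix_user_time ON user_actions (user_id, occurred_at);

-- Per-action counts for one user in a minute-resolution window.
SELECT action_id, COUNT(*) AS n
FROM   user_actions
WHERE  user_id = 42
  AND  occurred_at >= '2013-10-01T00:00' AND occurred_at < '2013-10-15T00:00'
GROUP BY action_id;
```

Whether COUNT/GROUP BY parallelizes is engine-specific, but with this index the work is proportional to the window, not the table.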
Putting a Select statement in a transaction Posted: 14 Oct 2013 01:26 PM PDT What is the difference between these 2 queries: And without transaction: What is the effect of having a If |
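As a general note: under the default READ COMMITTED isolation level, wrapping a lone SELECT in a transaction changes little, since shared locks are released as rows are read either way. The difference shows up at stricter isolation levels, where the transaction boundary determines how long the SELECT's locks are held. A sketch with an assumed table name:

```sql
-- At REPEATABLE READ (or SERIALIZABLE), the transaction makes the SELECT's
-- shared locks last until COMMIT, blocking concurrent writers on those rows.
SET TRANSACTION ISOLATION LEVEL REPEATABLE READ;
BEGIN TRANSACTION;
    SELECT * FROM dbo.Accounts WHERE id = 1;  -- locks taken here...
    -- a re-read here is guaranteed to see the same values
COMMIT TRANSACTION;                           -- ...and released here
```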
Problem compiling view when it is referencing a table in an other view: insufficient privileges Posted: 14 Oct 2013 11:26 AM PDT Oracle 11g R2 Logged on: SYS / AS SYSDBA When I try to compile or create a view that is referencing local schema tables. It works fine. Problem does occur when I try to compile the same view referencing a table in another schema like schema.table in my query. Oracle throws the exception ORA-01031: insufficient privileges. Remember I am using SYS account (sysdba). |
Posted: 14 Oct 2013 07:26 PM PDT I have a database in microsoft access. I want to know how to look up a singular datum from a reference table giving a dynamic set of values. Here is a representation of what I mean: I have the following tables: Points for Pushups(m): Fitness Tests: People: I want the query to use People.DOB and the Test date to find the age the person was during the test. I then want the query to use this value to determine which column to look in, and the value from reps to determine which row to look in coming back with the singular value and naming it points. for example I want bob to show Query: Does anyone know how to do the dynamic reference part? I know how to make the query and I know how to get age I just don't know how to use the values as columns in the reference table, I've seen it done, but long ago and never looked into it. |
SQL Server Sproc calls from IBM Cast Iron orchestration with XML payload fail Posted: 14 Oct 2013 12:13 PM PDT I'm hoping that someone on this forum may have experience with an IBM appliance, Cast Iron (Websphere), and how it interacts with SQL Server. I know very little about the Cast Iron appliance, and its capabilities, so excuse my ignorance. The scenario is that I have created a series of stored procedures that pass their payloads as well-formed XML as either an input or output parameter based on the type of call (get or set) (we've tried using well-formed XML as native Sample: Or The challenge is that the procedure call in Cast Iron's (editor) is limiting | truncating the well-formed XML at 4000 characters. Anyone have any experience with this scenario at all? One other bit of information, when "hooking up" the call to the stored procedure using parameters to pass the XML doc, the data type ( Any help is greatly appreciated! |
How to setup SQL active/active cluster to achieve Blue / Green instance switching? Posted: 14 Oct 2013 05:26 PM PDT I am wondering if anyone has ever used a multi-instance cluster (nee 'Active/Active') to achieve blue/green (or A/B) deployment scenarios, and what the best way of configuring it is (using SQL 2012 / Windows 2008 R2)? To be specific, the scenario I want to achieve is to be able to switch between which cluster instance is being connected to by clients without either the clients or the SQL instances knowing (I stress I'm not talking about node failover here). I'm envisaging that the best way to achieve this is something like:
This should hopefully enable me to do the following:
Joining the dots, it seems like this should be possible:
... but I've never seen a full example. Has anyone done it? Will what's proposed above work? What have I missed? |
Time series data for ad platform Posted: 14 Oct 2013 12:26 PM PDT I am trying to figure out how to store time series data for an ad platform I am working on. Basically I want to know some strategies/solutions for storing billions of rows of data so that I can easily search it (about 6-8 indexes on the table) and get fast counts based on queries. I tried mySQL with the tokuDB engine and this seems to be very fast but is extremely slow when I try to do a COUNT query when the rows reached about 5-8 million. I was looking at some noSQL alternatives but since I want to be able to search this data this is probably not the best solution. I was using dynamoDB. I would have had to store the data is many places in order to account for all the searching on the data. What I am storing is a row in the database for each click on an AD that occurs. This table will grow very fast, especially when this site gets large. Another solution would be to separate this data per advertiser. This means each advertiser will have their own table where all their data goes into. This means it will be much smaller and the COUNT queries will be much faster. I can even split it up by advertiser and month. My goal is to give an advertiser the ability to search and display in a paginated way all their clicks. They should be able to get data between a time period and filter by about 5-8 other indexes if they want to. |
If an account has REQUIRE SUBJECT, does it still need a password? Posted: 14 Oct 2013 06:26 PM PDT I'm in the process of setting up SSL-secured replication between two servers. Each server has its own public/private keypair, and the CA cert is just the concatenation of the two public certs, like this answer. Now I'm updating the replication account with REQUIRE SUBJECT "exact subject of the client" Is there any practical value to also having a password on the replication account (IDENTIFIED BY "secret")? |
How do I determine if a database is the principal in a mirroring setup? Posted: 14 Oct 2013 08:07 PM PDT I have two database servers When How do I solve this problem? |
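The mirroring role is exposed by the sys.database_mirroring catalog view, so a connection test can be replaced by a direct query (database name illustrative):

```sql
-- mirroring_role_desc is 'PRINCIPAL' or 'MIRROR' for a mirrored database,
-- and NULL when the database is not mirrored.
SELECT DB_NAME(database_id) AS db,
       mirroring_role_desc
FROM   sys.database_mirroring
WHERE  database_id = DB_ID(N'MyDB');
```

Application code can run this against each server at startup and connect to whichever reports 'PRINCIPAL'.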
Converting dbo file from windows to linux Posted: 14 Oct 2013 10:26 AM PDT I have a .dbo file which was created on Windows. This file is successfully reloaded into the MySQL database on Windows. I need to reload the .dbo file into the mysql/mariadb database on Linux. How do I convert the file that was created on Windows for Linux?
How to execute some script when database starts up Posted: 14 Oct 2013 07:32 PM PDT I want to execute some T-SQL code when a database starts up. The objective is to resume a mirroring session in case the session is "suspended" when the database comes "back to life". So I've done this code: but how can I make it run when SQL Server starts up? I used this with no success:
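SQL Server's supported mechanism for this is a startup procedure: a stored procedure in master, marked with sp_procoption, which runs when the instance starts (not when an individual database comes online, which may be close enough here). The procedure and database names below are illustrative:

```sql
USE master;
GO
CREATE PROCEDURE dbo.usp_ResumeMirroring
AS
    -- Resume a suspended mirroring session; MyDB is an assumed name.
    ALTER DATABASE MyDB SET PARTNER RESUME;
GO
-- Mark the procedure to execute automatically at instance startup.
EXEC sp_procoption @ProcName   = N'dbo.usp_ResumeMirroring',
                   @OptionName = 'startup',
                   @OptionValue = 'on';
```

A small WAITFOR or retry loop inside the procedure may be needed, since mirroring endpoints can still be coming up when startup procedures fire.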
You are subscribed to email updates from Recent Questions - Database Administrators Stack Exchange To stop receiving these emails, you may unsubscribe now. | Email delivery powered by Google |