Here's the general error I am receiving:

[quote]A fatal error occurred during a full-text population and caused the population to be cancelled. Population type is: AUTO; database name is XX (id: XX); catalog name is XX (id: XX); table name XX (id: xx). Fix the errors that are logged in the full-text crawl log. Then, resume the population. The basic Transact-SQL syntax for this is: ALTER FULLTEXT INDEX ON table_name RESUME POPULATION.[/quote]

Additional information from the full-text crawl log:

[quote]Error: 30059, Severity: 16, State: 1.[/quote]

Several sites report the fix as the following:

[quote]I have seen the issue occur when the SQL Full-text Filter Daemon Launcher service was running with the NT AUTHORITY\NETWORK SERVICE account. Switching the service to use the Local System account solved the issue. Then the Full Text Catalog would rebuild without any issue.[/quote]

(above quote from [url]http://blog.hoegaerden.be/2009/03/04/full-text-search-fatal-error-30059/[/url])

I have found no explanation of the relationship between the account running the full-text service and this error message. Furthermore, I am running all services with domain accounts and am still receiving the error. Any help on additional items I can check would be much appreciated.
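For reference, a minimal sketch of the resume step the error message describes, plus a catalog query to check the population state of each full-text index (the table name here is a placeholder, not from the post):

[code="sql"]-- Resume the cancelled population on the affected table (replace dbo.MyTable)
ALTER FULLTEXT INDEX ON dbo.MyTable RESUME POPULATION;

-- Check the crawl/population state of every full-text index in the database
SELECT t.name AS table_name,
       fi.is_enabled,
       fi.crawl_type_desc,
       fi.has_crawl_completed
FROM sys.fulltext_indexes AS fi
JOIN sys.tables AS t ON t.object_id = fi.object_id;[/code]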
↧
A fatal error occurred during a full-text population, Event ID: 30059
↧
Code script to alert some events
Hi,

Please help me. I am writing a script that will be used to send email alerts for some events. The query runs, but the mail generated is blank - no query output - and I do not see where I am making the mistake.

Thanks for your ideas and support.

[code="sql"]DECLARE @Body NVARCHAR(MAX), @TableHead VARCHAR(1000), @TableTail VARCHAR(1000),
        @EmailSubject VARCHAR(300), @TableTail2 VARCHAR(1000), @statement VARCHAR(1000),
        @porcentage REAL, @comands VARCHAR(32), @starttime DATETIME
--'DECLARE @porcentual real'+','+'@comando varchar(32)'+','+'@inicio datetime'

SET @EmailSubject = 'Verificacion del estado del Backup';
SET @TableTail = '</table></body></html>';

SET @Body = 'if exists (select percent_complete, command, start_time
from sys.dm_exec_requests
where command in (' + '''BACKUP LOG ''' + ',' + '''BACKUP DATABASE''' + ',' + '''DBCC''' + ',' + '''SELECT''' + '))'
+ 'select @porcentual=percent_complete, @comando=command, @inicio=start_time
   from sys.dm_exec_requests
   where command in (' + '''BACKUP LOG''' + ',' + '''BACKUP DATABASE''' + ',' + '''DBCC''' + ',' + '''SELECT''' + ')'
+ 'print ' + '''Command: ''' + '@comands' + ''' started at: ''' + 'cast(@starttime as nvarchar(20))'
+ ''' Porcentage:''' + 'cast ( @porcentage as nvarchar(20))'

SELECT @Body = @TableHead + ISNULL(@Body, '') + @TableTail

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'MonitorB',
    @recipients = 'lulu.ma123@xyz.com',
    @subject = @EmailSubject,
    @body = @Body,
    @body_format = 'HTML';[/code]
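A possible working sketch (the profile name and recipient are taken from the post; everything else is an assumption, not a confirmed fix). Two things stand out in the script above: @TableHead is never assigned, so the @TableHead + ... concatenation yields NULL, and the SQL text assembled in @Body is only a string - it is never executed. One common pattern is to build the HTML body directly with FOR XML PATH:

[code="sql"]DECLARE @Body NVARCHAR(MAX),
        @EmailSubject VARCHAR(300) = 'Verificacion del estado del Backup';

-- Build an HTML table from the live requests; @Body stays NULL if no rows match
SET @Body =
    N'<html><body><table border="1">'
  + N'<tr><th>Command</th><th>Started</th><th>Percent</th></tr>'
  + CAST((SELECT command AS td, '',
                 CONVERT(NVARCHAR(20), start_time, 120) AS td, '',
                 CAST(percent_complete AS NVARCHAR(20)) AS td
          FROM sys.dm_exec_requests
          WHERE command IN ('BACKUP LOG', 'BACKUP DATABASE', 'DBCC', 'SELECT')
          FOR XML PATH('tr'), TYPE) AS NVARCHAR(MAX))
  + N'</table></body></html>';

IF @Body IS NOT NULL
    EXEC msdb.dbo.sp_send_dbmail
         @profile_name = 'MonitorB',
         @recipients   = 'lulu.ma123@xyz.com',
         @subject      = @EmailSubject,
         @body         = @Body,
         @body_format  = 'HTML';[/code]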
↧
Need help with an error that blocks almost all DDL statements server-wide
Hi everyone!

I had an issue crop up this afternoon on a dev server that I have never seen before and has me completely stumped. I hope someone can point me in the right direction.

The server itself is running Microsoft SQL Server Standard Edition (64-bit), version 10.50.2550.0, on Microsoft Windows NT 6.1 (7601), but the database I was working on is set to compatibility level 80 due to vendor software limitations.

The database itself is part of an off-the-shelf platform, so we have limited control over how various business processes are implemented. I was developing an INSTEAD OF INSERT trigger to meet a new requirement from the users and started getting this error:

[font="Courier New"]Cannot set XACT ABORT to OFF inside the trigger execution unless the database compatibility is 90.
[color="#FF0000"]Msg 3616, Level 16, State 2, Procedure DummySproc, Line 3
An error was raised during trigger execution. The batch has been aborted and the user transaction, if any, has been rolled back.[/color][/font]

I should add that I couldn't find any SET XACT_ABORT statements in any procedure, trigger, function, etc. anywhere.

This happens when I try to DROP, ALTER, or CREATE any kind of database object - tables, triggers, procedures, views, etc. - and not just in the database I was working on, but across the entire server. Interestingly enough, CREATE DATABASE gives me this:

[font="Courier New"]Msg 1807, Level 16, State 3, Line 1
Could not obtain exclusive lock on database 'model'. Retry the operation later.
Msg 1802, Level 16, State 4, Line 1
CREATE DATABASE failed. Some file names listed could not be created. Check related errors.[/font]

Finally, I have tried setting the compatibility level on the database to 90; the same "Cannot set XACT ABORT to OFF" error occurs, but the compatibility level [i]is successfully changed[/i].

I have no idea what is causing DDL statements to fail server-wide - I didn't even think that was possible. Any suggestions would be greatly appreciated, as I am not looking forward to explaining that the development server is completely FUBARed and that I have no idea why :)

PS Dropping every database (if we could drop databases at all) is certainly an option, but reinstalling SQL Server likely is not.

Thanks,
DJ
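One thing worth checking (a guess, not a confirmed diagnosis): a server-scoped or database-scoped DDL trigger fires on every CREATE/ALTER/DROP, and an error raised inside one produces exactly this batch-abort pattern across all databases. A quick inventory query:

[code="sql"]-- Server-scoped DDL triggers (fire for DDL in every database)
SELECT name, is_disabled FROM sys.server_triggers;

-- Database-scoped DDL triggers in the current database
SELECT name, is_disabled FROM sys.triggers WHERE parent_class_desc = 'DATABASE';

-- If a suspect trigger turns up, it can be disabled for testing, e.g.:
-- DISABLE TRIGGER [TriggerName] ON ALL SERVER;[/code]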
↧
Select JSON
Hi,

In my table I have a column that holds JSON. How can I query the values out of it, given that I am using Microsoft SQL Server 2008 R2 (RTM) - 10.50.1617.0 (X64)?

Thanks
↧
SS2K8 not compatible with windows 10
Just got a new PC with Windows 10 and cannot install my SQL Server 2008. Is it another Microsoft joke? Please, help!
↧
Bulk Insert without quotes
Hi,

I want to insert a csv file into my SQL Server table, without quotes. I am using this command:

[code="sql"]BULK INSERT [GFA_BG].[dbo].[StagingBG]
FROM '...\AIMS_20160709.csv'
WITH
(
    FIELDTERMINATOR = '","',
    ROWTERMINATOR = '',
    FIRSTROW = 2,
    DATAFILETYPE = 'char',
    FORMATFILE = 'AIMS_20160709.fmt',
    ERRORFILE = '...\errorlog.log'
)
GO[/code]

This is the format file:

[code="other"]10.0
25
1  SQLCHAR 0 0  "\""     0  FIRST_QUOTE               SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR 0 12 "\",\""  1  I_CODE                    SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR 0 64 "\",\""  2  LEGAL_NAME                SQL_Latin1_General_CP1_CI_AS
4  SQLCHAR 0 64 "\",\""  3  TRADING_NAME              SQL_Latin1_General_CP1_CI_AS
5  SQLCHAR 0 64 "\",\""  4  COUNTRY                   SQL_Latin1_General_CP1_CI_AS
6  SQLCHAR 0 3  "\",\""  5  CURRENCY                  SQL_Latin1_General_CP1_CI_AS
7  SQLCHAR 0 3  "\",\""  6  LANGUAGE                  SQL_Latin1_General_CP1_CI_AS
8  SQLCHAR 0 1  "\",\""  7  STATUS                    SQL_Latin1_General_CP1_CI_AS
9  SQLCHAR 0 64 "\",\""  8  BANK_NAME                 SQL_Latin1_General_CP1_CI_AS
10 SQLCHAR 0 16 "\",\""  9  BANK_GUARANTEE_AMOUNT     SQL_Latin1_General_CP1_CI_AS
11 SQLCHAR 0 3  "\",\""  10 BANK_GUARANTEE_CURRENCY   SQL_Latin1_General_CP1_CI_AS
12 SQLDATE 0 3  "\",\""  11 BANK_GUARANTEE_EXPIRY_DATE ""
13 SQLDATE 0 3  "\",\""  12 ACCREDITATION_DATE        ""
14 SQLCHAR 0 1  "\",\""  13 CLASS_PAX_OR_CGO          SQL_Latin1_General_CP1_CI_AS
15 SQLCHAR 0 2  "\",\""  14 LOCATION_TYPE             SQL_Latin1_General_CP1_CI_AS
16 SQLCHAR 0 12 "\",\""  15 XREF                      SQL_Latin1_General_CP1_CI_AS
17 SQLINT  0 4  "\",\""  16 IRRS                      ""
18 SQLCHAR 0 16 "\",\""  17 TAX_CODE                  SQL_Latin1_General_CP1_CI_AS
19 SQLCHAR 0 2  "\",\""  18 COUNTRY_CODE              SQL_Latin1_General_CP1_CI_AS
20 SQLCHAR 0 64 "\",\""  19 CITY                      SQL_Latin1_General_CP1_CI_AS
21 SQLINT  0 4  "\",\""  20 DEF                       ""
22 SQLCHAR 0 1  "\",\""  21 OWN_SHARE_CHANGE          SQL_Latin1_General_CP1_CI_AS
23 SQLDATE 0 3  "\",\""  22 OWN_SHARE_LAST_DATE       ""
24 SQLCHAR 0 50 "\",\""  23 CHO_CHI                   SQL_Latin1_General_CP1_CI_AS
25 SQLCHAR 0 50 "\"\r"   24 DEF_NONPAYMENT            SQL_Latin1_General_CP1_CI_AS[/code]

and this is the .csv file:
[code="other"]"I_CODE","LEGAL_NAME","TRADING_NAME","COUNTRY","CURRENCY","LANGUAGE","STATUS","BANK_NAME","BANK_GUARANTEE_AMOUNT","BANK_GUARANTEE_CURRENCY","BANK_GUARANTEE_EXPIRY_DATE","ACCREDITATION_DATE","CLASS_PAX_OR_CGO","LOCATION_TYPE","XREF","IRRS","TAX_CODE","CITY","ISO_CTRY_CODE","DEF","OWN/SHARE CHANGE","OWN/SHARE LAST DATE","CHO_CHI","DEF_NONPAYMENT"
"97500023","CARIBBEAN WORLD ","GOING PLACES","ANTIGUA AND BARBUDA","XCD","ENG",9,"",,"","","19-OCT-50","P","BR","98500010","0","","ST. JOHN'S","AG","0","","","",""[/code]

I get these error messages:

[quote]Msg 4863, Level 16, State 1, Line 2
Bulk load data conversion error (truncation) for row 2, column 7 (LANGUAGE).
Msg 4832, Level 16, State 1, Line 2
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 2
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 2
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".[/quote]

and the log file:

[quote]Row 2 File Offset 351 ErrorFile Offset 0 - HRESULT 0x80004005[/quote]

Please help!
↧
Maintenance Plans - Quick Question
Hi, re: Maintenance Plans

Do MS recommend maintenance plans on a DB? Just wondered whether they recommend some kind of minimum, etc. We have several DBs where we don't have any maintenance plans. How easy is it to check whether a table is above the 30% threshold (I think that is the threshold - correct me if I'm wrong) for a reorg?

Thanks for your help in advance.
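For what it's worth, a minimal sketch of the usual fragmentation check (the commonly cited guidance is reorganize between roughly 5% and 30% fragmentation, rebuild above 30%); run it in the database of interest:

[code="sql"]-- Indexes in the current database above the 30% fragmentation threshold
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30
ORDER BY ips.avg_fragmentation_in_percent DESC;[/code]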
↧
Log backups
Dear all,

It seems that I need to change my log backups from every hour to every half hour, as the logs are getting bigger. Could this have any negative impact in terms of performance - locking, blocking, etc.?

Thanks a lot
↧
Log shipping between 2008 and 2012
Hi,

I have been looking at log shipping for the first time to fulfil a warm reporting solution, and would like to implement this between 2008 and 2012, which the article below implies is possible:

http://www.sqlservercentral.com/Forums/Topic1499395-2799-1.aspx

Having followed the above steps, it 'nearly' seems to work, but the log files are skipped/not applied, as per the messages below:

Skipped log backup file. Secondary DB: 'TestLS2', File: 'G:\Log Shipping\CON-POL-SAGE-01_TestLS\TestLS2_________.trn'
Could not find a log backup file that could be applied to secondary database 'TestLS2'.

Any ideas if this can be achieved?

Thanks in advance
Laurence
↧
A Group by with a where clause ??
Hi

I am using the code below to group by client and score. I need to not count rows where the field dbo.[BPRS Scores].Score = 33.

Thanks in advance

[code="sql"]SELECT TOP (100) PERCENT
    SUM(dbo.[BPRS Scores].Score) - COUNT(dbo.[BPRS Scores].Score) AS Expr1,
    dbo.e_ClientCases.Client,
    dbo.BPRS.ObjectID,
    dbo.e_CalendarEvents.StartTime,
    ROW_NUMBER() OVER (PARTITION BY edata.dbo.e_ClientCases.Client ORDER BY dbo.e_CalendarEvents.StartTime)
FROM dbo.BPRS
INNER JOIN dbo.[BPRS Scores] ON dbo.BPRS.ObjectID = dbo.[BPRS Scores].Parent
INNER JOIN dbo.e_ObjectMetadata ON dbo.BPRS.ObjectID = dbo.e_ObjectMetadata.ObjectID
INNER JOIN dbo.e_ClientCases ON dbo.e_ObjectMetadata.[Case] = dbo.e_ClientCases.ObjectID
LEFT OUTER JOIN dbo.e_CalendarEvents ON dbo.BPRS.ObjectID = dbo.e_CalendarEvents.ParentObject
GROUP BY dbo.BPRS.ObjectID, dbo.e_ClientCases.Client, dbo.e_CalendarEvents.StartTime
ORDER BY dbo.e_ClientCases.Client[/code]
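A hedged sketch of one way to do this, assuming the intent is to drop the Score = 33 rows entirely before aggregating (table and column names are taken from the post; aliases are added for readability):

[code="sql"]-- Option 1: filter the rows out before grouping
SELECT SUM(s.Score) - COUNT(s.Score) AS Expr1,
       c.Client, b.ObjectID, e.StartTime
FROM dbo.BPRS AS b
INNER JOIN dbo.[BPRS Scores] AS s ON b.ObjectID = s.Parent
INNER JOIN dbo.e_ObjectMetadata AS m ON b.ObjectID = m.ObjectID
INNER JOIN dbo.e_ClientCases AS c ON m.[Case] = c.ObjectID
LEFT OUTER JOIN dbo.e_CalendarEvents AS e ON b.ObjectID = e.ParentObject
WHERE s.Score <> 33                  -- rows scoring 33 are ignored entirely
GROUP BY b.ObjectID, c.Client, e.StartTime
ORDER BY c.Client;

-- Option 2: keep the rows but skip the 33s in the aggregates only
--   SUM(CASE WHEN s.Score <> 33 THEN s.Score END)
-- - COUNT(CASE WHEN s.Score <> 33 THEN s.Score END)[/code]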
↧
bulk insert date not working
Hi,

I want to insert rows into a table from a csv file. This is my csv file:

[quote]"I_CODE",LEGAL_NAME,"TRADING_NAME","COUNTRY","CURRENCY","LANGUAGE","STATUS","BANK_NAME","IRRS","BANK_GUARANTEE_DATE","BANK_GUARANTEE_AMOUNT"
"1001",TV,"Sony","ALLEMANGNE","USD","ENG","9","BANKNAME1","98","19-jan-78","98USD"[/quote]

and this is my format file:

[quote]10.0
12
1  SQLCHAR 0 0  "\""     0  FIRST_QUOTE           SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR 0 12 "\","    1  I_CODE                SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR 0 64 ",\""    2  LEGAL_NAME            SQL_Latin1_General_CP1_CI_AS
4  SQLCHAR 0 64 "\",\""  3  TRADING_NAME          SQL_Latin1_General_CP1_CI_AS
5  SQLCHAR 0 64 "\",\""  4  COUNTRY               SQL_Latin1_General_CP1_CI_AS
6  SQLCHAR 0 3  "\",\""  5  CURRENCY              SQL_Latin1_General_CP1_CI_AS
7  SQLCHAR 0 3  "\",\""  6  LANGUAGE              SQL_Latin1_General_CP1_CI_AS
8  SQLCHAR 0 1  "\",\""  7  STATUS                SQL_Latin1_General_CP1_CI_AS
9  SQLCHAR 0 64 "\",\""  8  BANK_NAME             SQL_Latin1_General_CP1_CI_AS
10 SQLINT  0 4  "\",\""  9  IRRS                  ""
11 SQLDATE 0 4  "\",\""  10 BANK_GUARANTEE_DATE   ""
12 SQLCHAR 0 50 "\r"     11 BANK_GUARANTEE_AMOUNT SQL_Latin1_General_CP1_CI_AS[/quote]

SQLDATE is not working: my input date is 19-jan-78, but I want it as 1978-01-19. Is there a way to do it?

PS: when I put just SQLINT (and remove the date), my value "98" becomes "14393" in the table.

This is my code:

[quote]BULK INSERT [StagingBG]
FROM '...\TEST\test1.csv'
WITH
(
    FIRSTROW = 2,
    FORMATFILE = '...\TEST\test1.fmt'
);[/quote]

What am I missing in my configuration? Help!
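As an aside that may explain both symptoms (a guess, not tested against this file): in a character-mode data file, every field should normally be declared SQLCHAR. SQLINT and SQLDATE tell the bulk loader to read raw binary bytes, which is why the text "98" (bytes 0x39 0x38) arrives as the little-endian integer 14393. A common workaround is to load everything as text into a staging table and convert afterwards (the staging table name here is hypothetical):

[code="sql"]-- Load the raw text into an all-varchar staging table, then convert.
-- (Assumes the format file declares every field as SQLCHAR.)
BULK INSERT [StagingBG_Raw]
FROM '...\TEST\test1.csv'
WITH (FIRSTROW = 2, FORMATFILE = '...\TEST\test1.fmt');

-- '19-jan-78' parses as dd-mon-yy under us_english language settings;
-- DATE stores it internally, and it displays as 1978-01-19.
INSERT INTO [StagingBG] (I_CODE, IRRS, BANK_GUARANTEE_DATE /* , ... */)
SELECT I_CODE,
       CAST(IRRS AS INT),
       CONVERT(DATE, BANK_GUARANTEE_DATE)
FROM [StagingBG_Raw];[/code]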
↧
Question on Connect Timeout in Connection String
In the config file of a .NET application I have this key used to store a connection string:

<add key="ConnectionString" value="server=*********;Trusted_Connection=YES;database=**********;Integrated Security=true;Connect Timeout=0"/>

I'm under the impression that 'Connect Timeout=0' means 'take as long as you need to connect'.

If I publish that web site to a Windows 2008 R2 (web) server running IIS 7.5, which connects to a separate server running SQL Server 2008, the web site stops responding whenever more than 2 people are browsing it. If I open the web site in a browser, move to the desk next to mine and open it on someone else's PC, it works fine. If I move to a third PC and try to open the web site in a browser, the browser just hangs - nothing happens. There are no error messages and nothing is raised in the server event logs.

I know absolutely, by a process of elimination, that it is the connection to the database that is causing the problem. If I remove all code on my home page that retrieves data from the database, the site works perfectly with any number of people able to browse it. If I change it to 'Connect Timeout=15', it seems to fix the problem (so far, at least).

So, my question is: does 'Connect Timeout=0' mean 'take as long as you like to connect to the database' or not? What else can it mean - you surely can't expect a connection to be instantaneous?
↧
Urgent!! Need to write a SQL script
I have the following SQL query:

[code="sql"]DECLARE @naiccode INT = 12345;

SELECT Policy.policyType, Policy.policyNum, Policy.effectiveDate,
       '12345' AS "NAIC Code",
       Insured.lname1, Insured.middle1, Insured.fname1,
       Insured.address1, Insured.city, Insured.state, Insured.zip
FROM Policy
FULL JOIN Insured ON Insured.insuredID = Policy.insuredID
WHERE Policy.policyType = 1;[/code]

I want to write the results of this query to a fixed-width .txt file. The gaps between the fields should be blank spaces. The file structure is as follows:

Field 1: Policy Type - length 2, begin 1, end 2, alphanumeric
Field 2: NAIC - length 5, begin 3, end 7, numeric
Field 3: Policy number - length 30, begin 8, end 37
Field 4: Date - length 8, begin 38, end 45, numeric
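A hedged sketch of the padding step (column names are from the post; the date layout is an assumption - yyyymmdd fits an 8-character numeric field). Each record is built as one fixed-width string, which can then be exported with bcp:

[code="sql"]DECLARE @naiccode INT = 12345;

SELECT LEFT(CAST(p.policyType AS VARCHAR(2)) + SPACE(2), 2)   -- pos 1-2,  space-padded
     + RIGHT('00000' + CAST(@naiccode AS VARCHAR(5)), 5)      -- pos 3-7,  zero-padded
     + LEFT(p.policyNum + SPACE(30), 30)                      -- pos 8-37, space-padded
     + CONVERT(CHAR(8), p.effectiveDate, 112)                 -- pos 38-45, yyyymmdd
       AS fixed_width_record
FROM Policy AS p
WHERE p.policyType = 1;

-- The single-column result can be written to a .txt file with bcp, e.g.:
-- bcp "SELECT ..." queryout output.txt -c -S servername -T[/code]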
↧
Run out dates
I have a large (440k rows) pivoted csv report generated by a third party, with fields Item, Location, and then forward-projecting dates with quantities by week, in the format:

Item,Location,[w2016 09 12],[w2016 09 19],[w2016 09 26],[w2016 10 03],[w2016 10 10]... etc

so the data looks like:

[code="other"]Item      Location w2016 09 12 w2016 09 19 w2016 09 26 w2016 10 03 w2016 10 10 w2016 10 17 w2016 10 24
162386-00 CHB      263         203         143         83          23          0           0
162386-00 GRR      44          44          44          44          44          44          44
162386-00 MAR      0           0           0           0           0           0           0
162387-00 CHB      83          83          83          83          83          83          83
162387-00 GRR      22          12          2           0           0           0           0
162387-00 MAR      0           0           0           0           0           0           0
162388-00 CHB      49          44          39          34          29          24          19
162388-00 GRR      30          20          10          0           0           0           0
162388-00 MAR      0           0           0           0           0           0           0
162389-00 CHB      90          80          70          60          50          40          30[/code]

[b]What I need to do is find the latest date where the qty goes <= 0 for each item/location.[/b]

[i]What I have so far:[/i]

[quote]SELECT Item, [Location], [Weeks], min([Week])
FROM
(
    select Item, [Location],
        [w2016 09 12],[w2016 09 19],[w2016 09 26],[w2016 10 03],[w2016 10 10],[w2016 10 17],[w2016 10 24],[w2016 10 31],
        [w2016 11 07],[w2016 11 14],[w2016 11 21],[w2016 11 28],[w2016 12 05],[w2016 12 12],[w2016 12 19],[w2016 12 26],
        [w2017 01 02],[w2017 01 09],[w2017 01 16],[w2017 01 23],[w2017 01 30],[w2017 02 06],[w2017 02 13],[w2017 02 20],
        [w2017 02 27],[w2017 03 06],[w2017 03 13],[w2017 03 20],[w2017 03 27],[w2017 04 03],[w2017 04 10],[w2017 04 17],
        [w2017 04 24],[w2017 05 01],[w2017 05 08],[w2017 05 15],[w2017 05 22],[w2017 05 29],[w2017 06 05],[w2017 06 12],
        [w2017 06 19],[w2017 06 26],[w2017 07 03],[w2017 07 10],[w2017 07 17],[w2017 07 24],[w2017 07 31],[w2017 08 07],
        [w2017 08 14],[w2017 08 21],[w2017 08 28],[w2017 09 04],[w2017 09 11],[w2017 09 18],[w2017 09 25],[w2017 10 02],
        [w2017 10 09],[w2017 10 16],[w2017 10 23],[w2017 10 30],[w2017 11 06],[w2017 11 13],[w2017 11 20],[w2017 11 27],
        [w2017 12 04],[w2017 12 11],[w2017 12 18],[w2017 12 25],[w2018 01 01],[w2018 01 08],[w2018 01 15],[w2018 01 22],
        [w2018 01 29],[w2018 02 05],[w2018 02 12],[w2018 02 19]
    FROM ProjectedActualAvailable
) as PAA
UNPIVOT
(
    [WEEK] for [Weeks] IN
    (
        [w2016 09 12],[w2016 09 19],[w2016 09 26],[w2016 10 03],[w2016 10 10],[w2016 10 17],[w2016 10 24],[w2016 10 31],
        [w2016 11 07],[w2016 11 14],[w2016 11 21],[w2016 11 28],[w2016 12 05],[w2016 12 12],[w2016 12 19],[w2016 12 26],
        [w2017 01 02],[w2017 01 09],[w2017 01 16],[w2017 01 23],[w2017 01 30],[w2017 02 06],[w2017 02 13],[w2017 02 20],
        [w2017 02 27],[w2017 03 06],[w2017 03 13],[w2017 03 20],[w2017 03 27],[w2017 04 03],[w2017 04 10],[w2017 04 17],
        [w2017 04 24],[w2017 05 01],[w2017 05 08],[w2017 05 15],[w2017 05 22],[w2017 05 29],[w2017 06 05],[w2017 06 12],
        [w2017 06 19],[w2017 06 26],[w2017 07 03],[w2017 07 10],[w2017 07 17],[w2017 07 24],[w2017 07 31],[w2017 08 07],
        [w2017 08 14],[w2017 08 21],[w2017 08 28],[w2017 09 04],[w2017 09 11],[w2017 09 18],[w2017 09 25],[w2017 10 02],
        [w2017 10 09],[w2017 10 16],[w2017 10 23],[w2017 10 30],[w2017 11 06],[w2017 11 13],[w2017 11 20],[w2017 11 27],
        [w2017 12 04],[w2017 12 11],[w2017 12 18],[w2017 12 25],[w2018 01 01],[w2018 01 08],[w2018 01 15],[w2018 01 22],
        [w2018 01 29],[w2018 02 05],[w2018 02 12],[w2018 02 19]
    )
) as up
group by Item, [Location]
order by Item, [Location], [Weeks][/quote]
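A hedged sketch of the aggregation step (reading "run-out date" as the first week at or below zero; swap MIN for MAX if the last such week is wanted instead). The week column lists are elided here for brevity - they are the same full lists as in the query above:

[code="sql"]SELECT Item, [Location], MIN([Weeks]) AS RunOutWeek
FROM
(
    SELECT Item, [Location],
        [w2016 09 12], [w2016 09 19] /* , ... all week columns ... */
    FROM ProjectedActualAvailable
) AS PAA
UNPIVOT
(
    [WEEK] FOR [Weeks] IN ([w2016 09 12], [w2016 09 19] /* , ... all week columns ... */)
) AS up
WHERE [WEEK] <= 0                   -- keep only weeks at or below zero
GROUP BY Item, [Location]           -- one run-out week per item/location
ORDER BY Item, [Location];[/code]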
↧
Transform transactions into report without using cursors
I have a table that contains a list of holders with the amount held at the end of the day. The table therefore contains at most one entry per holder per day. I want to produce a report that would display each holder with a start date and an end date. You can test with:

[code="sql"]DECLARE @Holdings TABLE (EffectiveDate DATETIME, Holder VARCHAR(10), Amount INT)

INSERT INTO @Holdings (EffectiveDate, Holder, Amount) VALUES
( '2001-01-01', 'John', 10), --John becomes a holder
( '2001-01-02', 'John', 15), --John is still a holder (no change)
( '2001-01-02', 'Jane', 10), --Jane becomes a holder
( '2001-01-03', 'John', 0 ), --John ceases to be a holder
( '2001-01-04', 'John', 10), --John becomes a holder
( '2001-01-05', 'Jane', 0 )  --Jane ceases to be a holder

DECLARE @Report TABLE (Holder VARCHAR(10), StartDate DATETIME, EndDate DATETIME)
DECLARE @EffectiveDate DATETIME, @Holder VARCHAR(10), @Amount INT

DECLARE cur CURSOR FOR
    SELECT EffectiveDate, Holder, Amount FROM @Holdings ORDER BY EffectiveDate
OPEN cur
FETCH cur INTO @EffectiveDate, @Holder, @Amount
WHILE @@FETCH_STATUS = 0
BEGIN
    IF @Amount <> 0 AND NOT EXISTS (SELECT NULL FROM @Report WHERE Holder = @Holder AND EndDate IS NULL)
        INSERT INTO @Report SELECT @Holder, @EffectiveDate, NULL
    IF @Amount = 0
        UPDATE @Report SET EndDate = @EffectiveDate WHERE Holder = @Holder AND EndDate IS NULL
    FETCH cur INTO @EffectiveDate, @Holder, @Amount
END
CLOSE cur
DEALLOCATE cur

SELECT * FROM @Report ORDER BY StartDate, Holder[/code]

and the result is:

[code="other"]John 2001-01-01 2001-01-03
Jane 2001-01-02 2001-01-05
John 2001-01-04[/code]

Does someone have a brilliant idea so the same result would be efficiently achieved without the use of cursors?
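One possible set-based rewrite (a sketch against the sample table above; checked against the sample data but not beyond it). A holding period starts at a non-zero row whose previous row for the same holder is absent or zero, and ends at the first zero row after the start. Correlated subqueries are used instead of LAG so it also runs on pre-2012 versions:

[code="sql"];WITH Marked AS (
    SELECT EffectiveDate, Holder, Amount,
           -- amount on the holder's previous row, NULL for the first row
           LagAmount = (SELECT TOP (1) h2.Amount
                        FROM @Holdings AS h2
                        WHERE h2.Holder = h.Holder
                          AND h2.EffectiveDate < h.EffectiveDate
                        ORDER BY h2.EffectiveDate DESC)
    FROM @Holdings AS h
),
Starts AS (
    -- a period starts where a non-zero amount follows "no holding"
    SELECT Holder, StartDate = EffectiveDate
    FROM Marked
    WHERE Amount <> 0 AND ISNULL(LagAmount, 0) = 0
)
SELECT s.Holder, s.StartDate,
       -- the period ends at the first zero row after the start (NULL if still open)
       EndDate = (SELECT MIN(h.EffectiveDate)
                  FROM @Holdings AS h
                  WHERE h.Holder = s.Holder
                    AND h.Amount = 0
                    AND h.EffectiveDate > s.StartDate)
FROM Starts AS s
ORDER BY s.StartDate, s.Holder;[/code]

Against the sample data, this returns the same three rows as the cursor version.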
↧
Asynchronous SQL queries?
I'm working on a sproc that needs to run several independent SELECT statements. So let's say I have 3 SELECT statements like this:

[code="sql"]select * into #x from x
select * into #y from y
select * into #z from z[/code]

Let's say each SELECT takes 2s to complete. If I run the sproc as shown above, the total query time would be 6s. However, if I were able to run the queries asynchronously/simultaneously, then I could exec the sproc in 2s total. Is there a way to do something like this in T-SQL?

[code="sql"]select * into #x from x async
select * into #y from y async
select * into #z from z async[/code]
↧
is there a way to select column data for columns with an ordinal position < a particular ordinal position?
Is there a way to select column data for columns with an ordinal position < a particular ordinal position? I'm considering using dynamic SQL to return data from 10 different tables, but I want to exclude the last 4 columns (created/updated) from the SELECT statement. So maybe something like this (pseudocode - getordinal and getcolumn are imaginary):

[code="sql"]declare @maxordinal int
select @maxordinal = getordinal(createdate) - 1 from mytable

select * from mytable where getordinal(getcolumn) < @maxordinal[/code]

Is there a way to do this, or an alternative way to accomplish what I'm trying to do here?
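There is no getordinal() function in T-SQL, but ordinal positions are exposed in the catalog views. A hedged sketch that builds a dynamic SELECT excluding the last 4 columns (dbo.mytable is a placeholder; note column_id can have gaps after column drops, in which case INFORMATION_SCHEMA.COLUMNS.ORDINAL_POSITION is an alternative):

[code="sql"]DECLARE @cols NVARCHAR(MAX), @sql NVARCHAR(MAX);

-- Comma-separated list of every column except the 4 highest ordinals
SELECT @cols = STUFF((
    SELECT ', ' + QUOTENAME(c.name)
    FROM sys.columns AS c
    WHERE c.object_id = OBJECT_ID('dbo.mytable')
      AND c.column_id <= (SELECT MAX(column_id) - 4
                          FROM sys.columns
                          WHERE object_id = OBJECT_ID('dbo.mytable'))
    ORDER BY c.column_id
    FOR XML PATH('')), 1, 2, '');

SET @sql = N'SELECT ' + @cols + N' FROM dbo.mytable;';
EXEC sys.sp_executesql @sql;[/code]

The same pattern can be looped over the 10 tables by parameterising the table name through OBJECT_ID().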
↧