Sybase ASEBulkCopy is not working - sybase-ase

Sybase ASEBulkCopy is not working.
I have set the EnableBulkLoad attribute to 1 in the connection string and the batch size to 500, yet it still uploads only 1 record at a time.
What other settings am I missing?
Please, someone help me with this.
Thanks in advance.

Whether bulk load actually happens depends on other things as well, such as the presence of indexes on the target table. By enabling bulk load you're basically telling the ASE server that it should try to do bulk uploading if it can -- but if it cannot, it silently falls back to regular, non-bulk inserts.
I'm not sure I understand the details of your question though. What do you mean by "upload"? Does your client app send only 1 record to the ASE server at a time?
Or does it mean that ASE performs regular inserts instead of bulk inserts? If the latter, how did you diagnose that?
I recommend trying it first with the 'bcp' client utility to figure out if bulk loading is possible to start with.


Locking a SQL Server Database with PHP

I want extra security for a particular point in my web app, so I want to lock the database (SQL Server 2005). Any suggestions, or is this even necessary with SQL Server?
Edit on question:
The query is failing silently with no error messages logged, and it does not occur inside a transaction.
Final Solution:
I never was able to solve the problem; however, what I wound up doing was switching to MySQL and using a transaction-level query here. This was not the main or even a primary reason to switch, but I had been having problems with SQL Server, and the switch allowed me to have our CMS and various other tools all running on the same database. Previously we had a SQL Server and a MySQL database running to run our site. The port was a bit time-consuming, but in the long run I feel it will work much better for the site and the business.
I suppose you have three options.
Set user permissions so that user x can only read from the database.
Set the database into single user mode so only one connection can access it
sp_dboption 'myDataBaseName', 'single user', 'true'
Set the database to read-only
sp_dboption 'myDataBaseName', 'read only', 'true'
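If you want to flip one of these switches from PHP itself rather than from a management tool, here is a minimal sketch. It assumes the pdo_sqlsrv driver is installed and a login with ALTER DATABASE permission; the server name and credentials are placeholders, and ALTER DATABASE is used because sp_dboption is already deprecated on SQL Server 2005.
<?php
// Minimal sketch: put the database into read-only mode from PHP (placeholder connection details).
$pdo = new PDO('sqlsrv:Server=localhost;Database=master', 'sa', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Equivalent of the sp_dboption call above, in the non-deprecated form.
$pdo->exec("ALTER DATABASE myDataBaseName SET READ_ONLY WITH ROLLBACK IMMEDIATE");

// ...and back to read-write once the sensitive operation is finished.
$pdo->exec("ALTER DATABASE myDataBaseName SET READ_WRITE WITH ROLLBACK IMMEDIATE");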

Access - Prevent Database Size Growth

I am using an MS Access 2013 light application that was developed by a third party. I did not do the coding/design/management of the project, but I am responsible for implementation for my team. I also do not have the option of switching to another solution, but I do have access to the VBA code, so I can make tweaks to clean up their mess.
My problem is this:
Set up application with my data (a-ok).
Run the built-in, fairly complex third-party macro.
For most cases, things are just fine... but when running it on a larger dataset the file size of the Access file exceeds 2 GB and the entire operation fails.
On fail, the process has to be restarted. For the same data set, it fails each and every time it reaches approximately 55% complete.
I am unable to complete my work because of this. :|
Solutions tried:
Compact and repair - Fine when it fully executes, but the issue is that it reaches 2GB while the macro is running and cannot be interrupted.
Splitting the database - Splits OK, but doesn't fix issue.
Attempting to trigger a compact and repair inside the macro during the loop - Fails because Access cannot lock the database.
Desired solution:
A way to prevent the file growth/bloat while the macro is running. Either through a compartmentalization of the process or through some other wizardry I am unaware of at this time.
A solution that does not require extensive reconfiguration of the underlying code. I can deal with inefficient, so long as I can fix this issue for this one instance (1 critical error in 44 runs of different data in the database).
Any help?
I would recommend Compact on Close as an easy, dirty solution:
On the File tab, click Options.
In the Access Options dialog box, click Current Database.
Under Application Options, select the Compact on Close check box.
ADVANCED SOLUTION
The other solution requires splitting the database.
After splitting you have another option: use the front end with a SQL server (check which version is suitable for you; I think the free version is enough to start with if you don't expect a huge amount of data).
Split the database
Install a SQL server (MySQL or SQL Server Express Edition)
Create all tables on the SQL server
Link the front end to the SQL server
I think davejal pretty much nailed this one.
If you have a handful of really large tables, you can put those into another Access DB, and make a link to those.
https://support.office.com/en-us/article/Import-or-link-to-data-in-another-Access-database-095ab408-89c7-45b3-aac2-58036e45fcf6
The 2GB limit is per DB.
Or, upgrade to SQL Server Express for free, and use Access as a front end to that SQL Server backend.
SQL-Server Backend, MS Access Frontend: Connection
Here's a link to get SQL Server Express.
https://www.microsoft.com/en-us/download/details.aspx?id=42299

PHP Database update slowing down connection from embedded device

I have an embedded system, i.e. basically an ATMEGA-based microcontroller with a GSM module. The GSM module uses the GPRS connection of the SIM to send a GET request to a webpage on my server. In simpler words, it is the same as a person opening that webpage from his mobile device.
When that webpage is opened, nothing special happens: I just extract the GET parameters and update the database. Now the problem comes. The database is hosted on a GoDaddy server, and when I send that update request from the GSM device, it hangs for 4-5 seconds. Is there any other way by which I can update the database and save time?
Moreover, I would like to know, for an online database, which takes more time:
* initiating a database connection, or
* using an UPDATE query to update the table?
There are a lot of things going on here and you may have issues in many places. A bit too little info to solve the issue but here are some possible places to look.
Obviously, you have the issue of network latency and general response time from your web/database server(s) on GoDaddy. My first question would be: how does the response to a request from the MC compare to the same GET call made from a web browser?
To specifically answer your question - initiating a database connection is usually the most costly part of the transaction. I am not sure what you are using on the database side so I cannot point you to specific resources. I am guessing MySQL? If so take a peek at https://dba.stackexchange.com/questions/16969/how-costly-is-opening-and-closing-of-a-db-connection for suggestions. On my own database I tend to tune them for performance. On GoDaddy you may be quite limited in what you can do.
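Since the connection setup is usually the expensive part, one thing worth benchmarking is a persistent connection, so each GET from the device does not pay the full handshake cost again. A rough sketch with PDO; the host, credentials, and the readings/sensor names are placeholders, and shared GoDaddy hosting may limit how much this helps:
<?php
// Sketch: reuse the MySQL connection between requests instead of opening a new one each time.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'secret', [
    PDO::ATTR_PERSISTENT => true,               // keep the connection alive across requests
    PDO::ATTR_ERRMODE    => PDO::ERRMODE_EXCEPTION,
]);

// Prepared statement: the GET parameters are bound, never concatenated into the SQL.
$stmt = $pdo->prepare('UPDATE readings SET value = :value WHERE sensor = :sensor');
$stmt->execute([':value' => $_GET['value'], ':sensor' => $_GET['sensor']]);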
However, I am going to qualify what I said above a little bit. Generally an update to a database should not be that slow. We could be dealing with poor database design or very large tables that have indexes to update as well. Again, something to think about in your particular case. The other item to note is that you may be doing updates as shown below:
update myTable set myField = 1 where somesensor = 'a';
update myTable set myField = 1 where somesensor = 'b';
update myTable set myField = 1 where somesensor = 'c';
.....
and depending on the number of updates you are doing, how you are making the connection, etc., and the rest of your particular situation, that can add up. If you are using MySQL, take a look at this example, How to bulk update mysql data with one query?, for possible ideas. Benchmark this!
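For instance, if those repeated statements really do set the same value, they can often be collapsed into a single statement and one round trip. A rough sketch reusing the placeholder table and column names from above (connection details are made up):
<?php
// Sketch: collapse several single-row updates into one statement (one round trip to the server).
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'secret',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

$sensors = ['a', 'b', 'c'];
$in = implode(',', array_fill(0, count($sensors), '?'));   // "?,?,?"

// Same effect as the three separate UPDATE statements above.
$stmt = $pdo->prepare("UPDATE myTable SET myField = 1 WHERE somesensor IN ($in)");
$stmt->execute($sensors);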
I would suggest looking at an EXPLAIN plan to see what is happening and whether you can identify where the problem is (check your version of MySQL). See http://dev.mysql.com/doc/refman/5.7/en/explain.html for syntax, etc.
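As a quick sketch, you can run EXPLAIN on the equivalent SELECT (MySQL 5.6+ can also EXPLAIN the UPDATE itself) and look at the plan; the table and connection details are again the placeholders from above:
<?php
// Sketch: check whether the WHERE clause can use an index.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'secret',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

$plan = $pdo->query("EXPLAIN SELECT * FROM myTable WHERE somesensor = 'a'")
            ->fetchAll(PDO::FETCH_ASSOC);
print_r($plan);   // no 'key' plus a large 'rows' estimate usually means a full table scan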
There really is not enough info to say exactly but maybe this will give you some ideas. Good luck!

Aborting a Select Query if it Takes Too long

I have a web application written in PHP.
One function of this application is a document archive, which is a MySQL database on another server. This archive server is pretty unreliable performance-wise, but it is not under my control. The archive server often has long table locks, which results in getting a connection but not getting any data.
This often leads to open MySQL connections which saturate the resources of the web-application server. As a result, the whole web application becomes slow/inaccessible.
I would like to decouple the two systems.
I thought the logical way would be for my PHP application to abort a SELECT query if it takes longer than 1 or 2 seconds, free up the resources, and present the user with a message that the remote system is not responding in time.
But how is it best to implement such a solution?
UPDATE: the set_time_limit() option looks promising, but it is not fully satisfying, as I am not able to present the user with a message; at least it might help to prevent the saturation of the resources.
I think you should use the maximum-execution-limit functionality provided by PHP and MySQL.
You can set a timeout on the MySQL side.
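For example, a minimal sketch assuming MySQL 5.7.8 or newer, where the max_execution_time session variable (in milliseconds) caps SELECT statements; the connection details and the query are placeholders:
<?php
// Sketch: let MySQL itself abort any SELECT in this session that runs longer than 2 seconds.
$pdo = new PDO('mysql:host=archive-server;dbname=archive', 'user', 'secret',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

$pdo->exec('SET SESSION max_execution_time = 2000');   // milliseconds, MySQL >= 5.7.8 only

try {
    $docs = $pdo->query('SELECT * FROM documents WHERE owner_id = 42')->fetchAll();
} catch (PDOException $e) {
    // The query was interrupted (or failed for another reason), so tell the user.
    echo 'The archive server did not respond in time.';
}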
Or you can set a timeout in the PHP code.
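A sketch of the code-side variant with mysqli; MYSQLI_OPT_READ_TIMEOUT needs PHP 7.2 or newer, and the host, credentials, and table are placeholders:
<?php
// Sketch: give up on the archive server from the PHP side after a couple of seconds.
mysqli_report(MYSQLI_REPORT_OFF);             // keep the classic "false on error" behaviour

$db = mysqli_init();
$db->options(MYSQLI_OPT_CONNECT_TIMEOUT, 2);  // seconds to wait for the connection
$db->options(MYSQLI_OPT_READ_TIMEOUT, 2);     // seconds to wait for query results (PHP >= 7.2)

if (!$db->real_connect('archive-server', 'user', 'secret', 'archive')) {
    echo 'The archive server is not responding.';
    exit;
}

$result = $db->query('SELECT * FROM documents WHERE owner_id = 42');
if ($result === false) {
    // Timed out or failed; tell the user and move on.
    echo 'The archive server did not respond in time.';
} else {
    $docs = $result->fetch_all(MYSQLI_ASSOC);
}
$db->close();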
I think the second solution might be better for you.
Then, if the timeout error is raised, you can tell the user that the remote server did not respond.

My PHP scripts are very slow on the server

I'm testing an app hosted at app.promls.net, but there is some problem with the script execution. On localhost it takes only: timer: 0.12875008583069 seconds, when the page is just plain text created via PHP.
And when the content is created dynamically and comes from the MySQL database: timer: 0.44203495979309 seconds / timer: 0.65762710571289 seconds / timer: 0.48272085189819 seconds.
The times are different on the server: execution takes about 8 seconds.
Could anyone give me a recommendation on how to test and optimize my PHP execution?
I was optimizing the MySQL database, because some queries return tons of rows for a simple search, using DESCRIBE and EXPLAIN.
But now I have finished, and I would like to explore some new options for PHP execution.
I know that adding compression to the HTML helps, but it only helps with the transport time between the server and the final host when an HTML response is returned. Now I want to optimize the PHP execution, and I would like to know if there are some tricks on the MySQL side that could be implemented to help me improve the response time.
Note: I have been thinking of using HipHop for PHP and Memcache or Cassandra, but I guess those are not the answer to this problem, since I have no activity (meaning user actions) and not much data in my app.
Thanks in advance; I'm available for any comments or suggestions.
With such a big difference in execution time we would need details on the host configuration (shared? dedicated?).
Is MySQL skipping DNS lookups? If not, try the skip-name-resolve setting in my.cnf, or use IP addresses instead of DNS names in the privilege/user tables; the only time I have seen such latency it was because of DNS timeouts in the connection between MySQL and PHP.
First off, try doing the following to your MySQL DB:
Run "OPTIMIZE TABLE mytable" on all of your tables
Run "ANALYZE TABLE mytable" on all of your tables.
Add indexes to the table fields that you are using
Be sure to substitute each table name for "mytable" in the above statements.
See if doing the first two makes a difference, then add the indexes.
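Since there are quite a few tables, here is a small sketch that loops over SHOW TABLES and runs both statements on each one; the connection details are placeholders:
<?php
// Sketch: run OPTIMIZE and ANALYZE on every table in the current database.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'secret',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

foreach ($pdo->query('SHOW TABLES')->fetchAll(PDO::FETCH_COLUMN) as $table) {
    // Table names cannot be bound as parameters, so they are quoted with backticks here.
    $pdo->query("OPTIMIZE TABLE `$table`")->fetchAll();   // both statements return a result set
    $pdo->query("ANALYZE TABLE `$table`")->fetchAll();
}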
