
How to keep connection persistent? #2

Open
AdamSEY opened this issue Feb 21, 2018 · 8 comments


AdamSEY commented Feb 21, 2018

I would like to keep the connection persistent. Is that possible using your library?

$opts = array(
	PDO::ATTR_PERSISTENT => true, // use an existing connection if one exists, otherwise try to connect
);

Also, is it a good option to set?


ocram commented Feb 22, 2018

Sure, it’s possible. As you can see in the README, there are three ways to create database instances with this library.

One of these methods allows you to pass in an existing PDO instance. That way, you can just create that PDO instance with a persistent connection yourself. This must be done in the constructor:

$myPdo = new \PDO('mysql:host=localhost;dbname=test', $user, $pass, [
	\PDO::ATTR_PERSISTENT => true
]);

Next, you simply create an instance of this library from your PDO instance:

$db = \Delight\Db\PdoDatabase::fromPdo($myPdo);

That’s it.

We could also add support for this to the

$db = \Delight\Db\PdoDatabase::fromDataSource($myDataSource);

method, which doesn't have this option yet.

Whether enabling persistent connections is something that you should do is a different question, of course. In general, I don’t think you should. First check if establishing new connections is really the bottleneck in your application. It usually isn’t. Instead, it’s often the expensive database queries that you may perform afterwards, or something not related to the database at all, e.g. heavy computations.

Is your database on the same machine that the web server and PHP run on? Or is it on a remote machine, where connecting is more expensive? What kind of database server is it? Does creating connections incur a high overhead?
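
For a rough idea of that overhead, you could time the connection setup on its own (a minimal sketch; the DSN and credentials are placeholders):

$start = microtime(true);

// establish a fresh, non-persistent connection and measure only that step
$pdo = new \PDO('mysql:host=localhost;dbname=test', $user, $pass);

$connectMs = (microtime(true) - $start) * 1000;
echo 'Connecting took ' . round($connectMs, 2) . ' ms';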

Also note that persistent connections have some drawbacks that could result in major problems for your application if it isn’t designed to properly work with those connections:

There are a couple of additional caveats to keep in mind when using persistent connections. One is that when using table locking on a persistent connection, if the script for whatever reason cannot release the lock, then subsequent scripts using the same connection will block indefinitely and may require that you either restart the httpd server or the database server. Another is that when using transactions, a transaction block will also carry over to the next script which uses that connection if script execution ends before the transaction block does. In either case, you can use register_shutdown_function to register a simple cleanup function to unlock your tables or roll back your transactions. Better yet, avoid the problem entirely by not using persistent connections in scripts which use table locks or transactions (you can still use them elsewhere).

http://php.net/manual/en/features.persistent-connections.php
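
To illustrate the clean-up suggested there, a shutdown handler along these lines could roll back a transaction that is still open when the script ends (a rough sketch only; $pdo stands for your persistent connection):

// roll back any transaction that is still open when the script terminates,
// so it does not leak into the next script that reuses this connection
register_shutdown_function(function () use ($pdo) {
	if ($pdo->inTransaction()) {
		$pdo->rollBack();
	}
});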

Your database may also have its own way of doing connection pooling, which is generally preferable.

All that being said, only use persistent connections if you checked the impact of creating new database connections (excluding database operations performed thereafter) and the overhead turned out to be high.


AdamSEY commented Feb 22, 2018

Thank you so much for such great help. Yeah, my DB and web server are on the same machine. I'm using the Percona MySQL fork with the InnoDB engine. At the moment there's no overhead, but maybe there will be later with many more customers.
Here's my use case, if you're interested in knowing more:

I'm trying to keep logged-in users' information up to date. So after they log in to my website, every second we make a query to get the user's unread tickets, balance, etc.

If I have 10,000 connected users, we will send 600,000 queries every minute. Isn't it better to make a persistent connection? Or maybe I have to change my technique. Any suggestions?


ocram commented Feb 22, 2018

Polling done once every second for every single user is certainly a lot. I guess you want to use JavaScript to send an AJAX/XHR request to the server every second, which responds with up-to-date data, right?

You could use long polling instead, i.e. send a request that the server keeps open (and does not yet answer) until new data is available, after which the server then finally sends its response. This saves requests, but at the cost of keeping connections to the server open much longer. PHP (and Apache) are not really well-suited for this, though. Something like Node.js works better here – which doesn’t have to be a problem, because you could do only that part in Node.js and the rest in PHP as usual.
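
As a rough sketch of the long-polling idea in plain PHP (only to illustrate the mechanism; $pdo, $userId, the query and the 30-second limit are assumptions of mine):

// hold the request open until there is new data or the deadline is reached
$lastSeen = (int) ($_GET['last_seen'] ?? 0);
$deadline = time() + 30; // give up after 30 seconds and let the client retry

do {
	$statement = $pdo->prepare('SELECT COUNT(*) FROM tickets WHERE user_id = ? AND id > ?');
	$statement->execute([$userId, $lastSeen]);
	$newTickets = (int) $statement->fetchColumn();

	if ($newTickets > 0) {
		break;
	}

	sleep(1); // wait a moment before checking the database again
} while (time() < $deadline);

header('Content-Type: application/json');
echo json_encode(['newTickets' => $newTickets]);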

Another alternative would be WebSockets. These are much more efficient than frequent polling, but, again, are usually done in something like Node.js rather than PHP.

Persistent connections would certainly be another solution. They’re fine, just remember that you have to be very careful when locking tables or using transactions, which require correct clean-up in case of scripts that don’t terminate as expected.

Anyway, even starting without persistent connections (or any other of these solutions) should be fine, and it should not be too hard to enable persistent connections later when the need arises.

You could also consider polling only every three seconds, which users will hardly notice, but which could easily reduce your load to a third.

No matter what solution you use, you can only reduce the number of new connections being created. You’ll still have to do 10,000 database queries per second, either on fresh database connections or on old (persistent) database connections. These are still 10,000 queries.


AdamSEY commented Feb 23, 2018

Thanks again, Node.js looks like a good solution. I should consider using it for that purpose.


AdamSEY commented Feb 23, 2018


ocram commented Feb 23, 2018

Yes, that looks good.

It’s not much more efficient because you still have to do 1 SQL query per second for each of your (possibly 10,000) users. But there’s no way around that. This is as good as you can get with MySQL.

What you’re saving here is the extra HTTP requests and database connections. If you don’t need that, you can just use a basic PHP solution without persistent connections.


AdamSEY commented Feb 24, 2018

What I'm thinking about is making a WebSocket connection between the MySQL database and the client.
Assume the following:

  • The client browses their profile page.
  • We give that client a unique ID.
  • A background process changes a record in a MySQL table.
  • We provide the client ID to the MySQL socket, so the user gets notified of that change.
  • The MySQL socket looks for the connected client ID and sends that record.

If we could do something like that, we would save massive costs.


ocram commented Feb 24, 2018

The problem is that MySQL doesn’t support external triggers, i.e. it doesn’t allow you to send data to PHP (or Node.js) directly. Instead, you have to do polling, i.e. actively and repeatedly check the database for new data. However, if the process that updates the database (i.e. PHP or Node.js) can trigger some notification when the update is being made, you could have an extremely efficient solution with WebSockets and without polling.
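
As a sketch of that idea (everything here is hypothetical; it assumes a small WebSocket service, e.g. in Node.js, that exposes an internal HTTP endpoint and pushes the message to the connected client):

// after writing the change to MySQL, notify the WebSocket service directly
// instead of having clients poll the database
$statement = $pdo->prepare('UPDATE users SET balance = ? WHERE id = ?');
$statement->execute([$newBalance, $userId]);

$notification = json_encode(['userId' => $userId, 'balance' => $newBalance]);

$curl = curl_init('http://127.0.0.1:8081/notify'); // hypothetical internal endpoint
curl_setopt($curl, CURLOPT_POST, true);
curl_setopt($curl, CURLOPT_POSTFIELDS, $notification);
curl_setopt($curl, CURLOPT_HTTPHEADER, ['Content-Type: application/json']);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_exec($curl);
curl_close($curl);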
