This may not be the best solution, but I am writing this post based on my own experience.
Temporary tables and batch-wise query execution are simple techniques for processing millions of records with minimum delay when sending a large number of push notifications to iOS and Android devices.
Let's take an example. Our server-side app has to send 500k push notifications to iOS devices. When users receive a notification they might tap it, which launches the application on their device. The client-side app then awards them a bonus of 5 or 10 coins, and the bonus expires after 1 or 2 hours. There is one more requirement: the server-side application must send push notifications on a daily basis, and only to users who meet certain conditions.
Let's list the key points of the server-side and client-side applications:
1) Send 500k push notifications from the server side.
2) Send push notifications on a daily basis.
3) Give users a bonus when they tap a push notification.
4) The bonus expires after 1 or 2 hours.
We are not going to discuss how the client-side app is built or how to send a large number of push notifications. To know more about push notifications, visit http://ourjavaknowledge.blogspot.in/2013/06/push-notification.html
We focus only on the server-side implementation for processing a huge number of database records.
Let's take a USER table that holds a million records. To process that many records, fetch them batch-wise. Even with around 32 or 64 GB of RAM it is safer to process them in batches; if you have more memory, increase the batch size, but don't try to process millions of records in one go when memory is limited.
Now, let's set our batch size to 50K. We cannot fetch and process the first 50K records straight from the user table, because the requirement says we must send notifications only once a day and only to users who meet certain conditions. Let's assume here that it is impossible to express those conditions in a single query, so we have to filter the users one by one: fetch the first 50K records, examine them one at a time, and collect the primary key ids of the eligible users in a List (Java is my back-end language, so List is my preferred structure for holding an arbitrary number of elements).
1) Fetch the first 50K records:
SELECT * FROM user LIMIT 0, 50000;
2) After filtering the records one by one, add the primary key ids of the eligible users to the List.
3) Fetch those records and insert them into a TEMPORARY table:
CREATE TEMPORARY TABLE user_temp AS (SELECT * FROM user WHERE 0);
INSERT INTO user_temp
SELECT p.* FROM user p WHERE p.id IN (/* ids from the List */);
4) Update the user table with the number of pushes sent and when the last one was sent:
UPDATE user p, user_temp p2 SET p.no_of_pushes = p2.no_of_pushes + 1, p.since_last_push = NOW() WHERE p.id = p2.id;
5) Finally, drop the temporary table:
DROP TABLE user_temp;
6) Fetch the next 50K records and repeat the process.
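The batch loop above can be sketched in Java. This is a minimal, self-contained illustration: the database fetch is simulated with an in-memory list of user ids (in the real application this would be the SELECT ... LIMIT query, and each batch's ids would go into the temporary table), and the names BATCH_SIZE, isEligible, and collectEligibleBatches are assumptions for illustration, not part of the original post.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

public class BatchProcessor {
    // Assumed batch size, matching the 50K used in the post.
    static final int BATCH_SIZE = 50_000;

    // Walks all user ids in batches of BATCH_SIZE, filters each batch
    // one record at a time (step 2), and returns the eligible ids per
    // batch (these are the ids step 3 would insert into user_temp).
    static List<List<Long>> collectEligibleBatches(List<Long> allIds,
                                                   Predicate<Long> isEligible) {
        List<List<Long>> batches = new ArrayList<>();
        for (int offset = 0; offset < allIds.size(); offset += BATCH_SIZE) {
            int end = Math.min(offset + BATCH_SIZE, allIds.size());
            List<Long> eligible = new ArrayList<>();
            for (Long id : allIds.subList(offset, end)) {
                if (isEligible.test(id)) {   // filter one by one
                    eligible.add(id);
                }
            }
            batches.add(eligible);           // one batch processed
        }
        return batches;
    }
}
```

The Predicate stands in for whatever per-user conditions cannot be expressed in SQL; everything that can be expressed in SQL should stay in the query, as discussed below.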
To make the process faster, avoid the 2nd step: rather than processing records one by one in application code, filter all 50K records at the database level and push them into the temporary table in one go. This makes the process much faster.
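If the eligibility conditions can be expressed in SQL, steps 1 to 3 collapse into a single statement. The WHERE conditions below are placeholders, not the post's actual rules; only the no_of_pushes and since_last_push columns appear in the original queries.

```sql
-- Fill the temporary table directly, skipping the record-by-record filter.
-- The conditions here are assumed examples of the real eligibility rules.
INSERT INTO user_temp
SELECT p.*
FROM user p
WHERE p.since_last_push < CURDATE()   -- no push sent yet today
LIMIT 50000;
```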