I think this is pretty easy to build, but I'm not sure how well it will scale, and therefore whether it will actually work.
I am trying to attach an indexer to each request, so that the more often a search query gets requested, the more likely it is to have a deeper data set behind it (easier to see than describe, but it isn't there yet).
In a nutshell, I don't have the resources to go and scour a ton of data up front, but I've grabbed some starting points, and as the site gets used, those starting points direct what data should be indexed more deeply.
What I'm trying to avoid is having 3 people request the same query/data at the same time and triggering the indexer to run and update that query's results for each of them, because they would all be rewriting the same data to the same database rows at the same time.
I don't think MySQL's SELECT ... FOR UPDATE (or whatever the exact locking syntax is) will work, because my understanding is that other sessions can still read the locked row; it just can't be updated until the lock is released.
The query rows are already populated with unique ids, so I can't use mysql_insert_id(), as I think that only applies to INSERT statements (AUTO_INCREMENT), not UPDATE.
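For reference, the locking syntax I mean above is SELECT ... FOR UPDATE, which as I understand it only takes effect inside an InnoDB transaction; roughly:

```sql
-- Sketch of the row-locking approach I'm ruling out (InnoDB only).
-- My understanding: plain SELECTs on the locked row still succeed; only
-- other writes and other FOR UPDATE reads block until COMMIT.
START TRANSACTION;
SELECT id FROM queries WHERE look_up = '0' LIMIT 1 FOR UPDATE;
-- ...the UPDATE would then use the id returned by the SELECT above...
COMMIT;
```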
So, what I'm thinking is two MySQL calls, one after the other.
Code:
// First call: find one query row that has not been claimed yet.
$result = mysql_query("SELECT id FROM queries WHERE look_up = '0' LIMIT 1");
$row = mysql_fetch_assoc($result);
// Second call: mark that row as claimed so the indexer can work on it.
mysql_query("UPDATE queries SET look_up = '1' WHERE id = '" . $row['id'] . "' LIMIT 1");

I think this will work with low traffic, but any idea whether high traffic will cause it to break?
Unfortunately for me, as traffic increases I am exponentially increasing the server load, but I may separate my reads and writes into separate tables and then schedule a job to merge the tables on a regular basis.
Anybody have ideas for another way of doing this?
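One way to close the gap between the SELECT and the UPDATE is a conditional claim: repeat the look_up = '0' check in the UPDATE's WHERE clause and inspect the affected-row count, so two workers can never claim the same row. A rough sketch of that pattern, using Python with SQLite instead of PHP/MySQL purely to keep it self-contained (the table mirrors my queries(id, look_up) columns):

```python
import sqlite3

# Toy stand-in for the MySQL table; same shape as queries(id, look_up).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE queries (id INTEGER PRIMARY KEY, look_up INTEGER)")
conn.executemany("INSERT INTO queries VALUES (?, 0)", [(1,), (2,), (3,)])
conn.commit()

def claim_one(conn):
    """Claim one unindexed row and return its id, or None if none are left."""
    while True:
        row = conn.execute(
            "SELECT id FROM queries WHERE look_up = 0 LIMIT 1").fetchone()
        if row is None:
            return None  # nothing left to index
        # The extra "AND look_up = 0" guard makes the claim atomic: if some
        # other worker grabbed this row between our SELECT and UPDATE, zero
        # rows are affected and we loop to try the next candidate instead.
        cur = conn.execute(
            "UPDATE queries SET look_up = 1 WHERE id = ? AND look_up = 0",
            (row[0],))
        conn.commit()
        if cur.rowcount == 1:
            return row[0]

claimed = claim_one(conn)  # one of the seeded ids, safe under concurrency
```

The same conditional-UPDATE-plus-affected-rows check works with mysql_query() and mysql_affected_rows() in PHP.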
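If I do end up splitting reads and writes into separate tables, the scheduled merge could be roughly this (table names are hypothetical):

```sql
-- Hypothetical split: queries_read is what searches hit, queries_write is
-- where the indexer records progress; a cron job folds writes back in.
INSERT INTO queries_read (id, look_up)
    SELECT id, look_up FROM queries_write
    ON DUPLICATE KEY UPDATE look_up = VALUES(look_up);
TRUNCATE TABLE queries_write;
```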