PHP timestamp minus 24 hours

greg (1,573 posts; joined Nov 2005):

I have a bunch of timestamps stored in a table for a bunch of events: past, present and future.
I need to determine whether an event's timestamp falls within the 24 hours before now (or in the future).

So how do I make a timestamp for 24 hours ago?
Then I should be able to compare the timestamps with > or ==.

EDIT
Never mind... subtracting 86400 (the number of seconds in a day) from the current timestamp does it. I think I need a break, as that wasn't really difficult. Doh!
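For anyone landing here later, a minimal sketch of that approach (the variable names are illustrative):

```php
<?php
// Unix timestamp for 24 hours ago: time() returns seconds,
// and 86400 = 24 * 60 * 60 seconds in a day.
$dayAgo = time() - 86400;

// strtotime() computes a similar cutoff more readably.
$dayAgoAlt = strtotime('-24 hours');

// Sample event timestamp: one hour ago, so inside the window.
$eventTimestamp = time() - 3600;

// "Within the last 24 hours or in the future" is a simple comparison.
$withinWindow = ($eventTimestamp >= $dayAgo);
var_dump($withinWindow); // bool(true)
```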

Greg K (2,113 posts; joined Nov 2003):

Just to let you know, if you need it as a condition for which data to return, you can do this in the SQL statement. It is best practice to do as much processing of the data as possible on the SQL server before retrieving it.

Here is the method I usually use; there are many ways, so this is just my preference.

http://dev.mysql.com/doc/refman/5.1/en/date-and-time-functions.html#func...
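For illustration, a sketch of what that could look like in a query (the table and column names here are hypothetical):

```sql
-- Hypothetical schema: an `events` table with a DATETIME column `event_time`.
-- Select events from the last 24 hours onwards (past day plus future):
SELECT *
FROM events
WHERE event_time >= NOW() - INTERVAL 1 DAY;

-- Equivalent, using the function form:
SELECT *
FROM events
WHERE event_time >= DATE_SUB(NOW(), INTERVAL 1 DAY);

-- If the column stores Unix timestamps (INT) instead:
SELECT *
FROM events
WHERE event_time >= UNIX_TIMESTAMP() - 86400;
```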

-Greg

This space intentionally left blank...

greg:

Hmm, I do need it as a condition for data from MySQL, but I can't use it in the query here.

As events can occur over multiple days, and therefore potentially two months and two years (Dec 28 to Jan 5), I have multiple timestamps for each event. There is no real limit on how many days an event could span, so a fixed set of SQL fields won't work; instead I store all the timestamps in one field with a delimiter, split at the delimiter, and take the largest timestamp from the split as the "latest".

Otherwise that may have been useful.

I don't like using one field for multiple values and then splitting on a character, but in this scenario there wasn't really a better option.
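A sketch of that split-and-compare step in PHP (the field contents and the pipe delimiter are illustrative):

```php
<?php
// One column holds all of an event's timestamps joined by a delimiter.
// Example values: Dec 28 2008, Dec 29 2008 and Jan 5 2009 (UTC).
$field = '1230422400|1230508800|1231113600';

// Split at the delimiter and take the largest timestamp as the "latest".
$timestamps = array_map('intval', explode('|', $field));
$latest = max($timestamps);

// Show the event if its final run is within the last 24 hours or later.
$show = ($latest > time() - 86400);
```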

Cheers

Abhishek Reddy (3,348 posts; joined Jul 2001):

greg wrote:
As events can occur over multiple days, and therefore potentially two months and two years (Dec 28 to Jan 5), I have multiple timestamps for each event. There is no real limit on how many days an event could span, so a fixed set of SQL fields won't work; instead I store all the timestamps in one field with a delimiter, split at the delimiter, and take the largest timestamp from the split as the "latest".

Otherwise that may have been useful.

I don't like using one field for multiple values and then splitting on a character, but in this scenario there wasn't really a better option.

Normalise it: store timestamped occurrences in a separate table, associated with the events table by a foreign key.
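A sketch of that normalised layout (all table and column names here are hypothetical):

```sql
-- Each event is stored once.
CREATE TABLE events (
    id    INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    title VARCHAR(255) NOT NULL
    -- description, links, etc.
);

-- Each run of an event is its own row, pointing back at the event.
CREATE TABLE event_occurrences (
    id          INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    event_id    INT UNSIGNED NOT NULL,
    occurred_at INT UNSIGNED NOT NULL,  -- Unix timestamp of one run
    FOREIGN KEY (event_id) REFERENCES events (id)
);
```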

greg:

Why would that be better?

Currently I have:
Data from one field in a table I already query: the field gets exploded into an array, then a simple loop over the array echoes out all the values.

Versus
An additional table, more ID associations between tables, another query in PHP for the new table, now returning multiple rows and therefore a bunch of additional table searches; I would need an extra loop to store each row result in an array, and would still need to loop over that array to echo the values.

I don't see at all how that would be more efficient.
Don't get me wrong, I'm still picking up tips, so if there's a reason, please tell me.

Abhishek Reddy:

greg wrote:
Why would that be better?

Currently I have:
Data from one field in a table I already query: the field gets exploded into an array, then a simple loop over the array echoes out all the values.

The fact that this thread exists is one symptom of the deep problem with this approach. You cannot, for instance, perform a conditional SQL query over timestamp ranges. That rather defeats the purpose of using a relational database at all.

greg wrote:

An additional table, more ID associations between tables, another query in PHP for the new table, now returning multiple rows and therefore a bunch of additional table searches; I would need an extra loop to store each row result in an array, and would still need to loop over that array to echo the values.

That's incorrect: you would not need an additional query in PHP. Use an SQL JOIN to select related records from the events and occurrences tables together in a single query. The result set will be contained in a single multidimensional array, as it is now, but with additional fields.
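For example, assuming the hypothetical events/event_occurrences tables from the normalisation suggestion:

```sql
-- One query returns events together with their occurrences.
-- Each result row pairs an event with one of its occurrence timestamps.
SELECT e.id, e.title, o.occurred_at
FROM events AS e
JOIN event_occurrences AS o ON o.event_id = e.id
ORDER BY e.id, o.occurred_at;
```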

greg wrote:

I don't see at all how that would be more efficient.
Don't get me wrong, I'm still picking up tips, so if there's a reason, please tell me.

Let's say we want to find every event that has occurred at least once in the last 24 hours. With your approach, we would have to select every event, parse its timestamps value in PHP, and loop again to filter the results.

The efficiency costs include: spending PHP's allocated memory on copies of unnecessary data; computing string operations and type conversions on every record's timestamps; and wasting time looping over all the data again in PHP to select the final result.

With a normalised schema, by direct comparison, the database spends memory only once on (parts of) unneeded records, applies the conditional checks with optimised algorithms, and yields a result set ready to be consumed without excess processing in PHP.
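Continuing the same hypothetical schema, the 24-hour filter becomes a single conditional query:

```sql
-- "Every event with at least one occurrence in the last 24 hours",
-- filtered entirely inside MySQL.
SELECT DISTINCT e.id, e.title
FROM events AS e
JOIN event_occurrences AS o ON o.event_id = e.id
WHERE o.occurred_at >= UNIX_TIMESTAMP() - 86400;
```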

greg:

Abhishek Reddy wrote:
You cannot, for instance, perform a conditional SQL query over timestamp ranges.
But I don't need to, as the events listing doesn't, and never will, require that.

Abhishek Reddy wrote:
That's incorrect: you would not need an additional query in PHP. Use an SQL JOIN in a single query.
OK, but I still need:
an additional table, more ID associations between tables, an additional table to query, and an additional table to insert into and maintain.

Abhishek Reddy wrote:
Let's say we want to find every event that has occurred at least once in the last 24 hours. With your approach, we would have to select every event, parse its timestamps value in PHP, and loop again to filter the results.
When I list an event, the description, links, etc. are all wrapped in one area (div) for that event, under one title. Each date and time for that event is listed under that one event.
So when I query events for everything from 24 hours ago onwards, it is simply from the last date the event runs. This allows people to see events that have not yet taken place.
Once the event date is more than 24 hours past, the event goes onto an archived page (which simply lists everything whose timestamp is smaller than the timestamp from 24 hours ago).

I have a field that holds the last date of each event.
When the range of timestamps from the input array is inserted into the DB, the last one is also added to the table field "last_date".

$var = the timestamp from 24 hours ago (which was the point of this thread!)
WHERE last_date > $var
So if the last date is within the last 24 hours or later, get the data.

With my approach, that's all done within one table.
The table is queried to find where "last_date" is greater than the timestamp from 24 hours ago, and if that matches, all the data I need is right there in that very row.
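That check might look like this in PHP (only "last_date" comes from the post; the table name and connection details are assumed):

```php
<?php
// The timestamp from 24 hours ago: the cutoff for the live listing.
$var = time() - 86400;

// Events whose final run is within the last 24 hours or in the future.
// %d keeps the interpolated value safely numeric.
$sql = sprintf('SELECT * FROM events WHERE last_date > %d', $var);

// $result = mysqli_query($link, $sql);  // run against an open connection
```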

Whereas with your method (for this particular circumstance), the additional requirement of searching another table is a bad thing.
It would have to run through an entire table to find where dates are greater than the timestamp from 24 hours ago, then get the association ID, go to another table, find the row matching that ID, and fetch all the data from that row.

As always, best practice for PHP and DB structures comes down to the specific requirements. Also, future-proofing for things that won't ever be required potentially means additional tables/fields or PHP that might never be used.

So in my scenario I don't think adding a new table with associations to other tables, and having to scan through TWO tables instead of one, is more efficient.

I do, however, see your logic and agree that in other scenarios your method would be useful, and more efficient even with the additional table.

Abhishek Reddy:

greg wrote:
But I don't need to, as the events listing doesn't, and never will, require that.

Are you sure you need a relational database at all?

greg wrote:
OK, but I still need:
an additional table, more ID associations between tables, an additional table to query, and an additional table to insert into and maintain.

There is nothing inherently wrong with this.

greg wrote:
I have a field that holds the last date of each event.
When the range of timestamps from the input array is inserted into the DB, the last one is also added to the table field "last_date".

$var = the timestamp from 24 hours ago (which was the point of this thread!)
WHERE last_date > $var
So if the last date is within the last 24 hours or later, get the data.

With my approach, that's all done within one table.
The table is queried to find where "last_date" is greater than the timestamp from 24 hours ago, and if that matches, all the data I need is right there in that very row.

This is a large amount of complexity for a very small optimisation in a single case. The approach completely discounts the future of the application (requirements will change) and defeats the purpose of using a relational database (better to use an optimised object store if that's really all you need).

greg wrote:
Whereas with your method (for this particular circumstance), the additional requirement of searching another table is a bad thing.
It would have to run through an entire table to find where dates are greater than the timestamp from 24 hours ago, then get the association ID, go to another table, find the row matching that ID, and fetch all the data from that row.

This is not necessarily a large cost; and even if you wanted to optimise for your isolated case, you could still use your technique of caching the last date in the events table. SQL JOINs are also deeply optimised in most RDBMSs; I suggest you try it and benchmark the results to see whether it's really as bad as you imagine.
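A sketch of how the cached column and the normalised table could coexist (all names hypothetical):

```sql
-- Keep the normalised event_occurrences table as the source of truth,
-- and refresh a cached last_date column on the event row after inserts.
UPDATE events AS e
SET e.last_date = (
    SELECT MAX(o.occurred_at)
    FROM event_occurrences AS o
    WHERE o.event_id = e.id
)
WHERE e.id = 42;
```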

greg wrote:
As always, best practice for PHP and DB structures comes down to the specific requirements. Also, future-proofing for things that won't ever be required potentially means additional tables/fields or PHP that might never be used.

Your aversion to tables is baffling, especially here, when one additional table would do away with ad hoc string-based storage of a list of timestamps that complicates work on the client side and squanders the DBMS. The robustness and flexibility your application gains by normalising this will probably be worth more in the long term than the apparent (and unproven) performance loss you expect.

This is really a basic idea in relational database design. If you remain unconvinced, I suppose only the pain of having to extend a poorly normalised schema will persuade you otherwise. I would still recommend reading about database normalisation and then reconsidering your approach.

Good luck.

greg:

I take your comments and advice on board; thanks for taking the time to respond.

I just felt an additional table and MySQL searching wouldn't be as efficient as a small amount of additional PHP (exploding that field), especially as any table caching will change (new rows get inserted), whereas the PHP doesn't change.
I also understand the future-proofing could be useful in most other scenarios.

MySQL optimisation is definitely something I need to bury my head in, along with performance tuning of PHP versus MySQL.

Cheers
