Mongoose aggregation pipeline - converting a db ISODate to a UNIX timestamp (MongoDB 3.6, not 4.0) as part of a $project or $group stage - node.js

There are hundreds of threads and operators for getting date objects into ISO strings, but I can't find any resources on doing it the other way around with the mongoose/mongodb aggregation operators in Node.
I have a legacy MongoDB 3.6 instance that's currently used in production, and I have a pipeline where I'm trying to convert a normal ISO date object from Mongo into a UNIX timestamp so I can use it with ngx-charts and other charting libraries.
I can't use the $toDate or $convert operators or $dateFromString, as the options I need are not available in 3.6.
So far I have tried variations of this:
$project: {
  _id: 0,
  // name: '$_id._id',
  value: '$_id.count',
  name: new Date.parse('$_id.date').getTime(),
  min: '$min',
  max: '$max'
}
but none of that worked, as the aggregation is processed on the db, which has no idea what that function is. I've looked at many operators and tried converting to a string first and then back to a UNIX date, but 3.6 doesn't seem to have anything available for converting that ISO date into a javascript/unix timestamp.
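For reference, one operator-only approach that should work on 3.6 is $subtract: subtracting one date from another in the pipeline yields the difference in milliseconds, so subtracting the epoch gives a UNIX timestamp. A minimal sketch, reusing the $_id.date and $_id.count fields from the attempt above (Model stands in for whatever mongoose model the pipeline runs on):

// Sketch: $subtract applied to two dates returns the difference in
// milliseconds, so subtracting the epoch (new Date(0)) converts an
// ISODate to a UNIX timestamp entirely on the db side.
Model.aggregate([
  {
    $project: {
      _id: 0,
      value: '$_id.count',
      // milliseconds since 1970-01-01; wrap in $divide by 1000 for seconds
      name: { $subtract: ['$_id.date', new Date(0)] },
      min: '$min',
      max: '$max'
    }
  }
]);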

Related

How to map a Firestore date object to a date in elasticsearch

I am using a cloud function to send a Firebase Firestore document to Elasticsearch for indexing. I am trying to find a way to map a Firestore timestamp field to an Elasticsearch date field in the index.
The Elasticsearch date type mapping supports the epoch_millis and epoch_second formats, but the Firestore date type is an object, as follows:
"timestamp": {
"_seconds": 1551833330,
"_nanoseconds": 300000000
},
I could use the _seconds field but would lose the fractional part of the second.
Is there a way to map the timestamp object to a date field in the index that calculates the epoch_millis from the _seconds and _nanoseconds fields? I recognize that precision will be lost (nanos to millis).
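One straightforward option is to compute epoch milliseconds in the cloud function itself before sending the document to Elasticsearch. A minimal sketch, using the field names from the object above:

// Sketch: fold Firestore's _seconds/_nanoseconds into one epoch-milliseconds
// value that Elasticsearch can map with the epoch_millis format.
function toEpochMillis(ts) {
  // 1 millisecond = 1e6 nanoseconds; rounding keeps sub-second precision
  // down to the millisecond (anything below that is lost, as noted).
  return ts._seconds * 1000 + Math.round(ts._nanoseconds / 1e6);
}

toEpochMillis({ _seconds: 1551833330, _nanoseconds: 300000000 });
// => 1551833330300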

Neo4j cypher: set cutofftime (datetime) property of a node

I need to set a datetime property on a node. I need to create a Work node:
CREATE (w:Work {type: "tech", mode: "E", cut_off_time: "12:10:00"})
There is a way in which I can define the hours, minutes and seconds as separate properties:
{hours: 12, minutes: 10, seconds: 0}
but how can I set it as a single property? Can I define it as a datetime object in Neo4j, or is it just treated as a string?
Neo4j doesn't have a datetime data type.
So you can store it:
as a string, like you do
in multiple fields (like you suggested)
with a time-tree, i.e. as nodes (https://neo4j.com/blog/modeling-a-multilevel-index-in-neoj4/)
as a long in a property (e.g. the time in minutes; see the sketch below)
The choice is yours, but it depends on what kind of queries you would like to do.
Cheers
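A minimal sketch of that last option, assuming the conversion happens in application code (JavaScript here, purely for illustration) before the value is written as a node property:

// Sketch: encode a "HH:MM:SS" cut-off time as a long (minutes since
// midnight) so it can be stored and compared as one numeric property.
function timeToMinutes(hhmmss) {
  const [hours, minutes] = hhmmss.split(':').map(Number);
  return hours * 60 + minutes; // "12:10:00" -> 730
}

// The result would then be passed as a parameter to the CREATE query,
// e.g. {cut_off_time: timeToMinutes('12:10:00')}.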

Postgres json type inner Query [duplicate]

I am looking for some docs and/or examples for the new JSON functions in PostgreSQL 9.2.
Specifically, given a series of JSON records:
[
  {"name": "Toby", "occupation": "Software Engineer"},
  {"name": "Zaphod", "occupation": "Galactic President"}
]
How would I write the SQL to find a record by name?
In vanilla SQL:
SELECT * FROM json_data WHERE name = 'Toby'
The official dev manual is quite sparse:
http://www.postgresql.org/docs/devel/static/datatype-json.html
http://www.postgresql.org/docs/devel/static/functions-json.html
Update I
I've put together a gist detailing what is currently possible with PostgreSQL 9.2.
Using some custom functions, it is possible to do things like:
SELECT id, json_string(data,'name') FROM things
WHERE json_string(data,'name') LIKE 'G%';
Update II
I've now moved my JSON functions into their own project:
PostSQL - a set of functions for transforming PostgreSQL and PL/v8 into a totally awesome JSON document store
Postgres 9.2
I quote Andrew Dunstan on the pgsql-hackers list:
At some stage there will possibly be some json-processing (as opposed
to json-producing) functions, but not in 9.2.
That doesn't prevent him from providing an example implementation in PLV8 that should solve your problem.
Postgres 9.3
Offers an arsenal of new functions and operators to add "json-processing".
The manual on new JSON functionality.
The Postgres Wiki on new features in pg 9.3.
@Will posted a link to a blog demonstrating the new operators in a comment below.
The answer to the original question in Postgres 9.3:
SELECT *
FROM json_array_elements(
  '[{"name": "Toby", "occupation": "Software Engineer"},
    {"name": "Zaphod", "occupation": "Galactic President"}]'
) AS elem
WHERE elem->>'name' = 'Toby';
Advanced example:
Query combinations with nested array of records in JSON datatype
For bigger tables you may want to add an expression index to increase performance:
Index for finding an element in a JSON array
Postgres 9.4
Adds jsonb (b for "binary", values are stored as native Postgres types) and yet more functionality for both types. In addition to expression indexes mentioned above, jsonb also supports GIN, btree and hash indexes, GIN being the most potent of these.
The manual on json and jsonb data types and functions.
The Postgres Wiki on JSONB in pg 9.4
The manual goes as far as suggesting:
In general, most applications should prefer to store JSON data as
jsonb, unless there are quite specialized needs, such as legacy
assumptions about ordering of object keys.
Performance benefits from general improvements to GIN indexes.
Postgres 9.5
Completes the jsonb functions and operators, and adds more functions to manipulate jsonb in place and for display.
Major good news in the release notes of Postgres 9.5.
With Postgres 9.3+, just use the -> operator. For example,
SELECT data->'images'->'thumbnail'->'url' AS thumb FROM instagram;
see http://clarkdave.net/2013/06/what-can-you-do-with-postgresql-and-json/ for some nice examples and a tutorial.
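And a hedged sketch of the same idea from Node.js, using the node-postgres (pg) client; the table json_data and json column data here are made-up names for illustration:

// Sketch: query a json column from Node.js with node-postgres.
// ->> extracts a field as text, so it can be compared to a string parameter.
const { Pool } = require('pg');
const pool = new Pool(); // connection settings come from the environment

async function findByName(name) {
  const { rows } = await pool.query(
    "SELECT * FROM json_data WHERE data->>'name' = $1",
    [name]
  );
  return rows;
}

findByName('Toby').then(console.log);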
With Postgres 9.3, use -> for object access. For example:
seed.rb
se = SmartElement.new
se.data =
  {
    params:
      [
        {
          type: 1,
          code: 1,
          value: 2012,
          description: 'year of production'
        },
        {
          type: 1,
          code: 2,
          value: 30,
          description: 'length'
        }
      ]
  }
se.save
rails c
SELECT data->'params'->0 as data FROM smart_elements;
returns
data
----------------------------------------------------------------------
{"type":1,"code":1,"value":2012,"description":"year of producction"}
(1 row)
You can continue nesting
SELECT data->'params'->0->'type' as data FROM smart_elements;
returns
data
------
1
(1 row)

Javascript momentjs convert UTC from string to Date Object

Folks,
I'm having a difficult time with the moment.js documentation.
record.lastModified = moment.utc().format();
returns:
2014-11-11T21:29:05+00:00
Which is great, it's in UTC... but when I store that in Mongo, it gets stored as a String, not the Date object type, which is what I want.
What I need it to be is:
"lastModified" : ISODate("2014-11-11T15:26:42.965-0500")
But I need it to be a native JavaScript Date object type, and to store that in Mongo. Right now if I store the above, it goes in as a string, not a Date object type.
I have tried almost everything in moment.js. Their toDate() function works, but it falls back to my local timezone instead of giving me UTC.
Thanks!
Saving a Javascript Date object will result in an ISODate being stored in Mongo.
Saving an ISO date as a Javascript String will result in a String being stored in Mongo.
So, this is what you want: record.lastModified = new Date(moment().format());
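As a hedged aside on the toDate() point from the question: a native Date has no timezone of its own. It stores an absolute UTC instant, and the local timezone only shows up when the value is formatted for display, so Mongo stores it as a UTC ISODate either way:

// Sketch: moment's toDate() returns a native Date. The Date is an absolute
// UTC instant internally; the "local timezone" only affects how it prints,
// so saving it in Mongo still yields an ISODate in UTC.
record.lastModified = moment.utc().toDate();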
Not an ideal solution, but I achieved the same result by manually converting to an ISODate object through the mongo shell. We need the ISODate for comparison/queries when aggregating results, so we run this script before running our aggregate scripts.
Inserting a local time string using moment().format() gives:
"createdDt" : "2015-01-07T17:07:43-05:00"
Converting to an ISODate (UTC) with:
// Find every document with a string createdDt and rewrite it in place as a
// proper ISODate; the shell's ISODate() parses the ISO-8601 string.
var cursor = db.collection.find({createdDt: {$exists: true}});
while (cursor.hasNext()) {
  var doc = cursor.next();
  db.collection.update(
    {_id: doc._id},
    {$set: {createdDt: new ISODate(doc.createdDt)}}
  );
}
results in
"createdDt" : ISODate("2015-01-07T22:07:43Z")"
Note the time got converted from T17:07:43-05:00 to T22:07:43Z.
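Those are the same instant, just rendered in UTC, which is easy to verify in plain JavaScript:

// Both strings denote the same moment; toISOString() renders it in UTC.
new Date('2015-01-07T17:07:43-05:00').toISOString();
// => '2015-01-07T22:07:43.000Z'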
I could not find any way to insert the BSON ISODate format (which is UTC by default) from JavaScript directly while inserting a new document; it does seem to be available through the pyMongo and C#/Java drivers though. I'm still looking for a maintainable solution.

How do you query a ravendb index containing dates using Lucene?

I am using the HTTP API to query RavenDB (so a LINQ query is not the solution to my question).
My Product document looks like this:
{
  "editDate": "2012-08-29T15:00:00.846Z"
}
and I have the index:
from doc in docs.Product
select new { doc.editDate }
I want to query all documents before a certain date AND time. I can query on the DATE using this syntax:
editDate: [NULL TO 2012-09-17]
however I can't figure out how to query the time component as well.
Any ideas?
You can query that using:
editDate: [NULL TO 2012-09-17T15:00:00.846Z]
If you care for a part of that, use:
editDate: [NULL TO 2012-09-17T15:00]
Note that you might have to escape parts of the query, like so:
editDate: [NULL TO 2012\-09\-17T15\:00]
For this to work you also need to ensure that the field is analysed.
In Raven Studio - Add Field -> editDate, and set Indexing to Analyzed.
One way of working around this problem is to store the value not as a date but as the number of seconds since the unix epoch.
This results in a number that you can compare against with ease.
I am using JavaScript and the moment.js library:
{
  "editDate": "2012-08-29T15:00:00.846Z",
  "editDateUnix": moment("2012-08-29T15:00:00.846Z").unix()
}
And my index:
from doc in docs.Product
select new { doc.editDate, doc.editDateUnix }
and my lucene query:
"editDate: [NULL TO "+moment("2012-09-17").unix()+"]"
