
FHIR Search Engine - Resource Operators

Technical Consideration

Technical Conclusion


spring-data-solr can be used to build the Solr query, which can then be passed to spark-solr.

fhir-syntax -> solr-syntax

AND(Dataframe, Dataframe, Column)
OR(Dataframe, Dataframe, Column)
EXCEPT(Dataframe, Dataframe)
AFTER(Dataframe, Dataframe, column, column, column)
OCCURRENCE(Dataframe, number)
NOT_IN(Dataframe, Dataframe)

// Translate the Spring Data Solr criteria tree into a raw Solr query string
DefaultQueryParser parser = new DefaultQueryParser(new SimpleSolrMappingContext());
String solrQuery = parser.createQueryStringFromNode(search.getCriteria(), null);

The flow is as follows:

fhir-client (fhir-syntax operators) -> fhir-backend (fhir-syntax operators translated into solr-syntax operators) -> spark-backend (parsing, joins, results) -> a patient list is created within Solr

Group versus List

Result Information

Most relevant FHIR resources contain two, three, or four of these elements:

  1. id
  2. patient
  3. encounter
  4. date

Patient Level

Since every filtered resource is linked to a patient, set operations can be applied across all of them:

// Example result sets, one row per matching patient id
// (assumes a SparkSession in scope and import spark.implicits._)
val r0 = (1 :: 2 :: 3 :: 6 :: 7 :: Nil).toDF("patient")
val r1 = (1 :: 2 :: 3 :: Nil).toDF("patient")
val r2 = (2 :: 3 :: 6 :: Nil).toDF("patient")
val r3 = (1 :: 3 :: 7 :: Nil).toDF("patient")
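At the patient level, the operators reduce to set algebra on patient ids. A minimal sketch using plain Scala sets (in Spark, `DataFrame.intersect`, `union`, and `except` would play the same roles; the object and method names here are illustrative, not part of the actual backend):

```scala
object PatientOps {
  // AND: patients present in both result sets (DataFrame.intersect in Spark)
  def and(a: Set[Int], b: Set[Int]): Set[Int] = a intersect b

  // OR: patients present in either result set (DataFrame.union + distinct)
  def or(a: Set[Int], b: Set[Int]): Set[Int] = a union b

  // EXCEPT / NOT_IN: patients in a that do not appear in b (DataFrame.except)
  def except(a: Set[Int], b: Set[Int]): Set[Int] = a diff b
}
```

For example, `PatientOps.and(Set(1, 2, 3), Set(2, 3, 6))` yields `Set(2, 3)`, matching the intersection of r1 and r2 above.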

Encounter Level

// Encounter-level result sets: one row per (patient, encounter) pair;
// r3 only carries a patient column
val r1 = ((1, 10) :: (1, 11) :: (3, 12) :: Nil).toDF("patient", "encounter")
val r2 = ((1, 11) :: (3, 12) :: Nil).toDF("patient", "encounter")
val r3 = (1 :: 3 :: 7 :: Nil).toDF("patient")
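At the encounter level, intersection keys on the (patient, encounter) pair, and a patient-only result set such as r3 acts as a filter. A sketch with plain Scala collections (illustrative names; in Spark this would be an inner join on both columns, and a left-semi join on the patient column):

```scala
object EncounterOps {
  // AND at encounter level: rows matching on both patient and encounter
  def and(a: Set[(Int, Int)], b: Set[(Int, Int)]): Set[(Int, Int)] =
    a intersect b

  // Restrict encounter-level rows to patients from a patient-level set
  // (equivalent to a left-semi join on the patient column)
  def semiJoin(enc: Set[(Int, Int)], patients: Set[Int]): Set[(Int, Int)] =
    enc.filter { case (patient, _) => patients.contains(patient) }
}
```

With the data above, `and(r1, r2)` keeps (1, 11) and (3, 12), and `semiJoin(r1, r3)` keeps all of r1, since patients 1 and 3 both appear in r3.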

Date Level

// Date-level result sets: one row per (patient, date) pair
// (java.sql.Date, as expected by Spark's DataFrame schema inference)
val r1 = ((1, new Date(1L)) :: (1, new Date(2L)) :: Nil).toDF("patient", "date")
val r2 = ((1, new Date(1L)) :: (3, new Date(2L)) :: Nil).toDF("patient", "date")
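The AFTER operator from the list above can be sketched the same way: keep a row of the left set when the same patient has a strictly earlier row in the right set. An illustrative sketch with epoch-millisecond timestamps standing in for the date column (in Spark this would be a join on patient with a date comparison condition):

```scala
object DateOps {
  // AFTER: keep (patient, date) rows from a whose date is strictly later
  // than some row for the same patient in b (a per-patient theta-join)
  def after(a: Set[(Int, Long)], b: Set[(Int, Long)]): Set[(Int, Long)] =
    a.filter { case (patient, date) =>
      b.exists { case (p, d) => p == patient && date > d }
    }
}
```

With the data above, only the row (1, 2) survives: it is the only row of r1 dated after a row for the same patient in r2.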

All Together

Number of Occurrences
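OCCURRENCE(Dataframe, number) from the operator list above presumably keeps the patients that appear at least the given number of times in a result set. A sketch over a plain sequence of patient ids (in Spark this would be a groupBy("patient").count followed by a filter; the object name is illustrative):

```scala
object OccurrenceOps {
  // OCCURRENCE: patients with at least n matching rows
  def occurrence(patients: Seq[Int], n: Int): Set[Int] =
    patients.groupBy(identity)
      .collect { case (p, rows) if rows.size >= n => p }
      .toSet
}
```

For instance, `occurrence(Seq(1, 1, 2, 3, 3, 3), 2)` keeps patients 1 and 3, who each appear at least twice.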
