diff --git a/README.md b/README.md
index 8a13d23b6137f8d0b0d1361d59a5145d18d4a6c4..8521b686addcaffb323eadf81d97fabd01cba0c7 100644
--- a/README.md
+++ b/README.md
@@ -1,17 +1,15 @@
 # postgrestutils
 
-A very basic POSTGREST client and utils
+A very basic PostgREST client and utils
 
 ## Usage
 
-### Setup
-
-#### Django
-  - add `"postgrestutils"` to your `INSTALLED_APPS` setting
-  - add `POSTGREST_UTILS_BASE_URI` (should default to the most frequently used POSTGREST instance in the future) and `POSTGREST_UTILS_JWT` to your project settings
+The client is intended to be used by utilizing the `postgrestutils.Session` as a context manager.
+In fact, it will not work otherwise: the underlying `requests.Session` is only created when the session is entered as a context manager.
 
 ```python
-from postgrestutils.client import pgrest_client
+import postgrestutils
+
 
 params = {
     "select": "id,forename",
@@ -19,21 +17,44 @@ params = {
 }
 
 # this will send a request to 'POSTGREST_UTILS_BASE_URI/kerbals?select=id,forename&forename=eq.Jebediah'
-res = pgrest_client.filter("kerbals", params=params)
+with postgrestutils.Session() as s:
+    res = s.filter('kerbals', params=params)
 ```
 
-#### Other projects
+By default, constructing a new `postgrestutils.Session` takes the settings discussed in [setup](#setup) into account.
+Hence there is no need to specify `base_uri` or `token` explicitly unless you are using more than one API or database role in your project.
+
+Additionally, `postgrestutils.Session` takes `schema: Optional[str] = None`, `parse_dt: bool = True` and `count: Count = Count.NONE` (some of which are explained later on).
+These options are session defaults and may be overridden on a per-request basis, e.g.
 
 ```python
-from postgrestutils.client import pgrest_client
-pgrest_client.configure('your-JWT', base_uri='http://127.0.0.1:3000')
+import postgrestutils
 
-params = {
-    "select": "id,forename"
-}
-res = pgrest_client.filter("kerbals", params=params)
+
+with postgrestutils.Session(parse_dt=False) as s:
+    print(s.get('random_datetime'))  # makes request using parse_dt=False
+    print(s.get('random_datetime', parse_dt=True))  # makes request using parse_dt=True
+    print(s.get('random_datetime'))  # makes request using parse_dt=False
 ```
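What `parse_dt` does can be illustrated with a small standalone sketch. The helper below is a hypothetical stand-in for the package's `datetime_parser` object hook, not the actual implementation: any string value that parses as an ISO 8601 timestamp is converted to a `datetime`.

```python
from datetime import datetime


def parse_row(obj):
    # hypothetical stand-in for the package's datetime_parser object hook:
    # any string value that parses as an ISO 8601 timestamp becomes a datetime
    parsed = {}
    for key, value in obj.items():
        if isinstance(value, str):
            try:
                parsed[key] = datetime.fromisoformat(value)
                continue
            except ValueError:
                pass  # not a timestamp, keep the original string
        parsed[key] = value
    return parsed


row = parse_row({"id": 1, "created": "2021-06-01T12:30:00+00:00"})
```

With `parse_dt=False` the raw string would be returned instead.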
 
+### Setup
+
+Settings are configured as django settings when `postgrestutils` is used in a django project and as environment variables otherwise.
+Django projects additionally need to add `'postgrestutils'` to the `INSTALLED_APPS` setting.
+
+#### Shared settings
+
+| Setting                   | Description
+|---------------------------|---------------------------------------------------------------
+|`POSTGREST_UTILS_BASE_URI` | base uri of the PostgREST instance to use
+|`POSTGREST_UTILS_JWT`      | JWT for the corresponding database role and PostgREST instance
+
+#### Django-only settings
+
+| Setting                   | Description
+|---------------------------|---------------------------------------------------------------
+|`POSTGREST_UTILS_AUTOFETCH`| account columns to fetch on login (comma-separated str)
+
 ### Making requests
 
 `postgrestutils` tries to be as intuitive and pythonic as possible while also being efficient.
@@ -45,12 +66,12 @@ There are however a few differences that will be explained in detail below.
 
 #### Lazy evaluation
 
-Akin to django querysets `postgrestutils` has a `LazyPostgrestJsonResult` that is returned from calls to `pgrest_client.filter()` without making any API calls yet.
+Akin to django querysets `postgrestutils` has a `JsonResultSet` that is returned from calls to `.filter()` without making any API calls yet.
 If you're familiar with django's rules for evaluation, this list won't surprise you.
-Since there are a few subtle differences however here is what will cause evaluation of a `LazyPostgrestJsonResult`:
+Since there are a few subtle differences, however, here is what will cause evaluation of a `JsonResultSet`:
 
 - Iteration.
-A `LazyPostgrestJsonResult` is iterable and will fetch all elements from the API the first time you iterate over it.
+A `JsonResultSet` is iterable and will fetch all elements from the API the first time you iterate over it.
 - Slicing.
 This will fetch the elements in the specified range.
 - `repr()`.
@@ -59,10 +80,10 @@ As a convenience implementation for interactive interpreter sessions this will f
 Unsurprisingly this returns the count of the requested table.
 Depending on the [counting strategy](#counting-strategies) this has different implications such as the cache being populated.
 - `list()`.
-This can be useful to force evaluation of a `LazyPostgrestJsonResult`.
+This can be useful to force evaluation of a `JsonResultSet`.
 - `bool()`.
-Using a `LazyPostgrestJsonResult` in any boolean context will evaluate it.
-- Using the `.get()` method on `pgrest_client`.
+Using a `JsonResultSet` in any boolean context will evaluate it.
+- Using the `.get()` method on a session.
 Getting some lazy object when explicitly requesting a single element doesn't make much sense.
 Like django's `Model.objects.get()` this will return the requested element or raise an `ObjectDoesNotExist`/`MultipleObjectsReturned` if none or multiple objects were found.
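The evaluation-and-cache behaviour described above can be sketched with a minimal stand-in class (illustrative only, not the actual `JsonResultSet` implementation):

```python
class LazyList:
    """Minimal sketch of JsonResultSet-style laziness."""

    def __init__(self, fetch):
        self._fetch = fetch  # callable standing in for the API request
        self._cache = None

    def _fetch_all(self):
        if self._cache is None:
            self._cache = self._fetch()

    def __iter__(self):
        self._fetch_all()  # iteration evaluates and populates the cache
        return iter(self._cache)

    def __len__(self):
        self._fetch_all()  # len() evaluates too (with count=Count.NONE)
        return len(self._cache)


calls = []
lazy = LazyList(lambda: calls.append('hit') or [1, 2, 3])
assert calls == []      # nothing fetched yet
list(lazy); list(lazy)  # first call fetches, second re-uses the cache
assert calls == ['hit']
```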
 
@@ -74,9 +95,10 @@ Great, you already know how pagination works.
 Just to be sure here is a snippet of pagination in action:
 
 ```python
->>> business_roles = pgrest_client.filter('business_role')
->>> business_roles[:3]  # fetches the first 3 business roles
->>> business_roles[3:6]  # fetches the next 3 business roles
+>>> with postgrestutils.Session() as s:
+...     business_roles = s.filter('business_role')
+...     business_roles[:3]  # fetches the first 3 business roles
+...     business_roles[3:6]  # fetches the next 3 business roles
 ```
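Under the hood, slicing translates to PostgREST's `Range` header, whose ranges are inclusive on both ends. A simplified sketch of that mapping (the helper name is illustrative; the real logic lives in `JsonResultSet.__getitem__`):

```python
def slice_to_range(start, stop):
    # PostgREST ranges are inclusive on both ends, so [3:6] becomes '3-5';
    # an open-ended slice like [3:] becomes '3-'
    start = start if start is not None else 0
    return '{}-{}'.format(start, stop - 1 if stop is not None else '')


headers = {'Range-Unit': 'items', 'Range': slice_to_range(3, 6)}
```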
 
 #### Caching
@@ -86,33 +108,36 @@ Here's a short snippet explaining the gist of it:
 
 ```python
 # Bad: Fetches the same data from the API twice
->>> print([role['id'] for role in pgrest_client.filter('business_role')])
->>> print([role['name'] for role in pgrest_client.filter('business_role')])
+>>> with postgrestutils.Session() as s:
+...     print([role['id'] for role in s.filter('business_role')])
+...     print([role['name'] for role in s.filter('business_role')])
 
 # Good: Uses the cache resulting in only a single API request
->>> business_roles = pgrest_client.filter('business_role')
->>> print([role['id'] for role in business_roles])  # fetches all elements into the cache
->>> print([role['name'] for role in business_roles])  # re-uses the cached elements
+>>> with postgrestutils.Session() as s:
+...     business_roles = s.filter('business_role')
+...     print([role['id'] for role in business_roles])  # fetches all elements into the cache
+...     print([role['name'] for role in business_roles])  # re-uses the cached elements
 ```
 
 ##### When results are not cached
 
-There are a few cases where a `LazyPostgrestJsonResult` will not cache results:
+There are a few cases where a `JsonResultSet` will not cache results:
 
 - Indexing and slicing.
 If the cache is not yet populated indexing and slicing - even on the same index/ranges - will result in an API call.
 
 ```python
 # without populated cache
->>> business_roles = pgrest_client.filter('business_role')
->>> business_roles[5]  # fetches the 6th element from the API
->>> business_roles[5]  # fetches the 6th element from the API again
+>>> with postgrestutils.Session() as s:
+...     business_roles = s.filter('business_role')
+...     business_roles[5]  # fetches the 6th element from the API
+...     business_roles[5]  # fetches the 6th element from the API again
 
 # with populated cache
->>> business_roles = pgrest_client.filter('business_role')
->>> list(business_roles)  # fetches all elements from the API
->>> business_roles[5]  # re-uses the cached elements
->>> business_roles[5]  # re-uses the cached elements
+>>> with postgrestutils.Session() as s:
+...     business_roles = s.filter('business_role')
+...     list(business_roles)  # fetches all elements from the API
+...     business_roles[5]  # re-uses the cached elements
+...     business_roles[5]  # re-uses the cached elements
 ```
 
 - `repr()`.
@@ -129,7 +154,7 @@ Any subsequent calls will re-use the cache instead of making any API calls.
 
 ##### Invalidating the cache
 
-If you have a `LazyPostgrestJsonResult` around that you want to re-use but need up-to-date data simply call the `.refresh_from_pgrest()` method on it.
+If you have a `JsonResultSet` around that you want to re-use but need up-to-date data simply call the `.refresh_from_pgrest()` method on it.
 That will lazily refresh data from PostgREST by invalidating the cache.
 Your object will now behave as if you just created it.
 
@@ -141,9 +166,9 @@ PostgREST currently offers two [counting strategies](http://postgrest.org/en/sta
 ##### Using `count=Count.NONE`
 
 If you don't need to know the count for your request this is obviously a good counting strategy to choose.
-But what happens if you need the count and just call `len()` on your `LazyPostgrestJsonResult` anyway?
+But what happens if you need the count and just call `len()` on your `JsonResultSet` anyway?
 This is again similar to what django querysets do.
-It will evaluate the `LazyPostgrestJsonResult` fetching all elements from the API into the cache and return the length of the cache.
+It will evaluate the `JsonResultSet` fetching all elements from the API into the cache and return the length of the cache.
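For reference, the counting strategies are exposed as a simple two-member enum, defined in the package source as:

```python
import enum

# mirrors the definition in postgrestutils/__init__.py
Count = enum.Enum('Count', (('NONE', None), ('EXACT', 'exact')))
```

`Count.NONE` requests no count at all, while `Count.EXACT`'s value ends up in the `Prefer: count=exact` request header.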
 
 ##### Using `count=Count.EXACT`
 
@@ -155,7 +180,7 @@ To do so you need the count of all elements and the first few elements depending
 What you don't need however is all elements so why fetch them?
 
 This counting strategy allows you to get the count without fetching all elements.
-So what happens when calling `len()` on your `LazyPostgrestJsonResult` this time?
+So what happens when calling `len()` on your `JsonResultSet` this time?
 `postgrestutils` will explicitly request the count for your request which will be cheaper for large tables.
 
 Be careful with this for very large tables however as this can take a very long time as explained in the [PostgREST documentation](http://postgrest.org/en/stable/admin.html#count-header-dos).
@@ -164,6 +189,23 @@ As also mentioned there future versions will support estimating the count.
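Internally (see `_fetch_len` in the source) the session sends a minimal `Range: 0-0` request with `Prefer: count=exact` and reads the total from the `Content-Range` response header; the parsing boils down to:

```python
def parse_content_range(header):
    # PostgREST answers e.g. 'Content-Range: 0-0/42' when 'Prefer: count=exact'
    # is set; the total count is the part after the slash
    return int(header.split('/')[-1])
```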
 #### Filtering
 http://postgrest.org/en/stable/api.html
 
+#### Schema switching
+
+As of v7.0.0 PostgREST has added schema switching support.
+By setting `db-schema` to a comma-separated string, multiple schemas may be specified, with the first schema acting as the default.
+Assuming your PostgREST instance has `db-schema = 'foo, bar'` in its configuration:
+
+```python
+import postgrestutils
+
+
+with postgrestutils.Session(schema='bar') as s:
+    s.filter(…)  # uses the 'bar' schema
+    s.filter(…, schema=postgrestutils.DEFAULT_SCHEMA)  # uses the default schema ('foo')
+    s.filter(…, schema='foo')  # explicitly request the 'foo' schema
+    s.filter(…)  # uses the 'bar' schema
+```
+
 ### Django helpers
 
 #### custom `user_account_fetched` signal
@@ -197,13 +239,16 @@ Your callback function could look something like this:
 
 ```python
 # utils.py
-from postgrestutils.client import pgrest_client
+import postgrestutils
 
 
 def your_callback_func(sender, **kwargs):
     request = kwargs['request']
     account = kwargs['account']
-    # fetching some addtional data your project needs frequently using the pgrest_client (person, memberships etc.)
+    # fetch some additional data your project needs frequently (person, memberships etc.)
+    with postgrestutils.Session() as s:
+        person = s.get(…)
+        memberships = list(s.filter(…))
 
     # 'caching' that data in the session
     request.session['account'] = account
@@ -213,8 +258,33 @@ def your_callback_func(sender, **kwargs):
 
 For more information on signals refer to the django docs. They are great. Really.
 
+### Tips and Tricks
+
+#### `functools.partial`
+
+When using different APIs or a common configuration for the `postgrestutils.Session`, it might be annoying to repeat the same `kwargs` every time.
+`functools.partial` can be used to pre-configure a common configuration in a suitable place of your project:
+
+```python
+import functools
+import postgrestutils
+
+
+# Configure a foo_session for a different API your project uses
+foo_session = functools.partial(postgrestutils.Session, base_uri='https://another-api.com/', token='secret-token')
+
+# using the main API
+with postgrestutils.Session() as s:
+    objs = list(s.filter(…))
+
+# using the other API
+with foo_session() as s:
+    foo_objs = list(s.filter(…))
+```
+
 ### Testing
 
 `postgrestutils` has a bunch of unittests because manually testing it has become quite time-consuming.
 The tests aim to ensure functional correctness as well as some performance-related concerns, i.e. caching and laziness.
-With `requests-mock` installed, running is as easy as `python -m unittest`.
+It may also be a good place to look at when trying to understand the guarantees `postgrestutils` makes.
+With `requests-mock` installed, running all tests is as easy as `python -m unittest`.
diff --git a/postgrestutils/__init__.py b/postgrestutils/__init__.py
index 68487c33f9a9e78e17559547e1ad54539e3dceb6..f08987a5541f0634b416c8aa4592747269d6abe1 100644
--- a/postgrestutils/__init__.py
+++ b/postgrestutils/__init__.py
@@ -1,9 +1,336 @@
-import logging
+import copy
+import enum
+import re
+from typing import Optional
+from urllib.parse import urljoin
 
-from requests import HTTPError
+import requests
+from requests import HTTPError  # re-export to allow for exception handling
 
 from . import app_settings
-
-logger = logging.getLogger("postgrestutils")
+from .utils import datetime_parser, logger
 
 default_app_config = "postgrestutils.apps.PostgrestUtilsConfig"
+
+
+REPR_OUTPUT_SIZE = 20
+
+Count = enum.Enum('Count', (('NONE', None), ('EXACT', 'exact')))
+
+DEFAULT_SCHEMA = object()
+
+
+class ObjectDoesNotExist(Exception):
+    pass
+
+
+class MultipleObjectsReturned(Exception):
+    pass
+
+
+class Session:
+    """
+    A PostgREST session to be used as a context manager.
+    The session will default to the options specified on creation. However,
+    some of these may be overridden on a per-request basis by using the `.get()`
+    or `.filter()` methods.
+    """
+    def __init__(
+        self,
+        base_uri: Optional[str] = None,
+        token: Optional[str] = None,
+        schema: Optional[str] = None,
+        parse_dt: bool = True,
+        count: Count = Count.NONE
+    ):
+        """
+        :param base_uri: base uri of the PostgREST instance to use
+        :param token: JWT for the corresponding database role and PostgREST
+        instance
+        :param schema: the database schema to use
+        :param parse_dt: whether to parse datetime strings as returned by
+        PostgREST to python datetime objects
+        :param count: counting strategy as explained in the README
+        """
+        self.session = None
+        self.base_uri = base_uri or app_settings.BASE_URI
+        self.token = token or app_settings.JWT
+        self.schema = schema
+        self.parse_dt = parse_dt
+        self.count = count
+
+    def __enter__(self):
+        self.session = requests.Session()
+        self._configure_session_defaults()
+        return self
+
+    def __exit__(self, exc_type, exc_value, traceback):
+        self.session.close()
+
+    def get(
+        self,
+        endpoint: str,
+        parse_dt: Optional[bool] = None,
+        count: Optional[Count] = None,
+        schema: Optional[str] = None,
+        **kwargs
+    ) -> dict:
+        """
+        Get a single object from the specified endpoint. This will most likely
+        require specifying params so that only a single object is found.
+        :param endpoint: specifies which endpoint to request
+        :param parse_dt: whether to parse datetime strings as returned by
+        PostgREST to python datetime objects
+        :param count: counting strategy as explained in the README
+        :param schema: the database schema to use for this request
+        :param kwargs: pass kwargs directly to requests's `.get()` method
+        :return: single element as dict
+        :raises: `ObjectDoesNotExist`/`MultipleObjectsReturned` if no or more
+        than one object was found
+        """
+        if schema is not None:
+            self._set_schema_header(schema, kwargs)
+        res = JsonResultSet(
+            self,
+            endpoint,
+            True,
+            parse_dt if parse_dt is not None else self.parse_dt,
+            count if count is not None else self.count,
+            **kwargs
+        )
+        # populate the cache
+        # will raise ObjectDoesNotExist/MultipleObjectsReturned if no or
+        # multiple elements are returned
+        res._fetch_all()
+        assert res._result_cache is not None
+        return res._result_cache[0]
+
+    def filter(
+        self,
+        endpoint: str,
+        parse_dt: Optional[bool] = None,
+        count: Optional[Count] = None,
+        schema: Optional[str] = None,
+        **kwargs
+    ) -> 'JsonResultSet':
+        """
+        :param endpoint: specifies which endpoint to request
+        :param parse_dt: whether to parse datetime strings as returned by
+        PostgREST to python datetime objects
+        :param count: counting strategy as explained in the README
+        :param kwargs: pass kwargs directly to request's `.get()` method
+        :return: `JsonResultSet`, a lazy python object
+        """
+        if schema is not None:
+            self._set_schema_header(schema, kwargs)
+        return JsonResultSet(
+            self,
+            endpoint,
+            False,
+            parse_dt if parse_dt is not None else self.parse_dt,
+            count if count is not None else self.count,
+            **kwargs
+        )
+
+    def _configure_session_defaults(self):
+        self.session.headers['Accept'] = 'application/json'
+        if self.token:
+            self.session.headers['Authorization'] = 'Bearer {}'.format(self.token)
+        if self.schema is not None:
+            self.session.headers['Accept-Profile'] = self.schema
+
+    def _set_schema_header(self, schema, kwargs: dict):
+        if schema is DEFAULT_SCHEMA:
+            schema = None
+        kwargs.setdefault('headers', dict())['Accept-Profile'] = schema
+
+
+class JsonResultSet:
+    """
+    A lazy python object that is returned when calling the `.filter()` method
+    on a `postgrestutils.Session`.
+    It behaves akin to django's `Queryset` and implements some dunder methods
+    to ensure pythonic behavior.
+    Check the README for more detailed information.
+    """
+    def __init__(self, client: Session, endpoint: str, singular: bool, parse_dt: bool, count: Count, **kwargs):
+        self._len_cache = None  # type: Optional[int]
+        self._result_cache = None  # type: Optional[list]
+
+        self.client = client  # type: Session
+        self.endpoint = endpoint  # type: str
+        self.singular = singular  # type: bool
+        self.parse_dt = parse_dt  # type: bool
+        self.count = count  # type: Count
+        self.request_kwargs = kwargs
+
+    def __repr__(self):
+        data = list(self[:REPR_OUTPUT_SIZE + 1])
+        if len(data) > REPR_OUTPUT_SIZE:
+            data[-1] = "...(remaining elements truncated)..."
+        return '<{} {}>'.format(self.__class__.__name__, data)
+
+    def __iter__(self):
+        self._fetch_all()
+        return iter(self._result_cache)
+
+    def __len__(self):
+        """
+        NOTE: Since singular requests (using `.get()`) return a python dict
+        rather than a `JsonResultSet`, `self.singular` should be ignored here.
+        """
+        if self.count != Count.NONE:
+            self._fetch_len()
+        else:
+            self._fetch_all()
+        return self._len_cache
+
+    def __getitem__(self, key):
+        """
+        NOTE: Since singular requests (using `.get()`) return a python dict
+        rather than a `JsonResultSet`, `self.singular` should be ignored here.
+        """
+        if not isinstance(key, (int, slice)):
+            raise TypeError(
+                "{self.__class__.__name__} indices must be integers or slices, not {key.__class__.__name__}".format(
+                    self=self,
+                    key=key
+                )
+            )
+        if ((isinstance(key, int) and key < 0) or
+           (isinstance(key, slice) and ((key.start is not None and key.start < 0) or
+                                        (key.stop is not None and key.stop < 0)))):
+            raise ValueError("{self.__class__.__name__} does not support negative indexing".format(self=self))
+        if isinstance(key, slice) and key.step is not None:
+            raise ValueError("{self.__class__.__name__} does not support stepping".format(self=self))
+
+        # cache is not populated and unbounded slice is requested, i.e. res[:]
+        if isinstance(key, slice) and all(e is None for e in (self._result_cache, key.start, key.stop)):
+            self._fetch_all()
+
+        if self._result_cache is not None:
+            return self._result_cache[key]
+
+        if isinstance(key, slice):
+            start = key.start if key.start is not None else 0
+            if key.stop is not None and key.stop <= start:
+                return list()
+            range = '{start}-{stop}'.format(start=start, stop=key.stop - 1 if key.stop is not None else '')
+            return self._fetch_range(range)
+        return self._fetch_range('{0}-{0}'.format(key))[0]  # single element requested, return dict
+
+    def refresh_from_pgrest(self):
+        """Lazily refresh data from PostgREST."""
+        self._result_cache = None
+        self._len_cache = None
+
+    def _fetch_len(self):
+        """
+        Fetch the length from the PostgREST API.
+        NOTE: This method should ignore `self.singular`, see `__len__()` for
+        more information.
+        """
+        if self._len_cache is None:
+            request_kwargs = copy.deepcopy(self.request_kwargs)
+            request_kwargs.setdefault('headers', dict())['Prefer'] = 'count={}'.format(self.count.value)
+            request_kwargs['headers']['Range-Unit'] = 'items'
+            # Have to request something so just fetch the first item
+            request_kwargs['headers']['Range'] = '0-0'
+
+            resp = self.client.session.get(urljoin(self.client.base_uri, self.endpoint), **request_kwargs)
+
+            count = int(resp.headers['Content-Range'].split('/')[-1])
+            self._len_cache = count
+
+            # If the request yields only one element anyway, might as well cache
+            # it. When using singular=False, count=Count.EXACT and the result
+            # is a single element this saves an API request in cases where
+            # len() is called before using the result.
+            if count == 1:
+                self._result_cache = self._parse_response(resp)
+
+    def _fetch_all(self):
+        """
+        Fetch all elements that match the specified params and populate the
+        caches with results.
+        """
+        if self._result_cache is None:
+            request_kwargs = copy.deepcopy(self.request_kwargs)
+
+            if self.singular:
+                request_kwargs.setdefault('headers', dict())['Accept'] = 'application/vnd.pgrst.object+json'
+
+            resp = self.client.session.get(urljoin(self.client.base_uri, self.endpoint), **request_kwargs)
+            self._result_cache = self._parse_response(resp)
+
+            # fetched all elements anyway, caching their length is very cheap
+            self._len_cache = len(self._result_cache)
+
+    def _fetch_range(self, range):
+        """
+        Fetch a range of elements from the PostgREST API.
+        NOTE: This method should ignore `self.singular`, see `__getitem__()` for
+        more information.
+        """
+        request_kwargs = copy.deepcopy(self.request_kwargs)
+        request_kwargs.setdefault('headers', dict())['Range-Unit'] = 'items'
+        request_kwargs['headers']['Range'] = range
+
+        resp = self.client.session.get(urljoin(self.client.base_uri, self.endpoint), **request_kwargs)
+        return self._parse_response(resp)
+
+    def _parse_response(self, resp):
+        """
+        Parse response as json and return the result if it was successful.
+        Attempt to detect common error cases in order to raise meaningful error
+        messages.
+        :return: a list of successfully parsed objects
+        :raises: `ObjectDoesNotExist`/`MultipleObjectsReturned` if `.get()` was
+        used but the API couldn't return exactly one object. `HTTPError` in all
+        other cases e.g. network-related issues.
+        """
+        try:
+            resp.raise_for_status()
+        except requests.HTTPError as e:
+            # try getting a more detailed exception if status_code = 406
+            if resp.status_code == 406:
+                try:
+                    self._try_parse_406(resp)
+                except (ObjectDoesNotExist, MultipleObjectsReturned) as detailed:
+                    raise detailed from e
+
+            # fall back to raising a generic HTTPError exception
+            raise type(e)(resp.status_code, resp.reason, resp.text, response=resp, request=e.request)
+
+        if self.parse_dt:
+            json_result = resp.json(object_hook=datetime_parser)
+        else:
+            json_result = resp.json()
+        # always return a list even if it contains a single element only
+        return [json_result] if self.singular else json_result
+
+    def _try_parse_406(self, resp):
+        """
+        Try parsing a 406 `HTTPError` to raise a more detailed error message.
+        :param resp: the HTTP response to parse
+        :raises: `ObjectDoesNotExist`/`MultipleObjectsReturned` depending on the
+        observed row count.
+        """
+        detail_regex = re.compile(
+            r'Results contain (?P<row_count>\d+) rows, application/vnd\.pgrst\.object\+json requires 1 row'
+        )
+        try:
+            json = resp.json()
+        except ValueError:
+            # Failed parsing as json, give up trying to guess the error.
+            # PostgREST probably changed the error format, log the response for
+            # more insights.
+            logger.warning("Unparsable 406: {}".format(resp.text))
+        else:
+            result = re.match(detail_regex, json['details'])
+            if result is not None:
+                row_count = int(result.group('row_count'))
+                if row_count == 0:
+                    raise ObjectDoesNotExist(json)
+                else:
+                    raise MultipleObjectsReturned(json)
diff --git a/postgrestutils/_django_utils.py b/postgrestutils/_django_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..b08afc7ed401a58e8f05563b983cb330332cb5c2
--- /dev/null
+++ b/postgrestutils/_django_utils.py
@@ -0,0 +1,52 @@
+"""
+Private module to define django-specific utilities.
+It should never be imported directly.
+Instead import the .utils module which will re-export this module's items if
+django is available.
+"""
+
+from datetime import datetime
+from typing import Union
+
+from django.conf import settings
+from django.utils import dateparse
+from django.utils import timezone as django_tz
+
+import postgrestutils
+
+from . import app_settings
+from .signals import user_account_fetched
+
+
+def autofetch(sender, **kwargs):
+    """Fetch user account on login based on the AUTOFETCH configuration"""
+    payload = {
+        'select': app_settings.AUTOFETCH
+    }
+
+    if settings.DEBUG:
+        # prod uuids != dev uuids, fall back on matching accounts by username
+        payload['username'] = 'eq.{}'.format(kwargs['user'].get_username())
+    else:
+        payload['auth_provider_uid'] = 'eq.{}'.format(kwargs['user'].sso_mapping.uuid)
+
+    with postgrestutils.Session() as s:
+        account = s.get('account', params=payload)
+    user_account_fetched.send(sender=None, request=kwargs['request'], account=account)
+
+
+def _try_django_parse_dt(value: str) -> Union[datetime, str]:
+    """
+    Attempt to parse `value` as a `datetime` using django utilities.
+    :param value: the string to parse
+    :return: the parsed `datetime` or the original string if parsing failed
+    """
+    try:
+        parsed_dt = dateparse.parse_datetime(value)
+        if parsed_dt:
+            if django_tz.is_naive(parsed_dt):
+                parsed_dt = django_tz.make_aware(parsed_dt)
+            return parsed_dt
+        return value  # not well formatted
+    except ValueError:
+        return value  # well formatted but not a datetime
diff --git a/postgrestutils/app_settings.py b/postgrestutils/app_settings.py
index 13e06c139e9504802b38ba8e93d3ce2935a19b19..7b8c8a2a7c0641283cbb0078843ec15b1368dc91 100644
--- a/postgrestutils/app_settings.py
+++ b/postgrestutils/app_settings.py
@@ -1,23 +1,31 @@
+import importlib.util
 from typing import Optional
 
 BASE_URI = "http://127.0.0.1:3000"
 
 JWT = None  # type: Optional[str]
 
+SCHEMA = None  # type: Optional[str]
+
 AUTOFETCH = str()
 
-# merge these settings with django.conf.settings
-try:
-    from django.conf import settings as django_settings
-    from django.core.exceptions import ImproperlyConfigured
-    for k in list(globals().keys()):  # list() prevents errors on changes
-        if k.isupper() and not k.startswith("_"):  # looks like a setting
+_DJANGO = importlib.util.find_spec("django") is not None
+
+for k in list(globals().keys()):  # list prevents errors on changes
+    if k.isupper() and not k.startswith("_"):  # looks like a setting
+        if _DJANGO:
+            from django.conf import settings as django_settings
+            from django.core.exceptions import ImproperlyConfigured
             try:
                 new_value = getattr(django_settings, "POSTGREST_UTILS_" + k)
                 globals()[k] = new_value
             except ImproperlyConfigured:
                 pass  # django is installed but not used
             except AttributeError:
-                pass  # django is installed and used, but the setting is not present
-except ImportError:
-    pass  # no django
+                pass  # django is installed and used but the setting is not present
+        else:
+            import os
+            try:
+                globals()[k] = os.environ["POSTGREST_UTILS_" + k]
+            except KeyError:
+                pass  # setting not present
diff --git a/postgrestutils/apps.py b/postgrestutils/apps.py
index 9d219300e765681913e6e599688b7cc543900020..e358eff1a01dc64f56a68b5b464352884cfa4c9b 100644
--- a/postgrestutils/apps.py
+++ b/postgrestutils/apps.py
@@ -1,9 +1,8 @@
 from django.apps import AppConfig
 from django.contrib.auth.signals import user_logged_in
 
-from postgrestutils.helpers import autofetch
-
 from . import app_settings
+from .utils import autofetch
 
 
 class PostgrestUtilsConfig(AppConfig):
diff --git a/postgrestutils/client/__init__.py b/postgrestutils/client/__init__.py
deleted file mode 100644
index 88d9cb0d6c3b4b34ad08bd350db665b3d48602f6..0000000000000000000000000000000000000000
--- a/postgrestutils/client/__init__.py
+++ /dev/null
@@ -1,10 +0,0 @@
-from .. import app_settings
-from .postgrestclient import (
-    Count, MultipleObjectsReturned, ObjectDoesNotExist, PostgrestClient,
-)
-
-# the instance of the client to be used
-pgrest_client = PostgrestClient(app_settings.BASE_URI, app_settings.JWT)
-
-
-__all__ = ['Count', 'pgrest_client', 'ObjectDoesNotExist', 'MultipleObjectsReturned']
diff --git a/postgrestutils/client/postgrestclient.py b/postgrestutils/client/postgrestclient.py
deleted file mode 100644
index 413f5c8800a646519bcdee7c6e60fc5ba92563e5..0000000000000000000000000000000000000000
--- a/postgrestutils/client/postgrestclient.py
+++ /dev/null
@@ -1,262 +0,0 @@
-import copy
-import enum
-import re
-from typing import Optional
-from urllib.parse import urljoin
-
-import requests
-
-from postgrestutils import logger
-from postgrestutils.client.utils import datetime_parser
-
-REPR_OUTPUT_SIZE = 20
-
-Count = enum.Enum('Count', (('NONE', None), ('EXACT', 'exact')))
-
-
-class ObjectDoesNotExist(Exception):
-    pass
-
-
-class MultipleObjectsReturned(Exception):
-    pass
-
-
-class PostgrestClient:
-    def __init__(self, base_uri: str, token: Optional[str]):
-        self.session = requests.Session()
-        self.configure(token, base_uri=base_uri)
-        self.session.headers['Accept'] = 'application/json'
-
-    def configure(self, token: Optional[str], base_uri: Optional[str] = None):
-        """
-        Configure the client to use the specified token and/or base_uri.
-        :param token: the JWT token to use
-        :param base_uri: the base URI of the API to use
-        """
-        if base_uri is not None:
-            self.base_uri = base_uri
-        if token:
-            self.session.headers['Authorization'] = 'Bearer {}'.format(token)
-
-    def get(self, endpoint: str, parse_dt: bool = True, count: Count = Count.NONE, **kwargs):
-        """
-        Get a single object from the specified endpoint. This will most likely
-        require specifying params so that only a single object is found.
-        :param endpoint: specifies which endpoint to request
-        when multiple results are returned)
-        :param parse_dt: if True parses datetime strings as returned by
-        PostgREST to python datetime objects
-        :param count: counting strategy as explained in the README
-        :param kwargs: pass kwargs directly to requests's .get() method
-        :return: single element as dict
-        :raises: ObjectDoesNotExist/MultipleObjectsReturned if no or more than
-        one object was found
-        """
-        res = LazyPostgrestJsonResult(self, endpoint, True, parse_dt, count, **kwargs)
-        # populate the cache
-        # will raise ObjectDoesNotExist/MultipleObjectsReturned if no or
-        # multiple elements are returned
-        res._fetch_all()
-        assert res._result_cache is not None
-        return res._result_cache[0]
-
-    def filter(
-            self,
-            endpoint: str,
-            parse_dt: bool = True,
-            count: Count = Count.NONE,
-            **kwargs) -> 'LazyPostgrestJsonResult':
-        """
-        :param endpoint: specifies which endpoint to request
-        :param parse_dt: if True parses datetime strings as returned by
-        PostgREST to python datetime objects
-        :param count: counting strategy as explained in the README
-        :param kwargs: pass kwargs directly to request's .get() method
-        :return: a lazy python object
-        """
-        return LazyPostgrestJsonResult(self, endpoint, False, parse_dt, count, **kwargs)
-
-
-class LazyPostgrestJsonResult:
-    def __init__(self, client: PostgrestClient, endpoint: str, singular: bool, parse_dt: bool, count: Count, **kwargs):
-        self._len_cache = None  # type: Optional[int]
-        self._result_cache = None  # type: Optional[list]
-
-        self.client = client  # type: PostgrestClient
-        self.endpoint = endpoint  # type: str
-        self.singular = singular  # type: bool
-        self.parse_dt = parse_dt  # type: bool
-        self.count = count  # type: Count
-        self.request_kwargs = kwargs
-
-    def __repr__(self):
-        data = list(self[:REPR_OUTPUT_SIZE + 1])
-        if len(data) > REPR_OUTPUT_SIZE:
-            data[-1] = "...(remaining elements truncated)..."
-        return '<{} {}>'.format(self.__class__.__name__, data)
-
-    def __iter__(self):
-        self._fetch_all()
-        return iter(self._result_cache)
-
-    def __len__(self):
-        """
-        NOTE: Since singular requests (using .get()) return a python dict rather
-        than a LazyPostgrestJsonResult, self.singular should be ignored here.
-        """
-        if self.count != Count.NONE:
-            self._fetch_len()
-        else:
-            self._fetch_all()
-        return self._len_cache
-
-    def __getitem__(self, key):
-        """
-        NOTE: Since singular requests (using .get()) return a python dict rather
-        than a LazyPostgrestJsonResult, self.singular should be ignored here.
-        """
-        if not isinstance(key, (int, slice)):
-            raise TypeError(
-                "{self.__class__.__name__} indices must be integers or slices, not {key.__class__.__name__}".format(
-                    self=self,
-                    key=key
-                )
-            )
-        if ((isinstance(key, int) and key < 0) or
-           (isinstance(key, slice) and ((key.start is not None and key.start < 0) or
-                                        (key.stop is not None and key.stop < 0)))):
-            raise ValueError("{self.__class__.__name__} does not support negative indexing".format(self=self))
-        if isinstance(key, slice) and key.step is not None:
-            raise ValueError("{self.__class__.__name__} does not support stepping".format(self=self))
-
-        # cache is not populated and unbounded slice is requested, i.e. res[:]
-        if isinstance(key, slice) and all(e is None for e in (self._result_cache, key.start, key.stop)):
-            self._fetch_all()
-
-        if self._result_cache is not None:
-            return self._result_cache[key]
-
-        if isinstance(key, slice):
-            start = key.start if key.start is not None else 0
-            if key.stop is not None and key.stop <= start:
-                return list()
-            range = '{start}-{stop}'.format(start=start, stop=key.stop - 1 if key.stop is not None else '')
-            return self._fetch_range(range)
-        return self._fetch_range('{0}-{0}'.format(key))[0]  # single element requested, return dict
-
-    def refresh_from_pgrest(self):
-        """Lazily refresh data from PostgREST"""
-        self._result_cache = None
-        self._len_cache = None
-
-    def _fetch_len(self):
-        """
-        Fetch the length from the PostgREST API.
-        NOTE: This method should ignore self.singular, see __len__() for more
-        information.
-        """
-        if self._len_cache is None:
-            request_kwargs = copy.deepcopy(self.request_kwargs)
-            request_kwargs.setdefault('headers', dict())['Prefer'] = 'count={}'.format(self.count.value)
-            request_kwargs['headers']['Range-Unit'] = 'items'
-            # Have to request something so just fetch the first item
-            request_kwargs['headers']['Range'] = '0-0'
-
-            resp = self.client.session.get(urljoin(self.client.base_uri, self.endpoint), **request_kwargs)
-
-            count = int(resp.headers['Content-Range'].split('/')[-1])
-            self._len_cache = count
-
-            # If the request yields only one element anyway, might as well cache
-            # it. When using singular=False, count=Count.EXACT and the result
-            # is a single element this saves an API request in cases where
-            # len() is called before using the result.
-            if count == 1:
-                self._result_cache = self._parse_response(resp)
-
-    def _fetch_all(self):
-        """
-        Fetch all elements that match the specified params and populate the
-        caches with results.
-        """
-        if self._result_cache is None:
-            request_kwargs = copy.deepcopy(self.request_kwargs)
-
-            if self.singular:
-                request_kwargs.setdefault('headers', dict())['Accept'] = 'application/vnd.pgrst.object+json'
-
-            resp = self.client.session.get(urljoin(self.client.base_uri, self.endpoint), **request_kwargs)
-            self._result_cache = self._parse_response(resp)
-
-            # fetched all elements anyway, caching their length is very cheap
-            self._len_cache = len(self._result_cache)
-
-    def _fetch_range(self, range):
-        """
-        Fetch a range of elements from the PostgREST API.
-        NOTE: This method should ignore self.singular, see __getitem__() for
-        more information.
-        """
-        request_kwargs = copy.deepcopy(self.request_kwargs)
-        request_kwargs.setdefault('headers', dict())['Range-Unit'] = 'items'
-        request_kwargs['headers']['Range'] = range
-
-        resp = self.client.session.get(urljoin(self.client.base_uri, self.endpoint), **request_kwargs)
-        return self._parse_response(resp)
-
-    def _parse_response(self, resp):
-        """
-        Parse response as json and return the result if it was successful.
-        Attempt to detect common error cases in order to raise meaningful error
-        messages.
-        :return: a list of successfully parsed objects
-        :raises: ObjectDoesNotExist/MultipleObjectsReturned if .get() was used
-        but the API couldn't return exactly one object. HTTPError in all other
-        cases e.g. network-related issues.
-        """
-        try:
-            resp.raise_for_status()
-        except requests.HTTPError as e:
-            # try getting a more detailed exception if status_code = 406
-            if resp.status_code == 406:
-                try:
-                    self._try_parse_406(resp)
-                except (ObjectDoesNotExist, MultipleObjectsReturned) as detailed:
-                    raise detailed from e
-
-            # fall back to raising a generic HTTPError exception
-            raise type(e)(resp.status_code, resp.reason, resp.text, response=resp, request=e.request)
-
-        if self.parse_dt:
-            json_result = resp.json(object_hook=datetime_parser)
-        else:
-            json_result = resp.json()
-        # always return a list even if it contains a single element only
-        return [json_result] if self.singular else json_result
-
-    def _try_parse_406(self, resp):
-        """
-        Try parsing a 406 HTTPError to raise a more detailed error message.
-        :param resp: the HTTP response to parse
-        :raises: ObjectDoesNotExist/MultipleObjectsReturned depending on the
-        observed row count.
-        """
-        detail_regex = re.compile(
-            r'Results contain (?P<row_count>\d+) rows, application/vnd\.pgrst\.object\+json requires 1 row'
-        )
-        try:
-            json = resp.json()
-        except ValueError:
-            # Failed parsing as json, give up trying to guess the error.
-            # PostgREST probably changed the error format, log the response for
-            # more insights.
-            logger.warning("Unparsable 406: {}".format(resp.text))
-        else:
-            result = re.match(detail_regex, json['details'])
-            if result is not None:
-                row_count = int(result.group('row_count'))
-                if row_count == 0:
-                    raise ObjectDoesNotExist(json)
-                else:
-                    raise MultipleObjectsReturned(json)
diff --git a/postgrestutils/client/utils.py b/postgrestutils/client/utils.py
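The counting and slicing machinery deleted above survives in the new session API: `_fetch_len` sends `Prefer: count=exact` plus `Range: 0-0` and reads the total from the `Content-Range` response header (`"0-0/5"` means item 0 out of 5), while `__getitem__` turns a Python slice into an inclusive item range. A minimal sketch of those two translations (helper names are hypothetical):

```python
def total_from_content_range(header: str) -> int:
    # PostgREST answers a ranged request with e.g. "Content-Range: 0-0/5";
    # the part after the slash is the total row count.
    return int(header.split('/')[-1])


def slice_to_range(start: int, stop: int) -> str:
    # A half-open Python slice [2:5] becomes the inclusive
    # PostgREST item range "2-4" (see __getitem__ above).
    return '{start}-{stop}'.format(start=start, stop=stop - 1)
```

Requesting `Range: 0-0` is the cheapest way to obtain the count, since PostgREST only reports it alongside an actual (possibly tiny) result page.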
deleted file mode 100644
index fa5622191630b458cf813f204457e6125567f4a2..0000000000000000000000000000000000000000
--- a/postgrestutils/client/utils.py
+++ /dev/null
@@ -1,65 +0,0 @@
-import re
-from datetime import datetime, timedelta, timezone
-
-DJANGO = False
-try:
-    from django.utils import dateparse, timezone as django_tz
-    DJANGO = True
-except ImportError:
-    pass
-
-# this regex matches postgres' JSON formatting for timestamps
-JSON_TIMESTAMP_REGEX = re.compile(r"(?P<year>\d{4})-"
-                                  r"(?P<month>\d{2})-"
-                                  r"(?P<day>\d{2})"
-                                  r"T(?P<hour>\d{2}):"
-                                  r"(?P<minute>\d{2}):"
-                                  r"(?P<second>\d{2})\."
-                                  r"(?P<microsecond>\d{1,6})"
-                                  r"((?P<offsetsign>[+-])"
-                                  r"(?P<offsethours>\d{2}):"
-                                  r"(?P<offsetminutes>\d{2}))?$")
-
-
-def clean_parts(parts):
-    cleaned = {}
-    for key, value in parts.items():
-        if value:
-            if "offset" not in key:
-                cleaned[key] = int(value)
-            else:
-                cleaned[key] = value
-    return cleaned
-
-
-def datetime_parser(json_dict):
-    for key, value in json_dict.items():
-        if isinstance(value, str):
-            if DJANGO:
-                try:
-                    parsed_dt = dateparse.parse_datetime(value)
-                    if parsed_dt:
-                        if django_tz.is_naive(parsed_dt):
-                            parsed_dt = django_tz.make_aware(parsed_dt)
-                        json_dict[key] = parsed_dt
-                except ValueError:
-                    pass  # not a datetime, leave the string as it is
-            else:
-                match = JSON_TIMESTAMP_REGEX.match(value)
-                if match:
-                    parts = clean_parts(match.groupdict())
-                    if parts.get('offsetsign') and parts.get('offsethours') and parts.get('offsetminutes'):
-                        sign = -1 if parts.pop('offsetsign', '+') == '-' else 1
-                        tz = timezone(
-                            offset=sign * timedelta(
-                                hours=int(parts.pop('offsethours')),
-                                minutes=int(parts.pop('offsetminutes'))
-                            )
-                        )
-                        parsed_dt = datetime(**parts).replace(tzinfo=tz).astimezone()
-                    else:
-                        # naive datetime so we assume local time
-                        local_tz = datetime.now(timezone.utc).astimezone().tzinfo
-                        parsed_dt = datetime(**parts).replace(tzinfo=local_tz)
-                    json_dict[key] = parsed_dt
-    return json_dict
diff --git a/postgrestutils/helpers.py b/postgrestutils/helpers.py
deleted file mode 100644
index 2e2ec344e8c109c8d9e1dcaf5f47837718528212..0000000000000000000000000000000000000000
--- a/postgrestutils/helpers.py
+++ /dev/null
@@ -1,22 +0,0 @@
-from django.conf import settings
-
-from postgrestutils.client import pgrest_client
-
-from . import app_settings
-from .signals import user_account_fetched
-
-
-def autofetch(sender, **kwargs):
-    """Fetch user account on login based on the AUTOFETCH configuration"""
-    payload = {
-        'select': app_settings.AUTOFETCH
-    }
-
-    if settings.DEBUG:
-        # prod uuids != dev uuids, thus fall back on matching accounts by username
-        payload['username'] = 'eq.{}'.format(kwargs['user'].get_username())
-    else:
-        payload['auth_provider_uid'] = 'eq.{}'.format(kwargs['user'].sso_mapping.uuid)
-
-    account = pgrest_client.get('account', params=payload, singular=True)
-    user_account_fetched.send(sender=None, request=kwargs['request'], account=account)
diff --git a/postgrestutils/utils.py b/postgrestutils/utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..6eab7a3e9484e5ceb26f04ca7e2542aa1b2cde50
--- /dev/null
+++ b/postgrestutils/utils.py
@@ -0,0 +1,77 @@
+import importlib.util
+import logging
+import re
+from datetime import datetime, timedelta, timezone
+from typing import Dict, Union
+
+_DJANGO = importlib.util.find_spec("django") is not None
+if _DJANGO:
+    from ._django_utils import _try_django_parse_dt, autofetch
+
+
+logger = logging.getLogger("postgrestutils")
+
+# this regex matches postgres' JSON formatting for timestamps
+JSON_TIMESTAMP_REGEX = re.compile(r"(?P<year>\d{4})-"
+                                  r"(?P<month>\d{2})-"
+                                  r"(?P<day>\d{2})"
+                                  r"T(?P<hour>\d{2}):"
+                                  r"(?P<minute>\d{2}):"
+                                  r"(?P<second>\d{2})\."
+                                  r"(?P<microsecond>\d{1,6})"
+                                  r"((?P<offsetsign>[+-])"
+                                  r"(?P<offsethours>\d{2}):"
+                                  r"(?P<offsetminutes>\d{2}))?$")
+
+
+def _clean_parts(parts: Dict[str, str]):
+    cleaned = {}  # type: Dict[str, Union[int, str]]
+    for key, value in parts.items():
+        if value:
+            if "offset" not in key:
+                cleaned[key] = int(value)
+            else:
+                cleaned[key] = value
+    return cleaned
+
+
+def _try_python_parse_dt(value: str) -> Union[datetime, str]:
+    """
+    Attempt to parse value as a datetime using only python utilities.
+    :param value: the string to parse
+    :return: the parsed `datetime` or the original string if parsing failed
+    """
+    match = JSON_TIMESTAMP_REGEX.match(value)
+    if match:
+        parts = _clean_parts(match.groupdict())
+        if parts.get('offsetsign') and parts.get('offsethours') and parts.get('offsetminutes'):
+            sign = -1 if parts.pop('offsetsign', '+') == '-' else 1
+            tz = timezone(
+                offset=sign * timedelta(
+                    hours=int(parts.pop('offsethours')),
+                    minutes=int(parts.pop('offsetminutes'))
+                )
+            )
+            parsed_dt = datetime(**parts).replace(tzinfo=tz).astimezone()
+        else:
+            # naive datetime so we assume local time
+            local_tz = datetime.now(timezone.utc).astimezone().tzinfo
+            parsed_dt = datetime(**parts).replace(tzinfo=local_tz)
+        return parsed_dt
+    return value
+
+
+_try_parse_dt = _try_django_parse_dt if _DJANGO else _try_python_parse_dt
+
+
+def datetime_parser(json_dict: dict) -> dict:
+    """
+    A function to use as `object_hook` when deserializing JSON that parses
+    datetime strings to timezone-aware datetime objects.
+    :param json_dict: the original `json_dict` to process
+    :return: the modified `json_dict`
+    """
+    for key, value in json_dict.items():
+        if isinstance(value, str):
+            json_dict[key] = _try_parse_dt(value)
+    return json_dict
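`datetime_parser` above is meant to be passed as `object_hook` when deserializing JSON (the client does this via `resp.json(object_hook=datetime_parser)`). A standalone sketch of the same idea, substituting `datetime.fromisoformat` for the module's regex/Django parsers for brevity:

```python
import json
from datetime import datetime, timezone


def datetime_hook(json_dict):
    # Try to parse every string value as an ISO-8601 timestamp and
    # leave it untouched if parsing fails, as datetime_parser does above.
    for key, value in json_dict.items():
        if isinstance(value, str):
            try:
                json_dict[key] = datetime.fromisoformat(value)
            except ValueError:
                pass  # not a timestamp, keep the string
    return json_dict


payload = '{"id": 1337, "random": "2020-05-20T08:35:06.659425+00:00"}'
parsed = json.loads(payload, object_hook=datetime_hook)
```

Non-string values (like `id` here) pass through unchanged, and offset-aware strings come back as timezone-aware `datetime` objects.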
diff --git a/tests/test_postgrestclient.py b/tests/test_postgrestclient.py
index f682e2e1352f3c01156b0795330bb506952cf9b5..2c93e8086d9541ac11ad51276cfdf1ee27ac9673 100644
--- a/tests/test_postgrestclient.py
+++ b/tests/test_postgrestclient.py
@@ -1,14 +1,15 @@
 import datetime
+import functools
 from unittest import TestCase
 
 from requests_mock import Mocker
 
-from postgrestutils.client import (
-    Count, MultipleObjectsReturned, ObjectDoesNotExist,
-)
-from postgrestutils.client.postgrestclient import LazyPostgrestJsonResult
+import postgrestutils
 
 TOKEN = 'JWT_token'
+
+default_session = functools.partial(postgrestutils.Session, base_uri='http://example.com/', token=TOKEN)
+
 DEFAULT_HEADERS = {
     'Authorization': 'Bearer {}'.format(TOKEN),
     'Accept': 'application/json'
@@ -82,10 +83,6 @@ SUPERHERO_TEST_DATA = [
 class TestPgrestClientGet(TestCase):
     def setUp(self):
         super().setUp()
-        from postgrestutils.client import pgrest_client
-        pgrest_client.configure(TOKEN, base_uri='http://example.com/')
-
-        self.pgrest_client = pgrest_client
         self.data = SUPERHERO_TEST_DATA[0]
 
     def test_single_object_returned(self, mock):
@@ -97,10 +94,11 @@ class TestPgrestClientGet(TestCase):
             reason='OK',
             json=self.data
         )
-        params = {'id': 'eq.1000000000'}
-        res = self.pgrest_client.get('superhero', params=params)
+        with default_session() as s:
+            params = {'id': 'eq.1000000000'}
+            res = s.get('superhero', params=params)
 
-        self.assertDictEqual(res, self.data)
+        self.assertEqual(res, self.data)
         self.assertTrue(mock.called_once)
 
     def test_object_does_not_exist(self, mock):
@@ -113,10 +111,10 @@ class TestPgrestClientGet(TestCase):
             text="""{"details":"Results contain 0 rows, application/vnd.pgrst.object+json requires 1 row","message":"""
                  """"JSON object requested, multiple (or no) rows returned"}"""
         )
-        params = {'id': 'eq.1337'}
 
-        with self.assertRaises(ObjectDoesNotExist):
-            self.pgrest_client.get('superhero', params=params)
+        with default_session() as s, self.assertRaises(postgrestutils.ObjectDoesNotExist):
+            params = {'id': 'eq.1337'}
+            s.get('superhero', params=params)
         self.assertTrue(mock.called_once)
 
     def test_multiple_objects_returned(self, mock):
@@ -130,8 +128,8 @@ class TestPgrestClientGet(TestCase):
                  """"JSON object requested, multiple (or no) rows returned"}"""
         )
 
-        with self.assertRaises(MultipleObjectsReturned):
-            self.pgrest_client.get('superhero')
+        with default_session() as s, self.assertRaises(postgrestutils.MultipleObjectsReturned):
+            s.get('superhero')
         self.assertTrue(mock.called_once)
 
     def test_datetime_parser(self, mock):
@@ -147,10 +145,11 @@ class TestPgrestClientGet(TestCase):
             reason='OK',
             json={'id': 1337, 'random': "2020-05-20T08:35:06.659425+00:00"}
         )
-        params = {'id': 'eq.1337'}
-        res = self.pgrest_client.get('random_datetime', params=params)
+        with default_session() as s:
+            params = {'id': 'eq.1337'}
+            res = s.get('random_datetime', params=params)
 
-        self.assertDictEqual(res, expected)
+        self.assertEqual(res, expected)
         self.assertTrue(mock.called_once)
 
     def test_without_datetime_parser(self, mock):
@@ -163,10 +162,11 @@ class TestPgrestClientGet(TestCase):
             reason='OK',
             json=test_json
         )
-        params = {'select': 'id,random', 'id': 'eq.1337'}
-        res = self.pgrest_client.get('random_datetime', params=params, parse_dt=False)
+        with default_session() as s:
+            params = {'select': 'id,random', 'id': 'eq.1337'}
+            res = s.get('random_datetime', params=params, parse_dt=False)
 
-        self.assertDictEqual(res, test_json)
+        self.assertEqual(res, test_json)
         self.assertTrue(mock.called_once)
 
 
@@ -174,10 +174,6 @@ class TestPgrestClientGet(TestCase):
 class TestPgrestClientFilterStrategyNone(TestCase):
     def setUp(self):
         super().setUp()
-        from postgrestutils.client import pgrest_client
-        pgrest_client.configure(TOKEN, base_uri='http://example.com/')
-
-        self.pgrest_client = pgrest_client
         self.data = SUPERHERO_TEST_DATA
 
     def test_fetch_all_first(self, mock):
@@ -189,22 +185,23 @@ class TestPgrestClientFilterStrategyNone(TestCase):
             reason='OK',
             json=self.data
         )
-        res = self.pgrest_client.filter('superhero')
-
-        self.assertIsInstance(res, LazyPostgrestJsonResult)  # should return lazy object
-        self.assertFalse(mock.called)  # no request should have been made yet
-
-        self.assertListEqual(list(res), self.data)  # fetch data
-        self.assertTrue(mock.called_once)  # should have been called once
-        self.assertListEqual(res._result_cache, self.data)  # fetched data should be cached
-        self.assertEqual(res._len_cache, len(self.data))  # len of fetched data should be cached
-        self.assertListEqual(list(res), self.data)  # should utilize cache
-        self.assertListEqual(res[:1], self.data[:1])  # should utilize cache
-        self.assertListEqual(res[:0], self.data[:0])  # should return empty list
-        self.assertListEqual(res[4:2], self.data[4:2])  # should return empty list
-        self.assertListEqual(res[2:], self.data[2:])  # should utilize cache
-        self.assertDictEqual(res[0], self.data[0])  # should utilize cache
-        self.assertTrue(mock.called_once)  # should not have been called again
+        with default_session() as s:
+            res = s.filter('superhero')
+
+            self.assertIsInstance(res, postgrestutils.JsonResultSet)  # should return lazy object
+            self.assertFalse(mock.called)  # no request should have been made yet
+
+            self.assertEqual(list(res), self.data)  # fetch data
+            self.assertTrue(mock.called_once)  # should have been called once
+            self.assertEqual(res._result_cache, self.data)  # fetched data should be cached
+            self.assertEqual(res._len_cache, len(self.data))  # len of fetched data should be cached
+            self.assertEqual(list(res), self.data)  # should utilize cache
+            self.assertEqual(res[:1], self.data[:1])  # should utilize cache
+            self.assertEqual(res[:0], self.data[:0])  # should return empty list
+            self.assertEqual(res[4:2], self.data[4:2])  # should return empty list
+            self.assertEqual(res[2:], self.data[2:])  # should utilize cache
+            self.assertEqual(res[0], self.data[0])  # should utilize cache
+            self.assertTrue(mock.called_once)  # should not have been called again
 
     def test_fetch_len_first(self, mock):
         mock.register_uri(
@@ -215,22 +212,23 @@ class TestPgrestClientFilterStrategyNone(TestCase):
             reason='OK',
             json=self.data
         )
-        res = self.pgrest_client.filter('superhero')
-
-        self.assertIsInstance(res, LazyPostgrestJsonResult)  # should return lazy object
-        self.assertFalse(mock.called)  # no request should have been made yet
-
-        self.assertEqual(len(res), len(self.data))  # should fetch len
-        self.assertTrue(mock.called_once)  # should have been called once
-        self.assertEqual(res._len_cache, len(self.data))  # len of fetched data should be cached
-        self.assertListEqual(res._result_cache, self.data)  # results should be cached (counting strategy none)
-        self.assertListEqual(res[:1], self.data[:1])  # should utilize cache
-        self.assertListEqual(res[:0], self.data[:0])  # should return empty list
-        self.assertListEqual(res[4:2], self.data[4:2])  # should return empty list
-        self.assertListEqual(res[2:], self.data[2:])  # should utilize cache
-        self.assertDictEqual(res[0], self.data[0])  # should utilize cache
-        self.assertListEqual(list(res), self.data)  # should utilize cache
-        self.assertTrue(mock.called_once)  # should not have been called again
+        with default_session() as s:
+            res = s.filter('superhero')
+
+            self.assertIsInstance(res, postgrestutils.JsonResultSet)  # should return lazy object
+            self.assertFalse(mock.called)  # no request should have been made yet
+
+            self.assertEqual(len(res), len(self.data))  # should fetch len
+            self.assertTrue(mock.called_once)  # should have been called once
+            self.assertEqual(res._len_cache, len(self.data))  # len of fetched data should be cached
+            self.assertEqual(res._result_cache, self.data)  # results should be cached (counting strategy none)
+            self.assertEqual(res[:1], self.data[:1])  # should utilize cache
+            self.assertEqual(res[:0], self.data[:0])  # should return empty list
+            self.assertEqual(res[4:2], self.data[4:2])  # should return empty list
+            self.assertEqual(res[2:], self.data[2:])  # should utilize cache
+            self.assertEqual(res[0], self.data[0])  # should utilize cache
+            self.assertEqual(list(res), self.data)  # should utilize cache
+            self.assertTrue(mock.called_once)  # should not have been called again
 
     def test_cache_fetching_unbounded_slice(self, mock):
         mock.register_uri(
@@ -241,31 +239,28 @@ class TestPgrestClientFilterStrategyNone(TestCase):
             reason='OK',
             json=self.data
         )
-        res = self.pgrest_client.filter('superhero')
+        with default_session() as s:
+            res = s.filter('superhero')
 
-        self.assertIsInstance(res, LazyPostgrestJsonResult)  # should return lazy object
-        self.assertFalse(mock.called)  # no request should have been made yet
+            self.assertIsInstance(res, postgrestutils.JsonResultSet)  # should return lazy object
+            self.assertFalse(mock.called)  # no request should have been made yet
 
-        self.assertListEqual(res[:], self.data)  # fetch data
-        self.assertTrue(mock.called_once)  # should have been called once
-        self.assertListEqual(res._result_cache, self.data)  # fetched data should be cached
-        self.assertEqual(res._len_cache, len(self.data))  # len of fetched data should be cached
-        self.assertListEqual(res[:], self.data)  # should utilize cache
-        self.assertListEqual(res[:0], self.data[:0])  # should return empty list
-        self.assertListEqual(res[4:2], self.data[4:2])  # should return empty list
-        self.assertListEqual(res[2:], self.data[2:])  # should utilize cache
-        self.assertDictEqual(res[0], self.data[0])  # should utilize cache
-        self.assertTrue(mock.called_once)  # should not have been called again
+            self.assertEqual(res[:], self.data)  # fetch data
+            self.assertTrue(mock.called_once)  # should have been called once
+            self.assertEqual(res._result_cache, self.data)  # fetched data should be cached
+            self.assertEqual(res._len_cache, len(self.data))  # len of fetched data should be cached
+            self.assertEqual(res[:], self.data)  # should utilize cache
+            self.assertEqual(res[:0], self.data[:0])  # should return empty list
+            self.assertEqual(res[4:2], self.data[4:2])  # should return empty list
+            self.assertEqual(res[2:], self.data[2:])  # should utilize cache
+            self.assertEqual(res[0], self.data[0])  # should utilize cache
+            self.assertTrue(mock.called_once)  # should not have been called again
 
 
 @Mocker()
 class TestPgrestClientFilterStrategyExact(TestCase):
     def setUp(self):
         super().setUp()
-        from postgrestutils.client import pgrest_client
-        pgrest_client.configure(TOKEN, base_uri='http://example.com/')
-
-        self.pgrest_client = pgrest_client
         self.data = SUPERHERO_TEST_DATA
 
     def test_fetch_all_first(self, mock):
@@ -278,22 +273,23 @@ class TestPgrestClientFilterStrategyExact(TestCase):
             reason='OK',
             json=self.data
         )
-        res = self.pgrest_client.filter('superhero', count=Count.EXACT)
-
-        self.assertIsInstance(res, LazyPostgrestJsonResult)  # should return lazy object
-        self.assertFalse(mock.called)  # no request should have been made yet
-
-        self.assertListEqual(list(res), self.data)  # fetch data
-        self.assertTrue(mock.called_once)  # should have been called once
-        self.assertListEqual(res._result_cache, self.data)  # fetched data should be cached
-        self.assertEqual(res._len_cache, len(self.data))  # len of fetched data should also be cached
-        self.assertListEqual(list(res), self.data)  # should utilize cache
-        self.assertListEqual(res[:1], self.data[:1])  # should utilize cache
-        self.assertListEqual(res[:0], self.data[:0])  # should return empty list
-        self.assertListEqual(res[4:2], self.data[4:2])  # should return empty list
-        self.assertListEqual(res[2:], self.data[2:])  # should utilize cache
-        self.assertDictEqual(res[0], self.data[0])  # should utilize cache
-        self.assertTrue(mock.called_once)  # should not have been called again
+        with default_session(count=postgrestutils.Count.EXACT) as s:
+            res = s.filter('superhero')
+
+            self.assertIsInstance(res, postgrestutils.JsonResultSet)  # should return lazy object
+            self.assertFalse(mock.called)  # no request should have been made yet
+
+            self.assertEqual(list(res), self.data)  # fetch data
+            self.assertTrue(mock.called_once)  # should have been called once
+            self.assertEqual(res._result_cache, self.data)  # fetched data should be cached
+            self.assertEqual(res._len_cache, len(self.data))  # len of fetched data should also be cached
+            self.assertEqual(list(res), self.data)  # should utilize cache
+            self.assertEqual(res[:1], self.data[:1])  # should utilize cache
+            self.assertEqual(res[:0], self.data[:0])  # should return empty list
+            self.assertEqual(res[4:2], self.data[4:2])  # should return empty list
+            self.assertEqual(res[2:], self.data[2:])  # should utilize cache
+            self.assertEqual(res[0], self.data[0])  # should utilize cache
+            self.assertTrue(mock.called_once)  # should not have been called again
 
     def test_fetch_len_first(self, mock):
         # in order to fetch all
@@ -335,20 +331,138 @@ class TestPgrestClientFilterStrategyExact(TestCase):
             headers={'Content-Range': '0-0/5'},
             json=self.data[0]
         )
-        res = self.pgrest_client.filter('superhero', count=Count.EXACT)
-
-        self.assertIsInstance(res, LazyPostgrestJsonResult)  # should return lazy object
-        self.assertFalse(mock.called)  # no request should have been made yet
-
-        self.assertEqual(len(res), len(self.data))  # should fetch len
-        self.assertTrue(mock.called_once)  # should have been called once
-        self.assertEqual(res._len_cache, len(self.data))  # len of fetched data should be cached
-        self.assertListEqual(res[:1], self.data[:1])  # should fetch first element as range
-        self.assertListEqual(res[:0], self.data[:0])  # should return empty list
-        self.assertListEqual(res[4:2], self.data[4:2])  # should return empty list
-        self.assertListEqual(res[2:], self.data[2:])  # should fetch range starting at index 2
-        self.assertDictEqual(res[0], self.data[0])  # should fetch first element as range but return dict
-        self.assertListEqual(list(res), self.data)  # should fetch all elements
-        self.assertListEqual(res._result_cache, self.data)  # should cache all elements
-        self.assertTrue(mock.called)  # should have been called at least once
-        self.assertEqual(mock.call_count, 5)  # should have only been called 5 times (fetch len, range, first and all)
+        with default_session(count=postgrestutils.Count.EXACT) as s:
+            res = s.filter('superhero')
+
+            self.assertIsInstance(res, postgrestutils.JsonResultSet)  # should return lazy object
+            self.assertFalse(mock.called)  # no request should have been made yet
+
+            self.assertEqual(len(res), len(self.data))  # should fetch len
+            self.assertTrue(mock.called_once)  # should have been called once
+            self.assertEqual(res._len_cache, len(self.data))  # len of fetched data should be cached
+            self.assertEqual(res[:1], self.data[:1])  # should fetch first element as range
+            self.assertEqual(res[:0], self.data[:0])  # should return empty list
+            self.assertEqual(res[4:2], self.data[4:2])  # should return empty list
+            self.assertEqual(res[2:], self.data[2:])  # should fetch range starting at index 2
+            self.assertEqual(res[0], self.data[0])  # should fetch first element as range but return dict
+            self.assertEqual(list(res), self.data)  # should fetch all elements
+            self.assertEqual(res._result_cache, self.data)  # should cache all elements
+            self.assertTrue(mock.called)  # should have been called at least once
+            self.assertEqual(mock.call_count, 5)  # should have been called 5 times (len, two ranges, first element, all)
+
+
+@Mocker()
+class TestPgrestClientSessionDefaults(TestCase):
+    def setUp(self):
+        super().setUp()
+        self.data = SUPERHERO_TEST_DATA
+
+    def test_override_parse_dt_session_option(self, mock):
+        test_json = {'id': 1337, 'random': '2020-05-20T08:35:06.659425+00:00'}
+        mock.register_uri(
+            'GET',
+            'http://example.com/random_datetime',
+            request_headers={**DEFAULT_HEADERS, **{'Accept': 'application/vnd.pgrst.object+json'}},
+            status_code=200,
+            reason='OK',
+            json=test_json
+        )
+        with default_session(parse_dt=False) as s:
+            params = {'select': 'id,random', 'id': 'eq.1337'}
+            res = s.get('random_datetime', params=params)
+            self.assertEqual(res, test_json)
+            self.assertTrue(mock.called_once)
+
+            mock.reset()
+
+            res2 = s.get('random_datetime', params=params, parse_dt=True)
+            expected = {
+                'id': 1337,
+                'random': datetime.datetime(2020, 5, 20, 8, 35, 6, 659425, tzinfo=datetime.timezone.utc)
+            }
+            self.assertEqual(res2, expected)
+            self.assertTrue(mock.called_once)
+
+    def test_override_count_session_option(self, mock):
+        # in order to fetch all
+        mock.register_uri(
+            'GET',
+            'http://example.com/superhero',
+            request_headers=DEFAULT_HEADERS,
+            status_code=200,
+            reason='OK',
+            json=self.data
+        )
+        # in order to fetch length
+        mock.register_uri(
+            'GET',
+            'http://example.com/superhero',
+            request_headers={**DEFAULT_HEADERS, **{'Range-Unit': 'items', 'Range': '0-0', 'Prefer': 'count=exact'}},
+            status_code=206,
+            reason='Partial Content',
+            headers={'Content-Range': '0-0/5'},
+            json=self.data[0]
+        )
+        with default_session(count=postgrestutils.Count.EXACT) as s:
+            res = s.filter('superhero')
+
+            self.assertIsInstance(res, postgrestutils.JsonResultSet)  # should return lazy object
+            self.assertFalse(mock.called)  # no request should have been made yet
+
+            self.assertEqual(len(res), len(self.data))  # should fetch len
+            self.assertTrue(mock.called_once)  # should have been called once
+            self.assertEqual(res._len_cache, len(self.data))  # len of fetched data should be cached
+            self.assertNotEqual(res._result_cache, self.data)  # should not have cached all elements
+
+            mock.reset()
+
+            # override the count session option in this specific request
+            res2 = s.filter('superhero', count=postgrestutils.Count.NONE)
+
+            self.assertIsInstance(res2, postgrestutils.JsonResultSet)  # should return lazy object
+            self.assertFalse(mock.called)  # no request should have been made yet
+
+            self.assertEqual(len(res2), len(self.data))  # should fetch all elements to get len
+            self.assertTrue(mock.called_once)  # should have been called once
+            self.assertEqual(res2._len_cache, len(self.data))  # len of fetched data should be cached
+            self.assertEqual(res2._result_cache, self.data)  # should have cached all elements
+
+    def test_override_schema_session_option(self, mock):
+        # in order to fetch all
+        mock.register_uri(
+            'GET',
+            'http://example.com/superhero',
+            request_headers=DEFAULT_HEADERS,
+            status_code=200,
+            reason='OK',
+            json=self.data
+        )
+        # in order to fetch all (other schema)
+        mock.register_uri(
+            'GET',
+            'http://example.com/superhero',
+            request_headers={**DEFAULT_HEADERS, **{'Accept-Profile': 'other_schema'}},
+            status_code=200,
+            reason='OK',
+            json=self.data
+        )
+        with default_session(schema='other_schema') as s:
+            res = s.filter('superhero')
+
+            self.assertIsInstance(res, postgrestutils.JsonResultSet)  # should return lazy object
+            self.assertFalse(mock.called)  # no request should have been made yet
+            self.assertEqual(list(res), self.data)  # should fetch all elements
+            self.assertTrue(mock.called_once)  # should have been called once
+            self.assertEqual(res._result_cache, self.data)  # should have cached all elements
+            self.assertEqual(res._len_cache, len(self.data))  # should have cached the length
+
+            mock.reset()
+
+            res2 = s.filter('superhero', schema=postgrestutils.DEFAULT_SCHEMA)
+
+            self.assertIsInstance(res2, postgrestutils.JsonResultSet)  # should return lazy object
+            self.assertFalse(mock.called)  # no request should have been made yet
+            self.assertEqual(list(res2), self.data)  # should fetch all elements
+            self.assertTrue(mock.called_once)  # should have been called once
+            self.assertEqual(res2._result_cache, self.data)  # should have cached all elements
+            self.assertEqual(res2._len_cache, len(self.data))  # should have cached the length