{"id": "authentication:authentication-root", "page": "authentication", "ref": "authentication-root", "title": "Using the \"root\" actor", "content": "Datasette currently leaves almost all forms of authentication to plugins - datasette-auth-github for example. \n The one exception is the \"root\" account, which you can sign into while using Datasette on your local machine. This provides access to a small number of debugging features. \n To sign in as root, start Datasette using the --root command-line option, like this: \n $ datasette --root\nhttp://127.0.0.1:8001/-/auth-token?token=786fc524e0199d70dc9a581d851f466244e114ca92f33aa3b42a139e9388daa7\nINFO: Started server process [25801]\nINFO: Waiting for application startup.\nINFO: Application startup complete.\nINFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) \n The URL on the first line includes a one-use token which can be used to sign in as the \"root\" actor in your browser. Click on that link and then visit http://127.0.0.1:8001/-/actor to confirm that you are authenticated as an actor that looks like this: \n {\n \"id\": \"root\"\n}", "breadcrumbs": "[\"Authentication and permissions\", \"Actors\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-auth-github\", \"label\": \"datasette-auth-github\"}]"} {"id": "authentication:permissions-view-instance", "page": "authentication", "ref": "permissions-view-instance", "title": "view-instance", "content": "Top level permission - Actor is allowed to view any pages within this instance, starting at https://latest.datasette.io/ \n Default allow .", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in permissions\"]", "references": "[{\"href\": \"https://latest.datasette.io/\", \"label\": \"https://latest.datasette.io/\"}]"} {"id": "authentication:permissions-view-database", "page": "authentication", "ref": "permissions-view-database", "title": "view-database", "content": "Actor is allowed to view a database page, e.g. https://latest.datasette.io/fixtures \n \n \n resource - string \n \n The name of the database \n \n \n \n Default allow .", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in permissions\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures\", \"label\": \"https://latest.datasette.io/fixtures\"}]"} {"id": "authentication:permissions-view-database-download", "page": "authentication", "ref": "permissions-view-database-download", "title": "view-database-download", "content": "Actor is allowed to download a database, e.g. https://latest.datasette.io/fixtures.db \n \n \n resource - string \n \n The name of the database \n \n \n \n Default allow .", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in permissions\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures.db\", \"label\": \"https://latest.datasette.io/fixtures.db\"}]"} {"id": "authentication:permissions-view-table", "page": "authentication", "ref": "permissions-view-table", "title": "view-table", "content": "Actor is allowed to view a table (or view) page, e.g. 
https://latest.datasette.io/fixtures/complex_foreign_keys \n \n \n resource - tuple: (string, string) \n \n The name of the database, then the name of the table \n \n \n \n Default allow .", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in permissions\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures/complex_foreign_keys\", \"label\": \"https://latest.datasette.io/fixtures/complex_foreign_keys\"}]"} {"id": "authentication:permissions-view-query", "page": "authentication", "ref": "permissions-view-query", "title": "view-query", "content": "Actor is allowed to view (and execute) a canned query page, e.g. https://latest.datasette.io/fixtures/pragma_cache_size - this includes executing Writable canned queries . \n \n \n resource - tuple: (string, string) \n \n The name of the database, then the name of the canned query \n \n \n \n Default allow .", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in permissions\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures/pragma_cache_size\", \"label\": \"https://latest.datasette.io/fixtures/pragma_cache_size\"}]"} {"id": "authentication:permissions-execute-sql", "page": "authentication", "ref": "permissions-execute-sql", "title": "execute-sql", "content": "Actor is allowed to run arbitrary SQL queries against a specific database, e.g. https://latest.datasette.io/fixtures?sql=select+100 \n \n \n resource - string \n \n The name of the database \n \n \n \n Default allow . See also the default_allow_sql setting .", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in permissions\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures?sql=select+100\", \"label\": \"https://latest.datasette.io/fixtures?sql=select+100\"}]"} {"id": "authentication:permissions-permissions-debug", "page": "authentication", "ref": "permissions-permissions-debug", "title": "permissions-debug", "content": "Actor is allowed to view the /-/permissions debug page. \n Default deny .", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in permissions\"]", "references": "[]"} {"id": "authentication:permissions-debug-menu", "page": "authentication", "ref": "permissions-debug-menu", "title": "debug-menu", "content": "Controls if the various debug pages are displayed in the navigation menu. 
\n Default deny .", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in permissions\"]", "references": "[]"} {"id": "authentication:authentication-permissions-instance", "page": "authentication", "ref": "authentication-permissions-instance", "title": "Controlling access to an instance", "content": "Here's how to restrict access to your entire Datasette instance to just the \"id\": \"root\" user: \n {\n \"title\": \"My private Datasette instance\",\n \"allow\": {\n \"id\": \"root\"\n }\n} \n To deny access to all users, you can use \"allow\": false : \n {\n \"title\": \"My entirely inaccessible instance\",\n \"allow\": false\n} \n One reason to do this is if you are using a Datasette plugin - such as datasette-permissions-sql - to control permissions instead.", "breadcrumbs": "[\"Authentication and permissions\", \"Configuring permissions in metadata.json\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-permissions-sql\", \"label\": \"datasette-permissions-sql\"}]"} {"id": "authentication:authentication-permissions-database", "page": "authentication", "ref": "authentication-permissions-database", "title": "Controlling access to specific databases", "content": "To limit access to a specific private.db database to just authenticated users, use the \"allow\" block like this: \n {\n \"databases\": {\n \"private\": {\n \"allow\": {\n \"id\": \"*\"\n }\n }\n }\n}", "breadcrumbs": "[\"Authentication and permissions\", \"Configuring permissions in metadata.json\"]", "references": "[]"} {"id": "authentication:authentication-permissions-table", "page": "authentication", "ref": "authentication-permissions-table", "title": "Controlling access to specific tables and views", "content": "To limit access to the users table in your bakery.db database: \n {\n \"databases\": {\n \"bakery\": {\n \"tables\": {\n \"users\": {\n \"allow\": {\n \"id\": \"*\"\n }\n }\n }\n }\n }\n} \n This works for SQL views as well - you can list their names in the \"tables\" block above in the same way as regular tables. \n \n Restricting access to tables and views in this way will NOT prevent users from querying them using arbitrary SQL queries, like this for example. \n If you are restricting access to specific tables you should also use the \"allow_sql\" block to prevent users from bypassing the limit with their own SQL queries - see Controlling the ability to execute arbitrary SQL .", "breadcrumbs": "[\"Authentication and permissions\", \"Configuring permissions in metadata.json\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures?sql=select+*+from+facetable\", \"label\": \"like this\"}]"} {"id": "authentication:authentication-permissions-query", "page": "authentication", "ref": "authentication-permissions-query", "title": "Controlling access to specific canned queries", "content": "Canned queries allow you to configure named SQL queries in your metadata.json that can be executed by users. These queries can be set up to both read and write to the database, so controlling who can execute them can be important. 
\n To limit access to the add_name canned query in your dogs.db database to just the root user : \n {\n \"databases\": {\n \"dogs\": {\n \"queries\": {\n \"add_name\": {\n \"sql\": \"INSERT INTO names (name) VALUES (:name)\",\n \"write\": true,\n \"allow\": {\n \"id\": [\"root\"]\n }\n }\n }\n }\n }\n}", "breadcrumbs": "[\"Authentication and permissions\", \"Configuring permissions in metadata.json\"]", "references": "[]"} {"id": "authentication:authentication-permissions-execute-sql", "page": "authentication", "ref": "authentication-permissions-execute-sql", "title": "Controlling the ability to execute arbitrary SQL", "content": "Datasette defaults to allowing any site visitor to execute their own custom SQL queries, for example using the form on the database page or by appending a ?_where= parameter to the table page like this . \n Access to this ability is controlled by the execute-sql permission. \n The easiest way to disable arbitrary SQL queries is using the default_allow_sql setting when you first start Datasette running. \n You can alternatively use an \"allow_sql\" block to control who is allowed to execute arbitrary SQL queries. \n To prevent any user from executing arbitrary SQL queries, use this: \n {\n \"allow_sql\": false\n} \n To enable just the root user to execute SQL for all databases in your instance, use the following: \n {\n \"allow_sql\": {\n \"id\": \"root\"\n }\n} \n To limit this ability for just one specific database, use this: \n {\n \"databases\": {\n \"mydatabase\": {\n \"allow_sql\": {\n \"id\": \"root\"\n }\n }\n }\n}", "breadcrumbs": "[\"Authentication and permissions\", \"Configuring permissions in metadata.json\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures\", \"label\": \"the database page\"}, {\"href\": \"https://latest.datasette.io/fixtures/facetable?_where=_city_id=1\", \"label\": \"like this\"}]"} {"id": "authentication:authentication-permissions-allow", "page": "authentication", "ref": "authentication-permissions-allow", "title": "Defining permissions with \"allow\" blocks", "content": "The standard way to define permissions in Datasette is to use an \"allow\" block. This is a JSON document describing which actors are allowed to perform a permission. \n The most basic form of allow block is this ( allow demo , deny demo ): \n {\n \"allow\": {\n \"id\": \"root\"\n }\n} \n This will match any actors with an \"id\" property of \"root\" - for example, an actor that looks like this: \n {\n \"id\": \"root\",\n \"name\": \"Root User\"\n} \n An allow block can specify \"deny all\" using false ( demo ): \n {\n \"allow\": false\n} \n An \"allow\" of true allows all access ( demo ): \n {\n \"allow\": true\n} \n Allow keys can provide a list of values. These will match any actor that has any of those values ( allow demo , deny demo ): \n {\n \"allow\": {\n \"id\": [\"simon\", \"cleopaws\"]\n }\n} \n This will match any actor with an \"id\" of either \"simon\" or \"cleopaws\" . \n Actors can have properties that feature a list of values. These will be matched against the list of values in an allow block. Consider the following actor: \n {\n \"id\": \"simon\",\n \"roles\": [\"staff\", \"developer\"]\n} \n This allow block will provide access to any actor that has \"developer\" as one of their roles ( allow demo , deny demo ): \n {\n \"allow\": {\n \"roles\": [\"developer\"]\n }\n} \n Note that \"roles\" is not a concept that is baked into Datasette - it's a convention that plugins can choose to implement and act on. 
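\n A minimal sketch of how a plugin might act on that convention, using the datasette.utils.actor_matches_allow() function documented below (the actor and allow values here are purely illustrative): \n from datasette.utils import actor_matches_allow\n\nactor = {\"id\": \"simon\", \"roles\": [\"staff\", \"developer\"]}\nallow = {\"roles\": [\"developer\"]}\n\n# True: \"developer\" appears in both the actor's roles and the allow list\nprint(actor_matches_allow(actor, allow))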
\n If you want to provide access to any actor with a value for a specific key, use \"*\" . For example, to match any logged-in user specify the following ( allow demo , deny demo ): \n {\n \"allow\": {\n \"id\": \"*\"\n }\n} \n You can specify that only unauthenticated actors (from anonymous HTTP requests) should be allowed access using the special \"unauthenticated\": true key in an allow block ( allow demo , deny demo ): \n {\n \"allow\": {\n \"unauthenticated\": true\n }\n} \n Allow keys act as an \"or\" mechanism. An actor will be able to execute the query if any of their JSON properties match any of the values in the corresponding lists in the allow block. The following block will allow users with either a role of \"ops\" OR users who have an id of \"simon\" or \"cleopaws\" : \n {\n \"allow\": {\n \"id\": [\"simon\", \"cleopaws\"],\n \"role\": \"ops\"\n }\n} \n Demo for cleopaws , demo for ops role , demo for an actor matching neither rule .", "breadcrumbs": "[\"Authentication and permissions\", \"Permissions\"]", "references": "[{\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%22id%22%3A+%22root%22%7D&allow=%7B%0D%0A++++++++%22id%22%3A+%22root%22%0D%0A++++%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%22id%22%3A+%22trevor%22%7D&allow=%7B%0D%0A++++++++%22id%22%3A+%22root%22%0D%0A++++%7D\", \"label\": \"deny demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22root%22%0D%0A%7D&allow=false\", \"label\": \"demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22root%22%0D%0A%7D&allow=true\", \"label\": \"demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22cleopaws%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%0D%0A%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22pancakes%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%0D%0A%7D\", \"label\": \"deny demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22simon%22%2C%0D%0A++++%22roles%22%3A+%5B%0D%0A++++++++%22staff%22%2C%0D%0A++++++++%22developer%22%0D%0A++++%5D%0D%0A%7D&allow=%7B%0D%0A++++%22roles%22%3A+%5B%0D%0A++++++++%22developer%22%0D%0A++++%5D%0D%0A%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22cleopaws%22%2C%0D%0A++++%22roles%22%3A+%5B%22dog%22%5D%0D%0A%7D&allow=%7B%0D%0A++++%22roles%22%3A+%5B%0D%0A++++++++%22developer%22%0D%0A++++%5D%0D%0A%7D\", \"label\": \"deny demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22simon%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%22*%22%0D%0A%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22bot%22%3A+%22readme-bot%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%22*%22%0D%0A%7D\", \"label\": \"deny demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=null&allow=%7B%0D%0A++++%22unauthenticated%22%3A+true%0D%0A%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22hello%22%0D%0A%7D&allow=%7B%0D%0A++++%22unauthenticated%22%3A+true%0D%0A%7D\", \"label\": \"deny demo\"}, {\"href\": 
\"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22cleopaws%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%2C%0D%0A++++%22role%22%3A+%22ops%22%0D%0A%7D\", \"label\": \"Demo for cleopaws\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22trevor%22%2C%0D%0A++++%22role%22%3A+%5B%0D%0A++++++++%22ops%22%2C%0D%0A++++++++%22staff%22%0D%0A++++%5D%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%2C%0D%0A++++%22role%22%3A+%22ops%22%0D%0A%7D\", \"label\": \"demo for ops role\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22percy%22%2C%0D%0A++++%22role%22%3A+%5B%0D%0A++++++++%22staff%22%0D%0A++++%5D%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%2C%0D%0A++++%22role%22%3A+%22ops%22%0D%0A%7D\", \"label\": \"demo for an actor matching neither rule\"}]"} {"id": "authentication:allowdebugview", "page": "authentication", "ref": "allowdebugview", "title": "The /-/allow-debug tool", "content": "The /-/allow-debug tool lets you try out different \"action\" blocks against different \"actor\" JSON objects. You can try that out here: https://latest.datasette.io/-/allow-debug", "breadcrumbs": "[\"Authentication and permissions\", \"Permissions\"]", "references": "[{\"href\": \"https://latest.datasette.io/-/allow-debug\", \"label\": \"https://latest.datasette.io/-/allow-debug\"}]"} {"id": "authentication:authentication-ds-actor-expiry", "page": "authentication", "ref": "authentication-ds-actor-expiry", "title": "Including an expiry time", "content": "ds_actor cookies can optionally include a signed expiry timestamp, after which the cookies will no longer be valid. Authentication plugins may chose to use this mechanism to limit the lifetime of the cookie. For example, if a plugin implements single-sign-on against another source it may decide to set short-lived cookies so that if the user is removed from the SSO system their existing Datasette cookies will stop working shortly afterwards. \n To include an expiry, add a \"e\" key to the cookie value containing a base62-encoded integer representing the timestamp when the cookie should expire. 
For example, here's how to set a cookie that expires after 24 hours: \n import time\nfrom datasette.utils import baseconv\nfrom datasette.utils.asgi import Response\n\nexpires_at = int(time.time()) + (24 * 60 * 60)\n\nresponse = Response.redirect(\"/\")\nresponse.set_cookie(\n \"ds_actor\",\n datasette.sign(\n {\n \"a\": {\"id\": \"cleopaws\"},\n \"e\": baseconv.base62.encode(expires_at),\n },\n \"actor\",\n ),\n) \n The resulting cookie will encode data that looks something like this: \n {\n \"a\": {\n \"id\": \"cleopaws\"\n },\n \"e\": \"1jjSji\"\n}", "breadcrumbs": "[\"Authentication and permissions\", \"The ds_actor cookie\"]", "references": "[]"} {"id": "authentication:logoutview", "page": "authentication", "ref": "logoutview", "title": "The /-/logout page", "content": "The page at /-/logout provides the ability to log out of a ds_actor cookie authentication session.", "breadcrumbs": "[\"Authentication and permissions\", \"The ds_actor cookie\"]", "references": "[]"} {"id": "authentication:authentication-actor", "page": "authentication", "ref": "authentication-actor", "title": "Actors", "content": "Through plugins, Datasette can support both authenticated users (with cookies) and authenticated API agents (via authentication tokens). The word \"actor\" is used to cover both of these cases. \n Every request to Datasette has an associated actor value, available in the code as request.actor . This can be None for unauthenticated requests, or a JSON compatible Python dictionary for authenticated users or API agents. \n The actor dictionary can be any shape - the design of that data structure is left up to the plugins. A useful convention is to include an \"id\" string, as demonstrated by the \"root\" actor below. \n Plugins can use the actor_from_request(datasette, request) hook to implement custom logic for authenticating an actor based on the incoming HTTP request.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "authentication:authentication-permissions", "page": "authentication", "ref": "authentication-permissions", "title": "Permissions", "content": "Datasette has an extensive permissions system built-in, which can be further extended and customized by plugins. \n The key question the permissions system answers is this: \n \n Is this actor allowed to perform this action , optionally against this particular resource ? \n \n Actors are described above . \n An action is a string describing the action the actor would like to perform. A full list is provided below - examples include view-table and execute-sql . \n A resource is the item the actor wishes to interact with - for example a specific database or table. Some actions, such as permissions-debug , are not associated with a particular resource. \n Datasette's built-in view permissions ( view-database , view-table etc) default to allow - unless you configure additional permission rules unauthenticated users will be allowed to access content. \n Permissions with potentially harmful effects should default to deny. 
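A minimal sketch of that default-deny pattern, using the permission_allowed() plugin hook described under Checking permissions in plugins (the upload-data action name here is hypothetical): \n from datasette import hookimpl\n\n\n@hookimpl\ndef permission_allowed(actor, action):\n    # Hypothetical \"upload-data\" action: allow only the root actor,\n    # deny everyone else - including unauthenticated requests\n    if action == \"upload-data\":\n        return actor is not None and actor.get(\"id\") == \"root\"\n    # Implicitly returns None: no opinion on other actions 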
Plugin authors should account for this when designing new plugins - for example, the datasette-upload-csvs plugin defaults to deny so that installations don't accidentally allow unauthenticated users to create new tables by uploading a CSV file.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-upload-csvs\", \"label\": \"datasette-upload-csvs\"}]"} {"id": "authentication:authentication-permissions-metadata", "page": "authentication", "ref": "authentication-permissions-metadata", "title": "Configuring permissions in metadata.json", "content": "You can limit who is allowed to view different parts of your Datasette instance using \"allow\" keys in your Metadata configuration. \n You can control the following: \n \n \n Access to the entire Datasette instance \n \n \n Access to specific databases \n \n \n Access to specific tables and views \n \n \n Access to specific Canned queries \n \n \n If a user cannot access a specific database, they will not be able to access tables, views or queries within that database. If a user cannot access the instance they will not be able to access any of the databases, tables, views or queries.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "authentication:permissions-plugins", "page": "authentication", "ref": "permissions-plugins", "title": "Checking permissions in plugins", "content": "Datasette plugins can check if an actor has permission to perform an action using the datasette.permission_allowed(...) method. \n Datasette core performs a number of permission checks, documented below . Plugins can implement the permission_allowed(datasette, actor, action, resource) plugin hook to participate in decisions about whether an actor should be able to perform a specified action.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "authentication:authentication-actor-matches-allow", "page": "authentication", "ref": "authentication-actor-matches-allow", "title": "actor_matches_allow()", "content": "Plugins that wish to implement this same \"allow\" block permissions scheme can take advantage of the datasette.utils.actor_matches_allow(actor, allow) function: \n from datasette.utils import actor_matches_allow\n\nactor_matches_allow({\"id\": \"root\"}, {\"id\": \"*\"})\n# returns True \n The currently authenticated actor is made available to plugins as request.actor .", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "authentication:permissionsdebugview", "page": "authentication", "ref": "permissionsdebugview", "title": "The permissions debug tool", "content": "The debug tool at /-/permissions is only available to the authenticated root user (or any actor granted the permissions-debug action according to a plugin). \n It shows the thirty most recent permission checks that have been carried out by the Datasette instance. \n This is designed to help administrators and plugin authors understand exactly how permission checks are being carried out, in order to effectively configure Datasette's permission system.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "authentication:authentication-ds-actor", "page": "authentication", "ref": "authentication-ds-actor", "title": "The ds_actor cookie", "content": "Datasette includes a default authentication plugin which looks for a signed ds_actor cookie containing a JSON actor dictionary. 
This is how the root actor mechanism works. \n Authentication plugins can set signed ds_actor cookies themselves like so: \n response = Response.redirect(\"/\")\nresponse.set_cookie(\n \"ds_actor\",\n datasette.sign({\"a\": {\"id\": \"cleopaws\"}}, \"actor\"),\n) \n Note that you need to pass \"actor\" as the namespace to .sign(value, namespace=\"default\") . \n The shape of data encoded in the cookie is as follows: \n {\n \"a\": {... actor ...}\n}", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "authentication:id1", "page": "authentication", "ref": "id1", "title": "Built-in permissions", "content": "This section lists all of the permission checks that are carried out by Datasette core, along with the resource if it was passed.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "binary_data:binary-linking", "page": "binary_data", "ref": "binary-linking", "title": "Linking to binary downloads", "content": "The .blob output format is used to return binary data. It requires a _blob_column= query string argument specifying which BLOB column should be downloaded, for example: \n https://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data \n This output format can also be used to return binary data from an arbitrary SQL query. Since such queries do not specify an exact row, an additional ?_blob_hash= parameter can be used to specify the SHA-256 hash of the value that is being linked to. \n Consider the query select data from binary_data - demonstrated here . \n That page links to the binary value downloads. Those links look like this: \n https://latest.datasette.io/fixtures.blob?sql=select+data+from+binary_data&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d \n These .blob links are also returned in the .csv exports Datasette provides for binary tables and queries, since the CSV format does not have a mechanism for representing binary data.", "breadcrumbs": "[\"Binary data\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data\", \"label\": \"https://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data\"}, {\"href\": \"https://latest.datasette.io/fixtures?sql=select+data+from+binary_data\", \"label\": \"demonstrated here\"}, {\"href\": \"https://latest.datasette.io/fixtures.blob?sql=select+data+from+binary_data&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d\", \"label\": \"https://latest.datasette.io/fixtures.blob?sql=select+data+from+binary_data&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d\"}]"} {"id": "binary_data:binary-plugins", "page": "binary_data", "ref": "binary-plugins", "title": "Binary plugins", "content": "Several Datasette plugins are available that change the way Datasette treats binary data. \n \n \n datasette-render-binary modifies Datasette's default interface to show an automatic guess at what type of binary data is being stored, along with a visual representation of the binary value that displays ASCII strings directly in the interface. \n \n \n datasette-render-images detects common image formats and renders them as images directly in the Datasette interface. 
\n \n \n datasette-media allows Datasette interfaces to be configured to serve binary files from configured SQL queries, and includes the ability to resize images directly before serving them.", "breadcrumbs": "[\"Binary data\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-render-binary\", \"label\": \"datasette-render-binary\"}, {\"href\": \"https://github.com/simonw/datasette-render-images\", \"label\": \"datasette-render-images\"}, {\"href\": \"https://github.com/simonw/datasette-media\", \"label\": \"datasette-media\"}]"} {"id": "cli-reference:cli-datasette-get", "page": "cli-reference", "ref": "cli-datasette-get", "title": "datasette --get", "content": "The --get option to datasette serve (or just datasette ) specifies the path to a page within Datasette and causes Datasette to output the content from that path without starting the web server. \n This means that all of Datasette's functionality can be accessed directly from the command-line. \n For example: \n $ datasette --get '/-/versions.json' | jq .\n{\n \"python\": {\n \"version\": \"3.8.5\",\n \"full\": \"3.8.5 (default, Jul 21 2020, 10:48:26) \\n[Clang 11.0.3 (clang-1103.0.32.62)]\"\n },\n \"datasette\": {\n \"version\": \"0.46+15.g222a84a.dirty\"\n },\n \"asgi\": \"3.0\",\n \"uvicorn\": \"0.11.8\",\n \"sqlite\": {\n \"version\": \"3.32.3\",\n \"fts_versions\": [\n \"FTS5\",\n \"FTS4\",\n \"FTS3\"\n ],\n \"extensions\": {\n \"json1\": null\n },\n \"compile_options\": [\n \"COMPILER=clang-11.0.3\",\n \"ENABLE_COLUMN_METADATA\",\n \"ENABLE_FTS3\",\n \"ENABLE_FTS3_PARENTHESIS\",\n \"ENABLE_FTS4\",\n \"ENABLE_FTS5\",\n \"ENABLE_GEOPOLY\",\n \"ENABLE_JSON1\",\n \"ENABLE_PREUPDATE_HOOK\",\n \"ENABLE_RTREE\",\n \"ENABLE_SESSION\",\n \"MAX_VARIABLE_NUMBER=250000\",\n \"THREADSAFE=1\"\n ]\n }\n} \n The exit code will be 0 if the request succeeds and 1 if the request produced an HTTP status code other than 200 - e.g. a 404 or 500 error. \n This lets you use datasette --get / to run tests against a Datasette application in a continuous integration environment such as GitHub Actions.", "breadcrumbs": "[\"CLI reference\", \"datasette serve\"]", "references": "[]"} {"id": "cli-reference:cli-help-serve-help-settings", "page": "cli-reference", "ref": "cli-help-serve-help-settings", "title": "datasette serve --help-settings", "content": "This command outputs all of the available Datasette settings . \n These can be passed to datasette serve using datasette serve --setting name value . 
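\n For example, a hypothetical invocation raising the SQL time limit for a mydatabase.db file: \n datasette serve mydatabase.db --setting sql_time_limit_ms 3500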
\n [[[cog\nhelp([\"--help-settings\"]) \n ]]] \n Settings:\n default_page_size Default page size for the table view\n (default=100)\n max_returned_rows Maximum rows that can be returned from a table or\n custom query (default=1000)\n num_sql_threads Number of threads in the thread pool for\n executing SQLite queries (default=3)\n sql_time_limit_ms Time limit for a SQL query in milliseconds\n (default=1000)\n default_facet_size Number of values to return for requested facets\n (default=30)\n facet_time_limit_ms Time limit for calculating a requested facet\n (default=200)\n facet_suggest_time_limit_ms Time limit for calculating a suggested facet\n (default=50)\n allow_facet Allow users to specify columns to facet using\n ?_facet= parameter (default=True)\n default_allow_sql Allow anyone to run arbitrary SQL queries\n (default=True)\n allow_download Allow users to download the original SQLite\n database files (default=True)\n suggest_facets Calculate and display suggested facets\n (default=True)\n default_cache_ttl Default HTTP cache TTL (used in Cache-Control:\n max-age= header) (default=5)\n cache_size_kb SQLite cache size in KB (0 == use SQLite default)\n (default=0)\n allow_csv_stream Allow .csv?_stream=1 to download all rows\n (ignoring max_returned_rows) (default=True)\n max_csv_mb Maximum size allowed for CSV export in MB - set 0\n to disable this limit (default=100)\n truncate_cells_html Truncate cells longer than this in HTML table\n view - set 0 to disable (default=2048)\n force_https_urls Force URLs in API output to always use https://\n protocol (default=False)\n template_debug Allow display of template debug information with\n ?_context=1 (default=False)\n trace_debug Allow display of SQL trace debug information with\n ?_trace=1 (default=False)\n base_url Datasette URLs should use this base path\n (default=/) \n [[[end]]]", "breadcrumbs": "[\"CLI reference\", \"datasette serve\"]", "references": "[]"} {"id": "cli-reference:cli-help-help", "page": "cli-reference", "ref": "cli-help-help", "title": "datasette --help", "content": "Running datasette --help shows a list of all of the available commands. \n [[[cog\nhelp([\"--help\"]) \n ]]] \n Usage: datasette [OPTIONS] COMMAND [ARGS]...\n\n Datasette is an open source multi-tool for exploring and publishing data\n\n About Datasette: https://datasette.io/\n Full documentation: https://docs.datasette.io/\n\nOptions:\n --version Show the version and exit.\n --help Show this message and exit.\n\nCommands:\n serve* Serve up specified SQLite database files with a web UI\n inspect Generate JSON summary of provided database files\n install Install plugins and packages from PyPI into the same...\n package Package SQLite files into a Datasette Docker container\n plugins List currently installed plugins\n publish Publish specified SQLite database files to the internet along...\n uninstall Uninstall plugins and Python packages from the Datasette... 
\n [[[end]]] \n Additional commands added by plugins that use the register_commands(cli) hook will be listed here as well.", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-serve-help", "page": "cli-reference", "ref": "cli-help-serve-help", "title": "datasette serve", "content": "This command starts the Datasette web application running on your machine: \n datasette serve mydatabase.db \n Or since this is the default command you can run this instead: \n datasette mydatabase.db \n Once started you can access it at http://localhost:8001 \n [[[cog\nhelp([\"serve\", \"--help\"]) \n ]]] \n Usage: datasette serve [OPTIONS] [FILES]...\n\n Serve up specified SQLite database files with a web UI\n\nOptions:\n -i, --immutable PATH Database files to open in immutable mode\n -h, --host TEXT Host for server. Defaults to 127.0.0.1 which\n means only connections from the local machine\n will be allowed. Use 0.0.0.0 to listen to all\n IPs and allow access from other machines.\n -p, --port INTEGER RANGE Port for server, defaults to 8001. Use -p 0 to\n automatically assign an available port.\n [0<=x<=65535]\n --uds TEXT Bind to a Unix domain socket\n --reload Automatically reload if code or metadata\n change detected - useful for development\n --cors Enable CORS by serving Access-Control-Allow-\n Origin: *\n --load-extension PATH:ENTRYPOINT?\n Path to a SQLite extension to load, and\n optional entrypoint\n --inspect-file TEXT Path to JSON file created using \"datasette\n inspect\"\n -m, --metadata FILENAME Path to JSON/YAML file containing\n license/source metadata\n --template-dir DIRECTORY Path to directory containing custom templates\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --static MOUNT:DIRECTORY Serve static files from this directory at\n /MOUNT/...\n --memory Make /_memory database available\n --config CONFIG Deprecated: set config option using\n configname:value. Use --setting instead.\n --setting SETTING... Setting, see\n docs.datasette.io/en/stable/settings.html\n --secret TEXT Secret used for signing secure values, such as\n signed cookies\n --root Output URL that sets a cookie authenticating\n the root user\n --get TEXT Run an HTTP GET request against this path,\n print results and exit\n --version-note TEXT Additional note to show on /-/versions\n --help-settings Show available settings\n --pdb Launch debugger on any errors\n -o, --open Open Datasette in your web browser\n --create Create database files if they do not exist\n --crossdb Enable cross-database joins using the /_memory\n database\n --nolock Ignore locking, open locked files in read-only\n mode\n --ssl-keyfile TEXT SSL key file\n --ssl-certfile TEXT SSL certificate file\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-plugins-help", "page": "cli-reference", "ref": "cli-help-plugins-help", "title": "datasette plugins", "content": "Output JSON showing all currently installed plugins, their versions, whether they include static files or templates and which Plugin hooks they use. \n [[[cog\nhelp([\"plugins\", \"--help\"]) \n ]]] \n Usage: datasette plugins [OPTIONS]\n\n List currently installed plugins\n\nOptions:\n --all Include built-in default plugins\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --help Show this message and exit. 
\n [[[end]]] \n Example output: \n [\n {\n \"name\": \"datasette-geojson\",\n \"static\": false,\n \"templates\": false,\n \"version\": \"0.3.1\",\n \"hooks\": [\n \"register_output_renderer\"\n ]\n },\n {\n \"name\": \"datasette-geojson-map\",\n \"static\": true,\n \"templates\": false,\n \"version\": \"0.4.0\",\n \"hooks\": [\n \"extra_body_script\",\n \"extra_css_urls\",\n \"extra_js_urls\"\n ]\n },\n {\n \"name\": \"datasette-leaflet\",\n \"static\": true,\n \"templates\": false,\n \"version\": \"0.2.2\",\n \"hooks\": [\n \"extra_body_script\",\n \"extra_template_vars\"\n ]\n }\n]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-install-help", "page": "cli-reference", "ref": "cli-help-install-help", "title": "datasette install", "content": "Install new Datasette plugins. This command works like pip install but ensures that your plugins will be installed into the same environment as Datasette. \n This command: \n datasette install datasette-cluster-map \n Would install the datasette-cluster-map plugin. \n [[[cog\nhelp([\"install\", \"--help\"]) \n ]]] \n Usage: datasette install [OPTIONS] PACKAGES...\n\n Install plugins and packages from PyPI into the same environment as Datasette\n\nOptions:\n -U, --upgrade Upgrade packages to latest version\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[{\"href\": \"https://datasette.io/plugins/datasette-cluster-map\", \"label\": \"datasette-cluster-map\"}]"} {"id": "cli-reference:cli-help-uninstall-help", "page": "cli-reference", "ref": "cli-help-uninstall-help", "title": "datasette uninstall", "content": "Uninstall one or more plugins. \n [[[cog\nhelp([\"uninstall\", \"--help\"]) \n ]]] \n Usage: datasette uninstall [OPTIONS] PACKAGES...\n\n Uninstall plugins and Python packages from the Datasette environment\n\nOptions:\n -y, --yes Don't ask for confirmation\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-publish-help", "page": "cli-reference", "ref": "cli-help-publish-help", "title": "datasette publish", "content": "Shows a list of available deployment targets for publishing data with Datasette. \n Additional deployment targets can be added by plugins that use the publish_subcommand(publish) hook. \n [[[cog\nhelp([\"publish\", \"--help\"]) \n ]]] \n Usage: datasette publish [OPTIONS] COMMAND [ARGS]...\n\n Publish specified SQLite database files to the internet along with a\n Datasette-powered interface and API\n\nOptions:\n --help Show this message and exit.\n\nCommands:\n cloudrun Publish databases to Datasette running on Cloud Run\n heroku Publish databases to Datasette running on Heroku \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-publish-cloudrun-help", "page": "cli-reference", "ref": "cli-help-publish-cloudrun-help", "title": "datasette publish cloudrun", "content": "See Publishing to Google Cloud Run . 
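\n A typical invocation might look like this ( mydatabase.db and my-service are placeholders): \n datasette publish cloudrun mydatabase.db --service=my-service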
\n [[[cog\nhelp([\"publish\", \"cloudrun\", \"--help\"]) \n ]]] \n Usage: datasette publish cloudrun [OPTIONS] [FILES]...\n\n Publish databases to Datasette running on Cloud Run\n\nOptions:\n -m, --metadata FILENAME Path to JSON/YAML file containing metadata to\n publish\n --extra-options TEXT Extra options to pass to datasette serve\n --branch TEXT Install datasette from a GitHub branch e.g.\n main\n --template-dir DIRECTORY Path to directory containing custom templates\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --static MOUNT:DIRECTORY Serve static files from this directory at\n /MOUNT/...\n --install TEXT Additional packages (e.g. plugins) to install\n --plugin-secret ...\n Secrets to pass to plugins, e.g. --plugin-\n secret datasette-auth-github client_id xxx\n --version-note TEXT Additional note to show on /-/versions\n --secret TEXT Secret used for signing secure values, such as\n signed cookies\n --title TEXT Title for metadata\n --license TEXT License label for metadata\n --license_url TEXT License URL for metadata\n --source TEXT Source label for metadata\n --source_url TEXT Source URL for metadata\n --about TEXT About label for metadata\n --about_url TEXT About URL for metadata\n -n, --name TEXT Application name to use when building\n --service TEXT Cloud Run service to deploy (or over-write)\n --spatialite Enable SpatialLite extension\n --show-files Output the generated Dockerfile and\n metadata.json\n --memory TEXT Memory to allocate in Cloud Run, e.g. 1Gi\n --cpu [1|2|4] Number of vCPUs to allocate in Cloud Run\n --timeout INTEGER Build timeout in seconds\n --apt-get-install TEXT Additional packages to apt-get install\n --max-instances INTEGER Maximum Cloud Run instances\n --min-instances INTEGER Minimum Cloud Run instances\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-publish-heroku-help", "page": "cli-reference", "ref": "cli-help-publish-heroku-help", "title": "datasette publish heroku", "content": "See Publishing to Heroku . \n [[[cog\nhelp([\"publish\", \"heroku\", \"--help\"]) \n ]]] \n Usage: datasette publish heroku [OPTIONS] [FILES]...\n\n Publish databases to Datasette running on Heroku\n\nOptions:\n -m, --metadata FILENAME Path to JSON/YAML file containing metadata to\n publish\n --extra-options TEXT Extra options to pass to datasette serve\n --branch TEXT Install datasette from a GitHub branch e.g.\n main\n --template-dir DIRECTORY Path to directory containing custom templates\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --static MOUNT:DIRECTORY Serve static files from this directory at\n /MOUNT/...\n --install TEXT Additional packages (e.g. plugins) to install\n --plugin-secret ...\n Secrets to pass to plugins, e.g. 
--plugin-\n secret datasette-auth-github client_id xxx\n --version-note TEXT Additional note to show on /-/versions\n --secret TEXT Secret used for signing secure values, such as\n signed cookies\n --title TEXT Title for metadata\n --license TEXT License label for metadata\n --license_url TEXT License URL for metadata\n --source TEXT Source label for metadata\n --source_url TEXT Source URL for metadata\n --about TEXT About label for metadata\n --about_url TEXT About URL for metadata\n -n, --name TEXT Application name to use when deploying\n --tar TEXT --tar option to pass to Heroku, e.g.\n --tar=/usr/local/bin/gtar\n --generate-dir DIRECTORY Output generated application files and stop\n without deploying\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-package-help", "page": "cli-reference", "ref": "cli-help-package-help", "title": "datasette package", "content": "Package SQLite files into a Datasette Docker container, see datasette package . \n [[[cog\nhelp([\"package\", \"--help\"]) \n ]]] \n Usage: datasette package [OPTIONS] FILES...\n\n Package SQLite files into a Datasette Docker container\n\nOptions:\n -t, --tag TEXT Name for the resulting Docker container, can\n optionally use name:tag format\n -m, --metadata FILENAME Path to JSON/YAML file containing metadata to\n publish\n --extra-options TEXT Extra options to pass to datasette serve\n --branch TEXT Install datasette from a GitHub branch e.g. main\n --template-dir DIRECTORY Path to directory containing custom templates\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --static MOUNT:DIRECTORY Serve static files from this directory at /MOUNT/...\n --install TEXT Additional packages (e.g. plugins) to install\n --spatialite Enable SpatialLite extension\n --version-note TEXT Additional note to show on /-/versions\n --secret TEXT Secret used for signing secure values, such as\n signed cookies\n -p, --port INTEGER RANGE Port to run the server on, defaults to 8001\n [1<=x<=65535]\n --title TEXT Title for metadata\n --license TEXT License label for metadata\n --license_url TEXT License URL for metadata\n --source TEXT Source label for metadata\n --source_url TEXT Source URL for metadata\n --about TEXT About label for metadata\n --about_url TEXT About URL for metadata\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-inspect-help", "page": "cli-reference", "ref": "cli-help-inspect-help", "title": "datasette inspect", "content": "Outputs JSON representing introspected data about one or more SQLite database files. \n If you are opening an immutable database, you can pass this file to the --inspect-file option to improve Datasette's performance by allowing it to skip running row counts against the database when it first starts running: \n datasette inspect mydatabase.db > inspect-data.json\ndatasette serve -i mydatabase.db --inspect-file inspect-data.json \n This performance optimization is used automatically by some of the datasette publish commands. You are unlikely to need to apply this optimization manually. 
\n [[[cog\nhelp([\"inspect\", \"--help\"]) \n ]]] \n Usage: datasette inspect [OPTIONS] [FILES]...\n\n Generate JSON summary of provided database files\n\n This can then be passed to \"datasette --inspect-file\" to speed up count\n operations against immutable database files.\n\nOptions:\n --inspect-file TEXT\n --load-extension PATH:ENTRYPOINT?\n Path to a SQLite extension to load, and\n optional entrypoint\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "csv_export:csv-export-url-parameters", "page": "csv_export", "ref": "csv-export-url-parameters", "title": "URL parameters", "content": "The following options can be used to customize the CSVs returned by Datasette. \n \n \n ?_header=off \n \n This removes the first row of the CSV file specifying the headings - only the row data will be returned. \n \n \n \n ?_stream=on \n \n Stream all matching records, not just the first page of results. See below. \n \n \n \n ?_dl=on \n \n Causes Datasette to return a content-disposition: attachment; filename=\"filename.csv\" header.", "breadcrumbs": "[\"CSV export\"]", "references": "[]"} {"id": "csv_export:streaming-all-records", "page": "csv_export", "ref": "streaming-all-records", "title": "Streaming all records", "content": "The stream all rows option is designed to be as efficient as possible -\n under the hood it takes advantage of Python 3 asyncio capabilities and\n Datasette's efficient pagination to stream back the full\n CSV file. \n Since databases can get pretty large, by default this option is capped at 100MB -\n if a table returns more than 100MB of data the last line of the CSV will be a\n truncation error message. \n You can increase or remove this limit using the max_csv_mb config\n setting. You can also disable the CSV export feature entirely using\n allow_csv_stream .", "breadcrumbs": "[\"CSV export\"]", "references": "[]"} {"id": "changelog:csv-export", "page": "changelog", "ref": "csv-export", "title": "CSV export", "content": "Any Datasette table, view or custom SQL query can now be exported as CSV. \n \n Check out the CSV export documentation for more details, or\n try the feature out on\n https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies \n If your table has more than max_returned_rows (default 1,000)\n Datasette provides the option to stream all rows . This option takes advantage\n of async Python and Datasette's efficient pagination to\n iterate through the entire matching result set and stream it back as a\n downloadable CSV file.", "breadcrumbs": "[\"Changelog\", \"0.23 (2018-06-18)\"]", "references": "[{\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies\", \"label\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies\"}]"} {"id": "changelog:foreign-key-expansions", "page": "changelog", "ref": "foreign-key-expansions", "title": "Foreign key expansions", "content": "When Datasette detects a foreign key reference it attempts to resolve a label\n for that reference (automatically or using the Specifying the label column for a table metadata\n option) so it can display a link to the associated row. \n This expansion is now also available for JSON and CSV representations of the\n table, using the new _labels=on query string option. 
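For example, a hypothetical request to /fixtures/facetable.json?_labels=on returns each foreign key column as an object of the shape {\"value\": 1, \"label\": \"San Francisco\"} instead of the bare value. 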
See\n Expanding foreign key references for more details.", "breadcrumbs": "[\"Changelog\", \"0.23 (2018-06-18)\"]", "references": "[]"} {"id": "changelog:new-configuration-settings", "page": "changelog", "ref": "new-configuration-settings", "title": "New configuration settings", "content": "Datasette's Settings now also supports boolean settings. A number of new\n configuration options have been added: \n \n \n num_sql_threads - the number of threads used to execute SQLite queries. Defaults to 3. \n \n \n allow_facet - enable or disable custom Facets using the _facet= parameter. Defaults to on. \n \n \n suggest_facets - should Datasette suggest facets? Defaults to on. \n \n \n allow_download - should users be allowed to download the entire SQLite database? Defaults to on. \n \n \n allow_sql - should users be allowed to execute custom SQL queries? Defaults to on. \n \n \n default_cache_ttl - Default HTTP caching max-age header in seconds. Defaults to 365 days - caching can be disabled entirely by setting this to 0. \n \n \n cache_size_kb - Set the amount of memory SQLite uses for its per-connection cache , in KB. \n \n \n allow_csv_stream - allow users to stream entire result sets as a single CSV file. Defaults to on. \n \n \n max_csv_mb - maximum size of a returned CSV file in MB. Defaults to 100MB, set to 0 to disable this limit.", "breadcrumbs": "[\"Changelog\", \"0.23 (2018-06-18)\"]", "references": "[{\"href\": \"https://www.sqlite.org/pragma.html#pragma_cache_size\", \"label\": \"per-connection cache\"}]"} {"id": "changelog:control-http-caching-with-ttl", "page": "changelog", "ref": "control-http-caching-with-ttl", "title": "Control HTTP caching with ?_ttl=", "content": "You can now customize the HTTP max-age header that is sent on a per-URL basis, using the new ?_ttl= query string parameter. \n You can set this to any value in seconds, or you can set it to 0 to disable HTTP caching entirely. \n Consider for example this query which returns a randomly selected member of the Avengers: \n select * from [avengers/avengers] order by random() limit 1 \n If you hit the following page repeatedly you will get the same result, due to HTTP caching: \n /fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1 \n By adding ?_ttl=0 to the URL you can ensure the page will not be cached and get back a different super hero every time: \n /fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0", "breadcrumbs": "[\"Changelog\", \"0.23 (2018-06-18)\"]", "references": "[{\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1\", \"label\": \"/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1\"}, {\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0\", \"label\": \"/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0\"}]"} {"id": "changelog:improved-support-for-spatialite", "page": "changelog", "ref": "improved-support-for-spatialite", "title": "Improved support for SpatiaLite", "content": "The SpatiaLite module \n for SQLite adds robust geospatial features to the database. \n Getting SpatiaLite working can be tricky, especially if you want to use the most\n recent alpha version (with support for K-nearest neighbor). 
\n Datasette now includes extensive documentation on SpatiaLite , and thanks to Ravi Kotecha our GitHub\n repo includes a Dockerfile that can build\n the latest SpatiaLite and configure it for use with Datasette. \n The datasette publish and datasette package commands now accept a new\n --spatialite argument which causes them to install and configure SpatiaLite\n as part of the container they deploy.", "breadcrumbs": "[\"Changelog\", \"0.23 (2018-06-18)\"]", "references": "[{\"href\": \"https://www.gaia-gis.it/fossil/libspatialite/index\", \"label\": \"SpatiaLite module\"}, {\"href\": \"https://github.com/r4vi\", \"label\": \"Ravi Kotecha\"}, {\"href\": \"https://github.com/simonw/datasette/blob/master/Dockerfile\", \"label\": \"Dockerfile\"}]"} {"id": "changelog:latest-datasette-io", "page": "changelog", "ref": "latest-datasette-io", "title": "latest.datasette.io", "content": "Every commit to Datasette master is now automatically deployed by Travis CI to\n https://latest.datasette.io/ - ensuring there is always a live demo of the\n latest version of the software. \n The demo uses the fixtures from our\n unit tests, ensuring it demonstrates the same range of functionality that is\n covered by the tests. \n You can see how the deployment mechanism works in our .travis.yml file.", "breadcrumbs": "[\"Changelog\", \"0.23 (2018-06-18)\"]", "references": "[{\"href\": \"https://latest.datasette.io/\", \"label\": \"https://latest.datasette.io/\"}, {\"href\": \"https://github.com/simonw/datasette/blob/master/tests/fixtures.py\", \"label\": \"the fixtures\"}, {\"href\": \"https://github.com/simonw/datasette/blob/master/.travis.yml\", \"label\": \".travis.yml\"}]"} {"id": "changelog:miscellaneous", "page": "changelog", "ref": "miscellaneous", "title": "Miscellaneous", "content": "Got JSON data in one of your columns? Use the new ?_json=COLNAME argument\n to tell Datasette to return that JSON value directly rather than encoding it\n as a string. \n \n \n If you just want an array of the first value of each row, use the new\n ?_shape=arrayfirst option - example .", "breadcrumbs": "[\"Changelog\", \"0.23 (2018-06-18)\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures.json?sql=select+neighborhood+from+facetable+order+by+pk+limit+101&_shape=arrayfirst\", \"label\": \"example\"}]"} {"id": "changelog:v0-28-databases-that-change", "page": "changelog", "ref": "v0-28-databases-that-change", "title": "Supporting databases that change", "content": "From the beginning of the project, Datasette has been designed with read-only databases in mind. If a database is guaranteed not to change it opens up all kinds of interesting opportunities - from taking advantage of SQLite immutable mode and HTTP caching to bundling static copies of the database directly in a Docker container. The interesting ideas in Datasette explores this idea in detail. \n As my goals for the project have developed, I realized that read-only databases are no longer the right default. SQLite actually supports concurrent access very well provided only one thread attempts to write to a database at a time, and I keep encountering sensible use-cases for running Datasette on top of a database that is processing inserts and updates. \n So, as-of version 0.28 Datasette no longer assumes that a database file will not change. It is now safe to point Datasette at a SQLite database which is being updated by another process. \n Making this change was a lot of work - see tracking tickets #418 , #419 and #420 . 
It required new thinking around how Datasette should calculate table counts (an expensive operation against a large, changing database) and also meant reconsidering the \"content hash\" URLs Datasette has used in the past to optimize the performance of HTTP caches. \n Datasette can still run against immutable files and gains numerous performance benefits from doing so, but this is no longer the default behaviour. Take a look at the new Performance and caching documentation section for details on how to make the most of Datasette against data that you know will be staying read-only and immutable.", "breadcrumbs": "[\"Changelog\", \"0.28 (2019-05-19)\"]", "references": "[{\"href\": \"https://simonwillison.net/2018/Oct/4/datasette-ideas/\", \"label\": \"The interesting ideas in Datasette\"}, {\"href\": \"https://github.com/simonw/datasette/issues/418\", \"label\": \"#418\"}, {\"href\": \"https://github.com/simonw/datasette/issues/419\", \"label\": \"#419\"}, {\"href\": \"https://github.com/simonw/datasette/issues/420\", \"label\": \"#420\"}]"} {"id": "changelog:v0-28-faceting", "page": "changelog", "ref": "v0-28-faceting", "title": "Faceting improvements, and faceting plugins", "content": "Datasette Facets provide an intuitive way to quickly summarize and interact with data. Previously the only supported faceting technique was column faceting, but 0.28 introduces two powerful new capabilities: facet-by-JSON-array and the ability to define further facet types using plugins. \n Facet by array ( #359 ) is only available if your SQLite installation provides the json1 extension. Datasette will automatically detect columns that contain JSON arrays of values and offer a faceting interface against those columns - useful for modelling things like tags without needing to break them out into a new table. See Facet by JSON array for more. \n The new register_facet_classes() plugin hook ( #445 ) can be used to register additional custom facet classes. Each facet class should provide two methods: suggest() which suggests facet selections that might be appropriate for a provided SQL query, and facet_results() which executes a facet operation and returns results. Datasette's own faceting implementations have been refactored to use the same API as these plugins.", "breadcrumbs": "[\"Changelog\", \"0.28 (2019-05-19)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/359\", \"label\": \"#359\"}, {\"href\": \"https://github.com/simonw/datasette/pull/445\", \"label\": \"#445\"}]"} {"id": "changelog:v0-28-publish-cloudrun", "page": "changelog", "ref": "v0-28-publish-cloudrun", "title": "datasette publish cloudrun", "content": "Google Cloud Run is a brand new serverless hosting platform from Google, which allows you to build a Docker container which will run only when HTTP traffic is received and will shut down (and hence cost you nothing) the rest of the time. It's similar to Zeit's Now v1 Docker hosting platform which sadly is no longer accepting signups from new users. \n The new datasette publish cloudrun command was contributed by Romain Primet ( #434 ) and publishes selected databases to a new Datasette instance running on Google Cloud Run. 
\n See Publishing to Google Cloud Run for full documentation.", "breadcrumbs": "[\"Changelog\", \"0.28 (2018-06-18)\"]", "references": "[{\"href\": \"https://cloud.google.com/run/\", \"label\": \"Google Cloud Run\"}, {\"href\": \"https://hyperion.alpha.spectrum.chat/zeit/now/cannot-create-now-v1-deployments~d206a0d4-5835-4af5-bb5c-a17f0171fb25?m=MTU0Njk2NzgwODM3OA==\", \"label\": \"no longer accepting signups\"}, {\"href\": \"https://github.com/simonw/datasette/pull/434\", \"label\": \"#434\"}]"} {"id": "changelog:v0-28-register-output-renderer", "page": "changelog", "ref": "v0-28-register-output-renderer", "title": "register_output_renderer plugins", "content": "Russ Garrett implemented a new Datasette plugin hook called register_output_renderer ( #441 ) which allows plugins to create additional output renderers in addition to Datasette's default .json and .csv . \n Russ's in-development datasette-geo plugin includes an example of this hook being used to output .geojson automatically converted from SpatiaLite.", "breadcrumbs": "[\"Changelog\", \"0.28 (2019-05-19)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/pull/441\", \"label\": \"#441\"}, {\"href\": \"https://github.com/russss/datasette-geo\", \"label\": \"datasette-geo\"}, {\"href\": \"https://github.com/russss/datasette-geo/blob/d4cecc020848bbde91e9e17bf352f7c70bc3dccf/datasette_plugin_geo/geojson.py\", \"label\": \"an example\"}]"} {"id": "changelog:v0-28-medium-changes", "page": "changelog", "ref": "v0-28-medium-changes", "title": "Medium changes", "content": "Datasette now conforms to the Black coding style ( #449 ) - and has a unit test to enforce this in the future \n \n \n \n \n New Special table arguments : \n \n \n \n ?columnname__in=value1,value2,value3 filter for executing SQL IN queries against a table, see Table arguments ( #433 ) \n \n \n ?columnname__date=yyyy-mm-dd filter which returns rows where the specified datetime column falls on the specified date ( 583b22a ) \n \n \n ?tags__arraycontains=tag filter which acts against a JSON array contained in a column ( 78e45ea ) \n \n \n ?_where=sql-fragment filter for the table view ( #429 ) \n \n \n ?_fts_table=mytable and ?_fts_pk=mycolumn query string options can be used to specify which FTS table to use for a search query - see Configuring full-text search for a table or view ( #428 ) \n \n \n \n \n \n \n \n You can now pass the same table filter multiple times - for example, ?content__not=world&content__not=hello will return all rows where the content column is neither hello nor world ( #288 ) \n \n \n You can now specify about and about_url metadata (in addition to source and license ) linking to further information about a project - see Source, license and about \n \n \n New ?_trace=1 parameter now adds debug information showing every SQL query that was executed while constructing the page ( #435 ) \n \n \n datasette inspect now just calculates table counts, and does not introspect other database metadata ( #462 ) \n \n \n Removed /-/inspect page entirely - this will be replaced by something similar in the future, see #465 \n \n \n Datasette can now run against an in-memory SQLite database. You can do this by starting it without passing any files or by using the new --memory option to datasette serve . 
This can be useful for experimenting with SQLite queries that do not access any data, such as SELECT 1+1 or SELECT sqlite_version() .", "breadcrumbs": "[\"Changelog\", \"0.28 (2019-05-19)\"]", "references": "[{\"href\": \"https://github.com/python/black\", \"label\": \"Black coding style\"}, {\"href\": \"https://github.com/simonw/datasette/pull/449\", \"label\": \"#449\"}, {\"href\": \"https://github.com/simonw/datasette/issues/433\", \"label\": \"#433\"}, {\"href\": \"https://github.com/simonw/datasette/commit/583b22aa28e26c318de0189312350ab2688c90b1\", \"label\": \"583b22a\"}, {\"href\": \"https://github.com/simonw/datasette/commit/78e45ead4d771007c57b307edf8fc920101f8733\", \"label\": \"78e45ea\"}, {\"href\": \"https://github.com/simonw/datasette/issues/429\", \"label\": \"#429\"}, {\"href\": \"https://github.com/simonw/datasette/issues/428\", \"label\": \"#428\"}, {\"href\": \"https://github.com/simonw/datasette/issues/288\", \"label\": \"#288\"}, {\"href\": \"https://github.com/simonw/datasette/issues/435\", \"label\": \"#435\"}, {\"href\": \"https://github.com/simonw/datasette/issues/462\", \"label\": \"#462\"}, {\"href\": \"https://github.com/simonw/datasette/issues/465\", \"label\": \"#465\"}]"} {"id": "changelog:id83", "page": "changelog", "ref": "id83", "title": "Small changes", "content": "We now show the size of the database file next to the download link ( #172 ) \n \n \n New /-/databases introspection page shows currently connected databases ( #470 ) \n \n \n Binary data is no longer displayed on the table and row pages ( #442 - thanks, Russ Garrett) \n \n \n New show/hide SQL links on custom query pages ( #415 ) \n \n \n The extra_body_script plugin hook now accepts an optional view_name argument ( #443 - thanks, Russ Garrett) \n \n \n Bumped Jinja2 dependency to 2.10.1 ( #426 ) \n \n \n All table filters are now documented, and documentation is enforced via unit tests ( 2c19a27 ) \n \n \n New project guideline: master should stay shippable at all times! 
( 31f36e1 ) \n \n \n Fixed a bug where sqlite_timelimit() occasionally failed to clean up after itself ( bac4e01 ) \n \n \n We no longer load additional plugins when executing pytest ( #438 ) \n \n \n Homepage now links to database views if there are less than five tables in a database ( #373 ) \n \n \n The --cors option is now respected by error pages ( #453 ) \n \n \n datasette publish heroku now uses the --include-vcs-ignore option, which means it works under Travis CI ( #407 ) \n \n \n datasette publish heroku now publishes using Python 3.6.8 ( 666c374 ) \n \n \n Renamed datasette publish now to datasette publish nowv1 ( #472 ) \n \n \n datasette publish nowv1 now accepts multiple --alias parameters ( 09ef305 ) \n \n \n Removed the datasette skeleton command ( #476 ) \n \n \n The documentation on how to build the documentation now recommends sphinx-autobuild", "breadcrumbs": "[\"Changelog\", \"0.28 (2019-05-19)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/172\", \"label\": \"#172\"}, {\"href\": \"https://github.com/simonw/datasette/issues/470\", \"label\": \"#470\"}, {\"href\": \"https://github.com/simonw/datasette/pull/442\", \"label\": \"#442\"}, {\"href\": \"https://github.com/simonw/datasette/issues/415\", \"label\": \"#415\"}, {\"href\": \"https://github.com/simonw/datasette/pull/443\", \"label\": \"#443\"}, {\"href\": \"https://github.com/simonw/datasette/pull/426\", \"label\": \"#426\"}, {\"href\": \"https://github.com/simonw/datasette/commit/2c19a27d15a913e5f3dd443f04067169a6f24634\", \"label\": \"2c19a27\"}, {\"href\": \"https://github.com/simonw/datasette/commit/31f36e1b97ccc3f4387c80698d018a69798b6228\", \"label\": \"31f36e1\"}, {\"href\": \"https://github.com/simonw/datasette/commit/bac4e01f40ae7bd19d1eab1fb9349452c18de8f5\", \"label\": \"bac4e01\"}, {\"href\": \"https://github.com/simonw/datasette/issues/438\", \"label\": \"#438\"}, {\"href\": \"https://github.com/simonw/datasette/issues/373\", \"label\": \"#373\"}, {\"href\": \"https://github.com/simonw/datasette/issues/453\", \"label\": \"#453\"}, {\"href\": \"https://github.com/simonw/datasette/pull/407\", \"label\": \"#407\"}, {\"href\": \"https://github.com/simonw/datasette/commit/666c37415a898949fae0437099d62a35b1e9c430\", \"label\": \"666c374\"}, {\"href\": \"https://github.com/simonw/datasette/issues/472\", \"label\": \"#472\"}, {\"href\": \"https://github.com/simonw/datasette/commit/09ef305c687399384fe38487c075e8669682deb4\", \"label\": \"09ef305\"}, {\"href\": \"https://github.com/simonw/datasette/issues/476\", \"label\": \"#476\"}]"} {"id": "changelog:asgi", "page": "changelog", "ref": "asgi", "title": "ASGI", "content": "ASGI is the Asynchronous Server Gateway Interface standard. I've been wanting to convert Datasette into an ASGI application for over a year - Port Datasette to ASGI #272 tracks thirteen months of intermittent development - but with Datasette 0.29 the change is finally released. This also means Datasette now runs on top of Uvicorn and no longer depends on Sanic . \n I wrote about the significance of this change in Porting Datasette to ASGI, and Turtles all the way down . 
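\n For readers unfamiliar with the standard, a minimal ASGI application - shown here purely as an illustration of the protocol, not as Datasette's own code - looks like this: \n async def application(scope, receive, send):
    # An ASGI app is an async callable that receives a connection
    # scope plus channels for receiving and sending protocol events
    assert scope[\"type\"] == \"http\"
    await send(
        {
            \"type\": \"http.response.start\",
            \"status\": 200,
            \"headers\": [(b\"content-type\", b\"text/plain\")],
        }
    )
    await send({\"type\": \"http.response.body\", \"body\": b\"Hello, ASGI\"}) \n Uvicorn speaks this protocol natively, which is why Datasette can now run directly on top of it.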
\n The most exciting consequence of this change is that Datasette plugins can now take advantage of the ASGI standard.", "breadcrumbs": "[\"Changelog\", \"0.29 (2019-07-07)\"]", "references": "[{\"href\": \"https://asgi.readthedocs.io/\", \"label\": \"ASGI\"}, {\"href\": \"https://github.com/simonw/datasette/issues/272\", \"label\": \"Port Datasette to ASGI #272\"}, {\"href\": \"https://www.uvicorn.org/\", \"label\": \"Uvicorn\"}, {\"href\": \"https://github.com/huge-success/sanic\", \"label\": \"Sanic\"}, {\"href\": \"https://simonwillison.net/2019/Jun/23/datasette-asgi/\", \"label\": \"Porting Datasette to ASGI, and Turtles all the way down\"}]"} {"id": "changelog:new-plugin-hook-asgi-wrapper", "page": "changelog", "ref": "new-plugin-hook-asgi-wrapper", "title": "New plugin hook: asgi_wrapper", "content": "The asgi_wrapper(datasette) plugin hook allows plugins to entirely wrap the Datasette ASGI application in their own ASGI middleware. ( #520 ) \n Two new plugins take advantage of this hook: \n \n \n datasette-auth-github adds an authentication layer: users will have to sign in using their GitHub account before they can view data or interact with Datasette. You can also use it to restrict access to specific GitHub users, or to members of specified GitHub organizations or teams . \n \n \n datasette-cors allows you to configure CORS headers for your Datasette instance. You can use this to enable JavaScript running on a whitelisted set of domains to make fetch() calls to the JSON API provided by your Datasette instance.", "breadcrumbs": "[\"Changelog\", \"0.29 (2019-07-07)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/520\", \"label\": \"#520\"}, {\"href\": \"https://github.com/simonw/datasette-auth-github\", \"label\": \"datasette-auth-github\"}, {\"href\": \"https://help.github.com/en/articles/about-organizations\", \"label\": \"organizations\"}, {\"href\": \"https://help.github.com/en/articles/organizing-members-into-teams\", \"label\": \"teams\"}, {\"href\": \"https://github.com/simonw/datasette-cors\", \"label\": \"datasette-cors\"}, {\"href\": \"https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS\", \"label\": \"CORS headers\"}]"} {"id": "changelog:new-plugin-hook-extra-template-vars", "page": "changelog", "ref": "new-plugin-hook-extra-template-vars", "title": "New plugin hook: extra_template_vars", "content": "The extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook allows plugins to inject their own additional variables into the Datasette template context. This can be used in conjunction with custom templates to customize the Datasette interface. datasette-auth-github uses this hook to add custom HTML to the new top navigation bar (which is designed to be modified by plugins, see #540 ).", "breadcrumbs": "[\"Changelog\", \"0.29 (2019-07-07)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-auth-github\", \"label\": \"datasette-auth-github\"}, {\"href\": \"https://github.com/simonw/datasette/issues/540\", \"label\": \"#540\"}]"} {"id": "changelog:secret-plugin-configuration-options", "page": "changelog", "ref": "secret-plugin-configuration-options", "title": "Secret plugin configuration options", "content": "Plugins like datasette-auth-github need a safe way to set secret configuration options. Since the default mechanism for configuring plugins exposes those settings in /-/metadata a new mechanism was needed. 
Secret configuration values describes how plugins can now specify that their settings should be read from a file or an environment variable: \n {\n \"plugins\": {\n \"datasette-auth-github\": {\n \"client_secret\": {\n \"$env\": \"GITHUB_CLIENT_SECRET\"\n }\n }\n }\n} \n These plugin secrets can be set directly using datasette publish . See Custom metadata and plugins for details. ( #538 and #543 )", "breadcrumbs": "[\"Changelog\", \"0.29 (2019-07-07)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-auth-github\", \"label\": \"datasette-auth-github\"}, {\"href\": \"https://github.com/simonw/datasette/issues/538\", \"label\": \"#538\"}, {\"href\": \"https://github.com/simonw/datasette/issues/543\", \"label\": \"#543\"}]"} {"id": "changelog:facet-by-date", "page": "changelog", "ref": "facet-by-date", "title": "Facet by date", "content": "If a column contains datetime values, Datasette can now facet that column by date. ( #481 )", "breadcrumbs": "[\"Changelog\", \"0.29 (2019-07-07)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/481\", \"label\": \"#481\"}]"} {"id": "changelog:v0-29-medium-changes", "page": "changelog", "ref": "v0-29-medium-changes", "title": "Easier custom templates for table rows", "content": "If you want to customize the display of individual table rows, you can do so using a _table.html template include that looks something like this: \n {% for row in display_rows %}\n
<div>\n
    <h2>{{ row[\"title\"] }}</h2>\n
    <p>{{ row[\"description\"] }}</p>\n
    <p>Category: {{ row.display(\"category_id\") }}</p>\n
</div>
\n{% endfor %} \n This is a backwards incompatible change . If you previously had a custom template called _rows_and_columns.html you need to rename it to _table.html . \n See Custom templates for full details.", "breadcrumbs": "[\"Changelog\", \"0.29 (2019-07-07)\"]", "references": "[]"} {"id": "changelog:through-for-joins-through-many-to-many-tables", "page": "changelog", "ref": "through-for-joins-through-many-to-many-tables", "title": "?_through= for joins through many-to-many tables", "content": "The new ?_through={json} argument to the Table view allows records to be filtered based on a many-to-many relationship. See Special table arguments for full documentation - here's an example . ( #355 ) \n This feature was added to help support facet by many-to-many , which isn't quite ready yet but will be coming in the next Datasette release.", "breadcrumbs": "[\"Changelog\", \"0.29 (2019-07-07)\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures/roadside_attractions?_through={%22table%22:%22roadside_attraction_characteristics%22,%22column%22:%22characteristic_id%22,%22value%22:%221%22}\", \"label\": \"an example\"}, {\"href\": \"https://github.com/simonw/datasette/issues/355\", \"label\": \"#355\"}, {\"href\": \"https://github.com/simonw/datasette/issues/551\", \"label\": \"facet by many-to-many\"}]"} {"id": "changelog:small-changes", "page": "changelog", "ref": "small-changes", "title": "Small changes", "content": "Databases published using datasette publish now open in Immutable mode . ( #469 ) \n \n \n ?col__date= now works for columns containing spaces \n \n \n Automatic label detection (for deciding which column to show when linking to a foreign key) has been improved. ( #485 ) \n \n \n Fixed bug where pagination broke when combined with an expanded foreign key. ( #489 ) \n \n \n Contributors can now run pip install -e .[docs] to get all of the dependencies needed to build the documentation, including cd docs && make livehtml support. \n \n \n Datasette's dependencies are now all specified using the ~= match operator. ( #532 ) \n \n \n white-space: pre-wrap now used for table creation SQL. ( #505 ) \n \n \n Full list of commits between 0.28 and 0.29.", "breadcrumbs": "[\"Changelog\", \"0.29 (2019-07-07)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/469\", \"label\": \"#469\"}, {\"href\": \"https://github.com/simonw/datasette/issues/485\", \"label\": \"#485\"}, {\"href\": \"https://github.com/simonw/datasette/issues/489\", \"label\": \"#489\"}, {\"href\": \"https://github.com/simonw/datasette/issues/532\", \"label\": \"#532\"}, {\"href\": \"https://github.com/simonw/datasette/issues/505\", \"label\": \"#505\"}, {\"href\": \"https://github.com/simonw/datasette/compare/0.28...0.29\", \"label\": \"Full list of commits\"}]"} {"id": "changelog:authentication", "page": "changelog", "ref": "authentication", "title": "Authentication", "content": "Prior to this release the Datasette ecosystem has treated authentication as exclusively the realm of plugins, most notably through datasette-auth-github . \n 0.44 introduces Authentication and permissions as core Datasette concepts ( #699 ). This enables different plugins to share responsibility for authenticating requests - you might have one plugin that handles user accounts and another one that allows automated access via API keys, for example. 
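\n As a rough sketch of what that second kind of plugin might look like - using the actor_from_request hook described below, with a hypothetical token store - consider: \n from datasette import hookimpl

# Hypothetical mapping of API tokens to actor dictionaries - a real
# plugin would load these from somewhere safer than source code
API_TOKENS = {\"token-one\": {\"id\": \"api-client-1\"}}

@hookimpl
def actor_from_request(datasette, request):
    # Return an actor for a recognized ?_token= value, or None so
    # that other authentication plugins get a chance at the request
    return API_TOKENS.get(request.args.get(\"_token\"))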
\n You'll need to install plugins if you want full user accounts, but default Datasette can now authenticate a single root user with the new --root command-line option, which outputs a one-time use URL to authenticate as a root actor ( #784 ): \n $ datasette fixtures.db --root\nhttp://127.0.0.1:8001/-/auth-token?token=5b632f8cd44b868df625f5a6e2185d88eea5b22237fd3cc8773f107cc4fd6477\nINFO: Started server process [14973]\nINFO: Waiting for application startup.\nINFO: Application startup complete.\nINFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) \n Plugins can implement new ways of authenticating users using the new actor_from_request(datasette, request) hook.", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-auth-github\", \"label\": \"datasette-auth-github\"}, {\"href\": \"https://github.com/simonw/datasette/issues/699\", \"label\": \"#699\"}, {\"href\": \"https://github.com/simonw/datasette/issues/784\", \"label\": \"#784\"}]"} {"id": "changelog:permissions", "page": "changelog", "ref": "permissions", "title": "Permissions", "content": "Datasette also now has a built-in concept of Permissions . The permissions system answers the following question: \n \n Is this actor allowed to perform this action , optionally against this particular resource ? \n \n You can use the new \"allow\" block syntax in metadata.json (or metadata.yaml ) to set required permissions at the instance, database, table or canned query level. For example, to restrict access to the fixtures.db database to the \"root\" user: \n {\n \"databases\": {\n \"fixtures\": {\n \"allow\": {\n \"id\": \"root\"\n }\n }\n }\n} \n See Defining permissions with \"allow\" blocks for more details. \n Plugins can implement their own custom permission checks using the new permission_allowed(datasette, actor, action, resource) hook. \n A new debug page at /-/permissions shows recent permission checks, to help administrators and plugin authors understand exactly what checks are being performed. This tool defaults to only being available to the root user, but can be exposed to other users by plugins that respond to the permissions-debug permission. ( #788 )", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/788\", \"label\": \"#788\"}]"} {"id": "changelog:writable-canned-queries", "page": "changelog", "ref": "writable-canned-queries", "title": "Writable canned queries", "content": "Datasette's Canned queries feature lets you define SQL queries in metadata.json which can then be executed by users visiting a specific URL. https://latest.datasette.io/fixtures/neighborhood_search for example. 
\n Canned queries were previously restricted to SELECT , but Datasette 0.44 introduces the ability for canned queries to execute INSERT or UPDATE queries as well, using the new \"write\": true property ( #800 ): \n {\n \"databases\": {\n \"dogs\": {\n \"queries\": {\n \"add_name\": {\n \"sql\": \"INSERT INTO names (name) VALUES (:name)\",\n \"write\": true\n }\n }\n }\n }\n} \n See Writable canned queries for more details.", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures/neighborhood_search\", \"label\": \"https://latest.datasette.io/fixtures/neighborhood_search\"}, {\"href\": \"https://github.com/simonw/datasette/issues/800\", \"label\": \"#800\"}]"} {"id": "changelog:flash-messages", "page": "changelog", "ref": "flash-messages", "title": "Flash messages", "content": "Writable canned queries needed a mechanism to let the user know that the query has been successfully executed. The new flash messaging system ( #790 ) allows messages to persist in signed cookies which are then displayed to the user on the next page that they visit. Plugins can use this mechanism to display their own messages, see .add_message(request, message, type=datasette.INFO) for details. \n You can try out the new messages using the /-/messages debug tool, for example at https://latest.datasette.io/-/messages", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/790\", \"label\": \"#790\"}, {\"href\": \"https://latest.datasette.io/-/messages\", \"label\": \"https://latest.datasette.io/-/messages\"}]"} {"id": "changelog:signed-values-and-secrets", "page": "changelog", "ref": "signed-values-and-secrets", "title": "Signed values and secrets", "content": "Both flash messages and user authentication needed a way to sign values and set signed cookies. Two new methods are now available for plugins to take advantage of this mechanism: .sign(value, namespace=\"default\") and .unsign(value, namespace=\"default\") . \n Datasette will generate a secret automatically when it starts up, but to avoid resetting the secret (and hence invalidating any cookies) every time the server restarts you should set your own secret. You can pass a secret to Datasette using the new --secret option or with a DATASETTE_SECRET environment variable. See Configuring the secret for more details. \n You can also set a secret when you deploy Datasette using datasette publish or datasette package - see Using secrets with datasette publish . \n Plugins can now sign values and verify their signatures using the datasette.sign() and datasette.unsign() methods.", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[]"} {"id": "changelog:csrf-protection", "page": "changelog", "ref": "csrf-protection", "title": "CSRF protection", "content": "Since writable canned queries are built using POST forms, Datasette now ships with CSRF protection ( #798 ). This applies automatically to any POST request, which means plugins need to include a csrftoken in any POST forms that they render. They can do that like so: \n <input type=\"hidden\" name=\"csrftoken\" value=\"{{ csrftoken() }}\">", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/798\", \"label\": \"#798\"}]"} {"id": "changelog:cookie-methods", "page": "changelog", "ref": "cookie-methods", "title": "Cookie methods", "content": "Plugins can now use the new response.set_cookie() method to set cookies. 
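\n A rough sketch of the new method in use - the view function and cookie name here are hypothetical: \n from datasette.utils.asgi import Response

def set_locale(request):
    # Redirect back to the homepage, setting a cookie on the way out
    response = Response.redirect(\"/\")
    response.set_cookie(\"locale\", \"en-gb\")
    return response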
\n A new request.cookies property on the Request object can be used to read incoming cookies.", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[]"} {"id": "changelog:register-routes-plugin-hooks", "page": "changelog", "ref": "register-routes-plugin-hooks", "title": "register_routes() plugin hooks", "content": "Plugins can now register new views and routes via the register_routes(datasette) plugin hook ( #819 ). View functions can be defined that accept any of the current datasette object, the current request , or the ASGI scope , send and receive objects.", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/819\", \"label\": \"#819\"}]"} {"id": "changelog:id58", "page": "changelog", "ref": "id58", "title": "Smaller changes", "content": "New internals documentation for Request object and Response class . ( #706 ) \n \n \n request.url now respects the force_https_urls config setting. ( #781 ) \n \n \n request.args.getlist() returns [] if missing. Removed request.raw_args entirely. ( #774 ) \n \n \n New datasette.get_database() method. \n \n \n Added _ prefix to many private, undocumented methods of the Datasette class. ( #576 ) \n \n \n Removed the db.get_outbound_foreign_keys() method which duplicated the behaviour of db.foreign_keys_for_table() . \n \n \n New await datasette.permission_allowed() method. \n \n \n /-/actor debugging endpoint for viewing the currently authenticated actor. \n \n \n New request.cookies property. \n \n \n /-/plugins endpoint now shows a list of hooks implemented by each plugin, e.g. https://latest.datasette.io/-/plugins?all=1 \n \n \n request.post_vars() method no longer discards empty values. \n \n \n New \"params\" canned query key for explicitly setting named parameters, see Canned query parameters . ( #797 ) \n \n \n request.args is now a MultiParams object. \n \n \n Fixed a bug with the datasette plugins command. ( #802 ) \n \n \n Nicer pattern for using make_app_client() in tests. ( #395 ) \n \n \n New request.actor property. \n \n \n Fixed broken CSS on nested 404 pages. ( #777 ) \n \n \n New request.url_vars property. ( #822 ) \n \n \n Fixed a bug with the python tests/fixtures.py command for outputting Datasette's testing fixtures database and plugins. ( #804 ) \n \n \n datasette publish heroku now deploys using Python 3.8.3. \n \n \n Added a warning that the register_facet_classes() hook is unstable and may change in the future. ( #830 ) \n \n \n The {\"$env\": \"ENVIRONMENT_VARIABLE\"} mechanism (see Secret configuration values ) now works with variables inside nested lists. 
( #837 )", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/706\", \"label\": \"#706\"}, {\"href\": \"https://github.com/simonw/datasette/issues/781\", \"label\": \"#781\"}, {\"href\": \"https://github.com/simonw/datasette/issues/774\", \"label\": \"#774\"}, {\"href\": \"https://github.com/simonw/datasette/issues/576\", \"label\": \"#576\"}, {\"href\": \"https://latest.datasette.io/-/plugins?all=1\", \"label\": \"https://latest.datasette.io/-/plugins?all=1\"}, {\"href\": \"https://github.com/simonw/datasette/issues/797\", \"label\": \"#797\"}, {\"href\": \"https://github.com/simonw/datasette/issues/802\", \"label\": \"#802\"}, {\"href\": \"https://github.com/simonw/datasette/issues/395\", \"label\": \"#395\"}, {\"href\": \"https://github.com/simonw/datasette/issues/777\", \"label\": \"#777\"}, {\"href\": \"https://github.com/simonw/datasette/issues/822\", \"label\": \"#822\"}, {\"href\": \"https://github.com/simonw/datasette/issues/804\", \"label\": \"#804\"}, {\"href\": \"https://github.com/simonw/datasette/issues/830\", \"label\": \"#830\"}, {\"href\": \"https://github.com/simonw/datasette/issues/837\", \"label\": \"#837\"}]"} {"id": "changelog:the-road-to-datasette-1-0", "page": "changelog", "ref": "the-road-to-datasette-1-0", "title": "The road to Datasette 1.0", "content": "I've assembled a milestone for Datasette 1.0 . The focus of the 1.0 release will be the following: \n \n \n Signify confidence in the quality/stability of Datasette \n \n \n Give plugin authors confidence that their plugins will work for the whole 1.x release cycle \n \n \n Provide the same confidence to developers building against Datasette JSON APIs \n \n \n If you have thoughts about what you would like to see for Datasette 1.0 you can join the conversation on issue #519 .", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/milestone/7\", \"label\": \"milestone for Datasette 1.0\"}, {\"href\": \"https://github.com/simonw/datasette/issues/519\", \"label\": \"the conversation on issue #519\"}]"} {"id": "changelog:magic-parameters-for-canned-queries", "page": "changelog", "ref": "magic-parameters-for-canned-queries", "title": "Magic parameters for canned queries", "content": "Canned queries now support Magic parameters , which can be used to insert or select automatically generated values. For example: \n insert into logs\n (user_id, timestamp)\nvalues\n (:_actor_id, :_now_datetime_utc) \n This inserts the currently authenticated actor ID and the current datetime. ( #842 )", "breadcrumbs": "[\"Changelog\", \"0.45 (2020-07-01)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/842\", \"label\": \"#842\"}]"} {"id": "changelog:log-out", "page": "changelog", "ref": "log-out", "title": "Log out", "content": "The ds_actor cookie can be used by plugins (or by Datasette's --root mechanism ) to authenticate users. The new /-/logout page provides a way to clear that cookie. \n A \"Log out\" button now shows in the global navigation provided the user is authenticated using the ds_actor cookie. 
( #840 )", "breadcrumbs": "[\"Changelog\", \"0.45 (2020-07-01)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/840\", \"label\": \"#840\"}]"} {"id": "changelog:better-plugin-documentation", "page": "changelog", "ref": "better-plugin-documentation", "title": "Better plugin documentation", "content": "The plugin documentation has been re-arranged into four sections, including a brand new section on testing plugins. ( #687 ) \n \n \n Plugins introduces Datasette's plugin system and describes how to install and configure plugins. \n \n \n Writing plugins describes how to author plugins, from one-off single file plugins to packaged plugins that can be published to PyPI. It also describes how to start a plugin using the new datasette-plugin cookiecutter template. \n \n \n Plugin hooks is a full list of detailed documentation for every Datasette plugin hook. \n \n \n Testing plugins describes how to write tests for Datasette plugins, using pytest and HTTPX .", "breadcrumbs": "[\"Changelog\", \"0.45 (2020-07-01)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/687\", \"label\": \"#687\"}, {\"href\": \"https://github.com/simonw/datasette-plugin\", \"label\": \"datasette-plugin\"}, {\"href\": \"https://docs.pytest.org/\", \"label\": \"pytest\"}, {\"href\": \"https://www.python-httpx.org/\", \"label\": \"HTTPX\"}]"} {"id": "changelog:new-plugin-hooks", "page": "changelog", "ref": "new-plugin-hooks", "title": "New plugin hooks", "content": "register_magic_parameters(datasette) can be used to define new types of magic canned query parameters. \n \n \n startup(datasette) can run custom code when Datasette first starts up. datasette-init is a new plugin that uses this hook to create database tables and views on startup if they have not yet been created. ( #834 ) \n \n \n canned_queries(datasette, database, actor) lets plugins provide additional canned queries beyond those defined in Datasette's metadata. See datasette-saved-queries for an example of this hook in action. ( #852 ) \n \n \n forbidden(datasette, request, message) is a hook for customizing how Datasette responds to 403 forbidden errors. ( #812 )", "breadcrumbs": "[\"Changelog\", \"0.45 (2020-07-01)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-init\", \"label\": \"datasette-init\"}, {\"href\": \"https://github.com/simonw/datasette/issues/834\", \"label\": \"#834\"}, {\"href\": \"https://github.com/simonw/datasette-saved-queries\", \"label\": \"datasette-saved-queries\"}, {\"href\": \"https://github.com/simonw/datasette/issues/852\", \"label\": \"#852\"}, {\"href\": \"https://github.com/simonw/datasette/issues/812\", \"label\": \"#812\"}]"} {"id": "changelog:id56", "page": "changelog", "ref": "id56", "title": "Smaller changes", "content": "Cascading view permissions - so if a user has view-table they can view the table page even if they do not have view-database or view-instance . ( #832 ) \n \n \n CSRF protection no longer applies to Authentication: Bearer token requests or requests without cookies. ( #835 ) \n \n \n datasette.add_message() now works inside plugins. ( #864 ) \n \n \n Workaround for \"Too many open files\" error in test runs. ( #846 ) \n \n \n Respect existing scope[\"actor\"] if already set by ASGI middleware. ( #854 ) \n \n \n New process for shipping Alpha and beta releases . ( #807 ) \n \n \n {{ csrftoken() }} now works when plugins render a template using datasette.render_template(..., request=request) . 
( #863 ) \n \n \n Datasette now creates a single Request object and uses it throughout the lifetime of the current HTTP request. ( #870 )", "breadcrumbs": "[\"Changelog\", \"0.45 (2020-07-01)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/832\", \"label\": \"#832\"}, {\"href\": \"https://github.com/simonw/datasette/issues/835\", \"label\": \"#835\"}, {\"href\": \"https://github.com/simonw/datasette/issues/864\", \"label\": \"#864\"}, {\"href\": \"https://github.com/simonw/datasette/issues/846\", \"label\": \"#846\"}, {\"href\": \"https://github.com/simonw/datasette/issues/854\", \"label\": \"#854\"}, {\"href\": \"https://github.com/simonw/datasette/issues/807\", \"label\": \"#807\"}, {\"href\": \"https://github.com/simonw/datasette/issues/863\", \"label\": \"#863\"}, {\"href\": \"https://github.com/simonw/datasette/issues/870\", \"label\": \"#870\"}]"} {"id": "changelog:new-visual-design", "page": "changelog", "ref": "new-visual-design", "title": "New visual design", "content": "Datasette is no longer white and grey with blue and purple links! Natalie Downe has been working on a visual refresh, the first iteration of which is included in this release. ( #1056 )", "breadcrumbs": "[\"Changelog\", \"0.51 (2020-10-31)\"]", "references": "[{\"href\": \"https://twitter.com/natbat\", \"label\": \"Natalie Downe\"}, {\"href\": \"https://github.com/simonw/datasette/pull/1056\", \"label\": \"#1056\"}]"} {"id": "changelog:plugins-can-now-add-links-within-datasette", "page": "changelog", "ref": "plugins-can-now-add-links-within-datasette", "title": "Plugins can now add links within Datasette", "content": "A number of existing Datasette plugins add new pages to the Datasette interface, providing tools for things like uploading CSVs , editing table schemas or configuring full-text search . \n Plugins like this can now link to themselves from other parts of the Datasette interface. The menu_links(datasette, actor, request) hook ( #1064 ) lets plugins add links to Datasette's new top-right application menu, and the table_actions(datasette, actor, database, table, request) hook ( #1066 ) adds links to a new \"table actions\" menu on the table page. \n The demo at latest.datasette.io now includes some example plugins. To see the new table actions menu first sign into that demo as root and then visit the facetable table to see the new cog icon menu at the top of the page.", "breadcrumbs": "[\"Changelog\", \"0.51 (2020-10-31)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-upload-csvs\", \"label\": \"uploading CSVs\"}, {\"href\": \"https://github.com/simonw/datasette-edit-schema\", \"label\": \"editing table schemas\"}, {\"href\": \"https://github.com/simonw/datasette-configure-fts\", \"label\": \"configuring full-text search\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1064\", \"label\": \"#1064\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1066\", \"label\": \"#1066\"}, {\"href\": \"https://latest.datasette.io/\", \"label\": \"latest.datasette.io\"}, {\"href\": \"https://latest.datasette.io/login-as-root\", \"label\": \"sign into that demo as root\"}, {\"href\": \"https://latest.datasette.io/fixtures/facetable\", \"label\": \"facetable\"}]"} {"id": "changelog:binary-data", "page": "changelog", "ref": "binary-data", "title": "Binary data", "content": "SQLite tables can contain binary data in BLOB columns. 
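\n For example, binary content can be written to a BLOB column with Python's sqlite3 module like this (the file and table names here are illustrative): \n import sqlite3

conn = sqlite3.connect(\"binary_demo.db\")
conn.execute(\"create table if not exists files (name text, content blob)\")
# Raw bytes are stored directly in the BLOB column
conn.execute(
    \"insert into files (name, content) values (?, ?)\",
    (\"demo.bin\", b\"demo bytes\"),
)
conn.commit()
conn.close()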
Datasette now provides links for users to download this data directly from Datasette, and uses those links to make binary data available from CSV exports. See Binary data for more details. ( #1036 and #1034 ).", "breadcrumbs": "[\"Changelog\", \"0.51 (2020-10-31)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1036\", \"label\": \"#1036\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1034\", \"label\": \"#1034\"}]"} {"id": "changelog:url-building", "page": "changelog", "ref": "url-building", "title": "URL building", "content": "The new datasette.urls family of methods can be used to generate URLs to key pages within the Datasette interface, both within custom templates and Datasette plugins. See Building URLs within plugins for more details. ( #904 )", "breadcrumbs": "[\"Changelog\", \"0.51 (2020-10-31)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/904\", \"label\": \"#904\"}]"} {"id": "changelog:running-datasette-behind-a-proxy", "page": "changelog", "ref": "running-datasette-behind-a-proxy", "title": "Running Datasette behind a proxy", "content": "The base_url configuration option is designed to help run Datasette on a specific path behind a proxy - for example if you want to run an instance of Datasette at /my-datasette/ within your existing site's URL hierarchy, proxied behind nginx or Apache. \n Support for this configuration option has been greatly improved ( #1023 ), and guidelines for using it are now available in a new documentation section on Running Datasette behind a proxy . ( #1027 )", "breadcrumbs": "[\"Changelog\", \"0.51 (2020-10-31)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1023\", \"label\": \"#1023\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1027\", \"label\": \"#1027\"}]"} {"id": "changelog:smaller-changes", "page": "changelog", "ref": "smaller-changes", "title": "Smaller changes", "content": "Wide tables shown within Datasette now scroll horizontally ( #998 ). This is achieved using a new
element which may impact the implementation of some plugins (for example this change to datasette-cluster-map ). \n \n \n New debug-menu permission. ( #1068 ) \n \n \n Removed --debug option, which didn't do anything. ( #814 ) \n \n \n Link: HTTP header pagination. ( #1014 ) \n \n \n x button for clearing filters. ( #1016 ) \n \n \n Edit SQL button on canned queries, ( #1019 ) \n \n \n --load-extension=spatialite shortcut. ( #1028 ) \n \n \n scale-in animation for column action menu. ( #1039 ) \n \n \n Option to pass a list of templates to .render_template() is now documented. ( #1045 ) \n \n \n New datasette.urls.static_plugins() method. ( #1033 ) \n \n \n datasette -o option now opens the most relevant page. ( #976 ) \n \n \n datasette --cors option now enables access to /database.db downloads. ( #1057 ) \n \n \n Database file downloads now implement cascading permissions, so you can download a database if you have view-database-download permission even if you do not have permission to access the Datasette instance. ( #1058 ) \n \n \n New documentation on Designing URLs for your plugin . ( #1053 )", "breadcrumbs": "[\"Changelog\", \"0.51 (2020-10-31)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/998\", \"label\": \"#998\"}, {\"href\": \"https://github.com/simonw/datasette-cluster-map/commit/fcb4abbe7df9071c5ab57defd39147de7145b34e\", \"label\": \"this change to datasette-cluster-map\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1068\", \"label\": \"#1068\"}, {\"href\": \"https://github.com/simonw/datasette/issues/814\", \"label\": \"#814\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1014\", \"label\": \"#1014\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1016\", \"label\": \"#1016\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1019\", \"label\": \"#1019\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1028\", \"label\": \"#1028\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1039\", \"label\": \"#1039\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1045\", \"label\": \"#1045\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1033\", \"label\": \"#1033\"}, {\"href\": \"https://github.com/simonw/datasette/issues/976\", \"label\": \"#976\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1057\", \"label\": \"#1057\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1058\", \"label\": \"#1058\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1053\", \"label\": \"#1053\"}]"} {"id": "changelog:the-internal-database", "page": "changelog", "ref": "the-internal-database", "title": "The _internal database", "content": "As part of ongoing work to help Datasette handle much larger numbers of connected databases and tables (see Datasette Library ) Datasette now maintains an in-memory SQLite database with details of all of the attached databases, tables, columns, indexes and foreign keys. ( #1150 ) \n This will support future improvements such as a searchable, paginated homepage of all available tables. \n You can explore an example of this database by signing in as root to the latest.datasette.io demo instance and then navigating to latest.datasette.io/_internal . \n Plugins can use these tables to introspect attached data in an efficient way. 
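\n A rough sketch of that pattern, assuming the current (pre-1.0) _internal table and column names: \n from datasette import hookimpl

@hookimpl
def startup(datasette):
    async def inner():
        # The shared in-memory catalogue is exposed as a database
        # named \"_internal\"
        internal = datasette.get_database(\"_internal\")
        result = await internal.execute(
            \"select database_name, table_name from tables\"
        )
        for database_name, table_name in result.rows:
            print(database_name, table_name)

    return inner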
Plugin authors should note that this is not yet considered a stable interface, so any plugins that use this may need to make changes prior to Datasette 1.0 if the _internal table schemas change.", "breadcrumbs": "[\"Changelog\", \"0.54 (2021-01-25)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/417\", \"label\": \"Datasette Library\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1150\", \"label\": \"#1150\"}, {\"href\": \"https://latest.datasette.io/login-as-root\", \"label\": \"signing in as root\"}, {\"href\": \"https://latest.datasette.io/_internal\", \"label\": \"latest.datasette.io/_internal\"}]"} {"id": "changelog:named-in-memory-database-support", "page": "changelog", "ref": "named-in-memory-database-support", "title": "Named in-memory database support", "content": "As part of the work building the _internal database, Datasette now supports named in-memory databases that can be shared across multiple connections. This allows plugins to create in-memory databases which will persist data for the lifetime of the Datasette server process. ( #1151 ) \n The new memory_name= parameter to the Database class can be used to create named, shared in-memory databases.", "breadcrumbs": "[\"Changelog\", \"0.54 (2021-01-25)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1151\", \"label\": \"#1151\"}]"} {"id": "changelog:javascript-modules", "page": "changelog", "ref": "javascript-modules", "title": "JavaScript modules", "content": "JavaScript modules were introduced in ECMAScript 2015 and provide native browser support for the import and export keywords. \n To use modules, JavaScript needs to be included in