From 2a24a930d64255d99c1e755e0501a74ba304fcf0 Mon Sep 17 00:00:00 2001 From: Stephen Kraffmiller Date: Fri, 23 Sep 2022 13:49:31 -0400 Subject: [PATCH 01/10] #8979 add summary of features/fixes --- doc/release-notes/5.12-release-notes.md | 74 +++++++++++++++++++++++++ 1 file changed, 74 insertions(+) create mode 100644 doc/release-notes/5.12-release-notes.md diff --git a/doc/release-notes/5.12-release-notes.md b/doc/release-notes/5.12-release-notes.md new file mode 100644 index 00000000000..f2d7b474ed6 --- /dev/null +++ b/doc/release-notes/5.12-release-notes.md @@ -0,0 +1,74 @@ +# Dataverse Software 5.12 + +This release brings new features, enhancements, and bug fixes to the Dataverse Software. Thank you to all of the community members who contributed code, suggestions, bug reports, and other assistance across the project. + +## Release Highlights + +### Harvard Data Commons Additions + +As reported at the 2022 Dataverse Community Meeting, the [Harvard Data Commons](https://sites.harvard.edu/harvard-data-commons/) project has supported a wide range of additions to the Dataverse software that improve support for Big Data, Workflows, Archiving, and interaction with other repositories. In many cases, these additions build upon features developed within the Dataverse community by Borealis, DANS, QDR, TDL, and others. Highlights from this work include: + +- Initial support for Globus file transfer to upload to and download from a Dataverse managed S3 store. The current implementation disables file restriction and embargo on Globus-enabled stores. +- Initial support for Remote File Storage. This capability, enabled via a new RemoteOverlay store type, allows a file stored in a remote system to be added to a dataset (currently only via API) with download requests redirected to the remote system. Use cases include referencing public files hosted on external web servers as well as support for controlled access managed by Dataverse (e.g. 
via restricted and embargoed status) and/or by the remote store. +- Initial support for computational workflows, including a new metadata block and detected filetypes. +- Support for archiving to any S3 store using Dataverse's RDA-conformant BagIT file format (a BagPack). +- Improved error handling and performance in archival bag creation and new options such as only supporting archiving of one dataset version. +- Additions/corrections to the OAI-ORE metadata format (which is included in archival bags) such as referencing the name/mimetype/size/checksum/download URL of the original file for ingested files, the inclusion of metadata about the parent collection(s) of an archived dataset version, and use of the URL form of PIDs. +- Display of archival status within the dataset page versions table, richer status options including success, pending, and failure states, with a complete API for managing archival status. +- Support for batch archiving via API as an alternative to the current options of configuring archiving upon publication or archiving each dataset version manually. +- Initial support for sending and receiving Linked Data Notification messages indicating relationships between a dataset and external resources (e.g. papers or other datasets) that can be used to trigger additional actions, such as the creation of a back-link to provide, for example, bi-directional linking between a published paper and a Dataverse dataset. +- A new capability to provide custom per-field instructions in dataset templates. + +### Update to Payara 5.2022.3 highly recommended + +With lots of bug and security fixes included, we encourage everyone to update to Payara 5.2022.3 as soon as possible. + +### Improvements to fields that appear in the Citation metadata block + +Grammar, style, and consistency improvements have been made to the titles, tooltip description text, and watermarks of metadata fields that appear in the Citation metadata block.
+ +This includes fields that dataset depositors can edit in the Citation Metadata accordion (i.e. fields controlled by the citation.tsv and citation.properties files) and fields whose values are system-generated, such as the Dataset Persistent ID, Previous Dataset Persistent ID, and Publication Date fields, whose titles and tooltips are configured in the bundles.properties file. + +The changes should give curators, depositors, and people looking for data clearer information about what the fields are for. + +A new page in the Style Guides called "Text" has also been added. The new page includes a section called "Metadata Text Guidelines" with a link to a Google Doc where the guidelines are being maintained for now, since we expect them to be revised frequently. + +### Adding new static search facet: Metadata Types +A new static search facet has been added to the search side panel. This new facet is called "Metadata Types" and is driven from metadata blocks. When a metadata field value is inserted into a dataset, an entry for the metadata block it belongs to is added to this new facet. + +This new facet needs to be configured for it to appear on the search side panel. The configuration specifies, for a given dataverse collection, which metadata blocks to show, and is inherited by child dataverses. + +To configure the new facet, use the Metadata Block Facet API documented in the API Guide. + +### Broader MicroProfile Config Support for Developers + +As of this release, many [JVM options](https://guides.dataverse.org/en/latest/installation/config.html#jvm-options) +can be set using any [MicroProfile Config Source](https://docs.payara.fish/community/docs/Technical%20Documentation/MicroProfile/Config/Overview.html#config-sources).
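As a small illustration of the two styles, the sketch below uses `dataverse.fqdn` as an example option; which options actually support the new lookup pattern is documented in the guides, and the environment-variable name mapping follows the MicroProfile Config specification (dots become underscores, names are upper-cased).

```shell
# Example only: dataverse.fqdn stands in for any JVM option that has
# been migrated to the MicroProfile lookup pattern.

# Classic approach: a JVM system property (command printed, not run here)
echo 'asadmin create-jvm-options "-Ddataverse.fqdn=dataverse.example.edu"'

# MicroProfile approach: an environment variable, per the spec's
# name-mapping rules for config sources
export DATAVERSE_FQDN=dataverse.example.edu
echo "DATAVERSE_FQDN=$DATAVERSE_FQDN"
```

Either source should yield the same effective value once an option has been migrated.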
+ +Currently this change is only relevant to developers, but as settings are migrated to the new "lookup" pattern documented in the [Consuming Configuration](https://guides.dataverse.org/en/latest/developers/configuration.html) section of the Developer Guide, anyone installing the Dataverse software will have much greater flexibility when configuring those settings, especially within containers. These changes will be announced in future releases. + +Please note that an upgrade to Payara 5.2021.8 or higher is required to make use of this. Payara 5.2021.5 threw exceptions, as explained in PR #8823. + +## Major Use Cases and Infrastructure Enhancements + +Changes and fixes in this release include: + +- Administrators can configure an S3 store used in Dataverse to support users uploading/downloading files via Globus File Transfer. (PR #8891) +- Administrators can configure a RemoteOverlay store to allow files that remain hosted by a remote system to be added to a dataset. (PR #7325) +- Administrators can configure the Dataverse software to send archival Bag copies of published dataset versions to any S3-compatible service. (PR #8751) +- Users can see information about a dataset's parent collection(s) in the OAI-ORE metadata export. (PR #8770) +- Users and administrators can now use the OAI-ORE metadata export to retrieve and assess the fixity of the original file (for ingested tabular files) via the included checksum. (PR #8901) +- Archiving via RDA-conformant Bags is more robust and more configurable. (PR #8773, #8747, #8699, #8609, #8606, #8610) +- Users and administrators can see the archival status of the versions of the datasets they manage in the dataset page version table. (PR #8748, #8696) +- Administrators can configure messaging between their Dataverse installation and other repositories that may hold related resources or services interested in activity within that installation.
(PR #8775) +- Collection managers can create templates that include custom instructions on how to fill out specific metadata fields. +- Dataset update API users are given more information when the dataset they are updating is out of compliance with Terms of Access requirements (Issue #8859) +- Adds a new setting (:ControlledVocabularyCustomJavaScript) that allows a JavaScript file to be loaded into the dataset page for the purpose of showing controlled vocabulary as a list (Issue #8722) +- Fixes an issue with the Redetect File Type API (Issue #7527) +- Updates the Import Dataset DDI API to allow the import of Terms of Use (Issue #8715) +- Optimizes some code to improve application memory usage (Issue #8871) +- Fixes sample data to reflect custom licenses. +- Fixes the Archival Status Input API (available to superusers) (Issue #8924) +- Small bugs have been fixed in the dataset export in the JSON and DDI formats; eliminating the export of "undefined" as a metadata language in the former, and a duplicate keyword tag in the latter. (Issue #8868) + + From 24705f82f46454d227ee69db432fedd10c607a76 Mon Sep 17 00:00:00 2001 From: Stephen Kraffmiller Date: Mon, 26 Sep 2022 10:56:23 -0400 Subject: [PATCH 02/10] #8979 add installation instructions, etc. --- doc/release-notes/5.12-release-notes.md | 97 +++++++++++++++++++++++++ 1 file changed, 97 insertions(+) diff --git a/doc/release-notes/5.12-release-notes.md b/doc/release-notes/5.12-release-notes.md index f2d7b474ed6..6ac1c81efd9 100644 --- a/doc/release-notes/5.12-release-notes.md +++ b/doc/release-notes/5.12-release-notes.md @@ -18,6 +18,7 @@ As reported at the 2022 Dataverse Community Meeting, the [Harvard Data Commons]( - Support for batch archiving via API as an alternative to the current options of configuring archiving upon publication or archiving each dataset version manually. 
- Initial support for sending and receiving Linked Data Notification messages indicating relationships between a dataset and external resources (e.g. papers or other datasets) that can be used to trigger additional actions, such as the creation of a back-link to provide, for example, bi-directional linking between a published paper and a Dataverse dataset. - A new capability to provide custom per-field instructions in dataset templates. +- Additional workflow file extensions are now detected. ### Update to Payara 5.2022.3 highly recommended @@ -71,4 +72,100 @@ Changes and fixes in this release include: - Fixes the Archival Status Input API (available to superusers) (Issue #8924) - Small bugs have been fixed in the dataset export in the JSON and DDI formats; eliminating the export of "undefined" as a metadata language in the former, and a duplicate keyword tag in the latter. (Issue #8868) +## New DB Settings +The following DB settings have been added: +- `:ShibAffiliationOrder` - Select the first or last entry in an Affiliation array +- `:ShibAffiliationSeparator` (default: ";") - Set the separator for the Affiliation array +## Installation + +If this is a new installation, please see our [Installation Guide](https://guides.dataverse.org/en/5.12/installation/). Please also contact us to get added to the [Dataverse Project Map](https://guides.dataverse.org/en/5.10/installation/config.html#putting-your-dataverse-installation-on-the-map-at-dataverse-org) if you have not done so already. + +## Upgrade Instructions + +0\. These instructions assume that you've already successfully upgraded from Dataverse Software 4.x to Dataverse Software 5 following the instructions in the [Dataverse Software 5 Release Notes](https://github.com/IQSS/dataverse/releases/tag/v5.0). After upgrading from the 4.x series to 5.0, you should progress through the other 5.x releases before attempting the upgrade to 5.12.
+ +If you are running Payara as a non-root user (and you should be!), **remember not to execute the commands below as root**. Use `sudo` to change to that user first. For example, `sudo -i -u dataverse` if `dataverse` is your dedicated application user. + +In the following commands we assume that Payara 5 is installed in `/usr/local/payara5`. If not, adjust as needed. + + +# Update to Payara 5.2022.3 highly recommended + +With lots of bug and security fixes included, we encourage everyone to update to Payara 5.2022.3 as soon as possible. + +**Note:** with the approaching EOL for the Payara 5 Community release train it's likely we will switch to a +yet-to-be-released Payara 6 in the not-so-far-away future. + +We recommend you ensure you followed all update instructions from the past releases regarding Payara. +(latest Payara update was for [v5.6](https://github.com/IQSS/dataverse/releases/tag/v5.6)) + +Upgrading requires a maintenance window and downtime. Please plan ahead, create backups of your database, etc. + +The steps below are a simple matter of reusing your existing domain directory with the new distribution. +But we also recommend that you review the Payara upgrade instructions as it could be helpful during any troubleshooting: +[Payara Release Notes](https://docs.payara.fish/community/docs/Release%20Notes/Release%20Notes%205.2022.3.html) + + +Please note that the deletion of the `lib/databases` directory below is only required once, for this upgrade (see Issue #8230 for details). + +```shell +export PAYARA=/usr/local/payara5 +``` + +(or `setenv PAYARA /usr/local/payara5` if you are using a `csh`-like shell) + +1\. Undeploy the previous version + +```shell + $PAYARA/bin/asadmin list-applications + $PAYARA/bin/asadmin undeploy dataverse<-version> +``` + +2\. 
Stop Payara + +```shell + service payara stop + rm -rf $PAYARA/glassfish/domains/domain1/generated + rm -rf $PAYARA/glassfish/domains/domain1/osgi-cache + rm -rf $PAYARA/glassfish/domains/domain1/lib/databases +``` + +3\. Move the current Payara directory out of the way + +```shell + mv $PAYARA $PAYARA.MOVED +``` + +4\. Download the new Payara version (5.2022.3), and unzip it in its place + +5\. Replace the brand new payara/glassfish/domains/domain1 with your old, preserved domain1 + +6\. Start Payara + +```shell + service payara start +``` + +7\. Deploy this version. + +```shell + $PAYARA/bin/asadmin deploy dataverse-5.12.war +``` + +8\. Restart payara + +```shell + service payara stop + service payara start +``` +### Additional Upgrade Steps + +Update the Citation metadata block: + +- `wget https://github.com/IQSS/dataverse/releases/download/v#.##/citation.tsv` +- `curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"` + +- Re-export metadata files (OAI_ORE is affected by the PRs in these release notes). Optionally, for those using the Dataverse software's BagIt-based archiving, re-archive dataset versions archived using prior versions of the Dataverse software. This will be recommended/required in a future release. 
+ +- Run ReExportall to update Exports \ No newline at end of file From 6a923ccd609b9010765bdbead608a6a02066376b Mon Sep 17 00:00:00 2001 From: Stephen Kraffmiller Date: Tue, 27 Sep 2022 09:54:26 -0400 Subject: [PATCH 03/10] #8979 addl Data Commons notes --- doc/release-notes/5.12-release-notes.md | 25 +++++++++++++++++++++++++ 1 file changed, 25 insertions(+) diff --git a/doc/release-notes/5.12-release-notes.md b/doc/release-notes/5.12-release-notes.md index 6ac1c81efd9..ce304dab263 100644 --- a/doc/release-notes/5.12-release-notes.md +++ b/doc/release-notes/5.12-release-notes.md @@ -76,6 +76,31 @@ Changes and fixes in this release include: The following DB settings have been added: - `:ShibAffiliationOrder` - Select the first or last entry in an Affiliation array - `:ShibAffiliationSeparator` (default: ";") - Set the separator for the Affiliation array +- `:LDNMessageHosts` +- `:GlobusBasicToken` +- `:GlobusEndpoint` +- `:GlobusStores` +- `:GlobusAppUrl` +- `:GlobusPollingInterval` +- `:GlobusSingleFileTransfer` +- `:S3ArchiverConfig` +- `:S3ArchiverProfile` +- `:DRSArchiverConfig` + +See the [Database Settings](https://guides.dataverse.org/en/5.12/installation/config.html#database-settings) section of the Guides for more information. + +## Notes for Developers and Integrators + +See the "Backward Incompatibilities" section below. + +## Backward Incompatibilities + +### OAI-ORE and Archiving Changes + +The Admin API call to manually sumbit a dataset version for archiving has changed to require POST instead of GET and to have a name making it clearer that archiving is being done for a given dataset version: /api/admin/submitDatasetVersionToArchive. + +Earlier versions of the archival bags included the ingested (tab-separated-value) version of tabular files while providing the checksum of the original file (Issue #8449). This release fixes that by including the original file and its metadata in the archival bag. 
This means that archival bags created prior to this version do not include a way to validate ingested files. Further, it is likely that capabilities in development (i.e. as part of the [Dataverse Uploader](https://github.com/GlobalDataverseCommunityConsortium/dataverse-uploader)) to allow re-creation of a dataset version from an archival bag will only be fully compatible with archival bags generated by a Dataverse instance at a release > v5.12. (Specifically, at a minimum, since only the ingested file is included in earlier archival bags, an upload via DVUploader would not result in the same original file/ingested version as in the original dataset.) Administrators should be aware that re-creating archival bags, i.e. via the new batch archiving API, may be advisable now and will be recommended at some point in the future (i.e. there will be a point where we will start versioning archival bags and will start maintaining backward compatibility for older versions as part of transitioning this from being an experimental capability). + ## Installation From d693bd0c5c42f771eca407b5988831eed3855d66 Mon Sep 17 00:00:00 2001 From: Stephen Kraffmiller Date: Tue, 27 Sep 2022 10:01:17 -0400 Subject: [PATCH 04/10] #8979 fix typo --- doc/release-notes/5.12-release-notes.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/release-notes/5.12-release-notes.md b/doc/release-notes/5.12-release-notes.md index ce304dab263..5b0d145bc2a 100644 --- a/doc/release-notes/5.12-release-notes.md +++ b/doc/release-notes/5.12-release-notes.md @@ -97,7 +97,7 @@ See the "Backward Incompatibilities" section below. ### OAI-ORE and Archiving Changes -The Admin API call to manually sumbit a dataset version for archiving has changed to require POST instead of GET and to have a name making it clearer that archiving is being done for a given dataset version: /api/admin/submitDatasetVersionToArchive.
+The Admin API call to manually submit a dataset version for archiving has changed to require POST instead of GET and to have a name making it clearer that archiving is being done for a given dataset version: /api/admin/submitDatasetVersionToArchive. Earlier versions of the archival bags included the ingested (tab-separated-value) version of tabular files while providing the checksum of the original file (Issue #8449). This release fixes that by including the original file and its metadata in the archival bag. This means that archival bags created prior to this version do not include a way to validate ingested files. Further, it is likely that capabilities in development (i.e. as part of the [Dataverse Uploader](https://github.com/GlobalDataverseCommunityConsortium/dataverse-uploader)) to allow re-creation of a dataset version from an archival bag will only be fully compatible with archival bags generated by a Dataverse instance at a release > v5.12. (Specifically, at a minimum, since only the ingested file is included in earlier archival bags, an upload via DVUploader would not result in the same original file/ingested version as in the original dataset.) Administrators should be aware that re-creating archival bags, i.e. via the new batch archiving API, may be advisable now and will be recommended at some point in the future (i.e. there will be a point where we will start versioning archival bags and will start maintaining backward compatibility for older versions as part of transitioning this from being an experimental capability).
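For illustration, a call to the renamed endpoint might look like the sketch below. The dataset id and version here are placeholders, and the exact path parameters and required authentication should be confirmed against the Admin API documentation.

```shell
# Sketch: submitting a single dataset version for archiving.
# DATASET_ID and VERSION are placeholder values, not real ones.
SERVER_URL=http://localhost:8080
DATASET_ID=42
VERSION=1.0

# Note the switch from GET to POST described above.
# Composed and printed here; run it against a live installation.
CMD="curl -X POST $SERVER_URL/api/admin/submitDatasetVersionToArchive/$DATASET_ID/$VERSION"
echo "$CMD"
```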
From 678e0d1af347fa10ad01e6f50e1967641ce1040b Mon Sep 17 00:00:00 2001 From: Stephen Kraffmiller Date: Tue, 27 Sep 2022 10:05:56 -0400 Subject: [PATCH 05/10] #8979 add experimental note --- doc/release-notes/5.12-release-notes.md | 6 ++++++ 1 file changed, 6 insertions(+) diff --git a/doc/release-notes/5.12-release-notes.md b/doc/release-notes/5.12-release-notes.md index 5b0d145bc2a..8a430817991 100644 --- a/doc/release-notes/5.12-release-notes.md +++ b/doc/release-notes/5.12-release-notes.md @@ -89,6 +89,12 @@ The following DB settings have been added: See the [Database Settings](https://guides.dataverse.org/en/5.12/installation/config.html#database-settings) section of the Guides for more information. +## Notes for Dataverse Installation Administrators + +### Enabling Experimental Capabilities + +Several of the capabilities introduced in v5.12 are "experimental" in the sense that further changes and enhancements to these capabilities should be expected and that these changes may involve additional work, for those who use the initial implementations, when upgrading to newer versions of the Dataverse software. Administrators wishing to use them are encouraged to stay in touch, e.g. via the Dataverse Community Slack space, to understand the limits of current capabilities and to plan for future upgrades. + ## Notes for Developers and Integrators See the "Backward Incompatibilities" section below. From 04adc496846d9d40162ff49fdd770c9f648f4139 Mon Sep 17 00:00:00 2001 From: Stephen Kraffmiller Date: Wed, 28 Sep 2022 10:44:55 -0400 Subject: [PATCH 06/10] #8979 formatting, etc. 
--- doc/release-notes/5.12-release-notes.md | 39 +++++++++++++++++++++++-- 1 file changed, 37 insertions(+), 2 deletions(-) diff --git a/doc/release-notes/5.12-release-notes.md b/doc/release-notes/5.12-release-notes.md index 8a430817991..df7b551ce83 100644 --- a/doc/release-notes/5.12-release-notes.md +++ b/doc/release-notes/5.12-release-notes.md @@ -18,7 +18,13 @@ As reported at the 2022 Dataverse Community Meeting, the [Harvard Data Commons]( - Support for batch archiving via API as an alternative to the current options of configuring archiving upon publication or archiving each dataset version manually. - Initial support for sending and receiving Linked Data Notification messages indicating relationships between a dataset and external resources (e.g. papers or other dataset) that can be used to trigger additional actions, such as the creation of a back-link to provide, for example, bi-directional linking between a published paper and a Dataverse dataset. - A new capability to provide custom per field instructions in dataset templates -- Additional workflow file extensions are now detected. +- The following file extensions are now detected: + wdl=text/x-workflow-description-language + cwl=text/x-computational-workflow-language + nf=text/x-nextflow + Rmd=text/x-r-notebook + rb=text/x-ruby-script + dag=text/x-dagman ### Update to Payara 5.2022.3 highly recommended @@ -35,6 +41,7 @@ The changes should provide clearer information to curators, depositors, and peop A new page in the Style Guides called "Text" has also been added. The new page includes a section called "Metadata Text Guidelines" with a link to a Google Doc where the guidelines are being maintained for now since we expect them to be revised frequently. ### Adding new static search facet: Metadata Types + A new static search facet has been added to the search side panel. This new facet is called "Metadata Types" and is driven from metadata blocks. 
When a metadata field value is inserted into a dataset, an entry for the metadata block it belongs to is added to this new facet. This new facet needs to be configured for it to appear on the search side panel. The configuration assigns to a dataverse what metadata blocks to show. The configuration is inherited by child dataverses. @@ -50,6 +57,32 @@ Currently this change is only relevant to developers but as settings are migrate Please note that an upgrade to Payara 5.2021.8 or higher is required to make use of this. Payara 5.2021.5 threw exceptions, as explained in PR #8823. +### HTTP Range Requests: New HTTP status codes and headers for Datafile Access API + +The Basic File Access resource for datafiles (/api/access/datafile/$id) was slightly modified in order to comply better with the HTTP specification for range requests. + +If the request contains a "Range" header: +* The returned HTTP status is now 206 (Partial Content) instead of 200 +* A "Content-Range" header is returned containing information about the returned bytes +* An "Accept-Ranges" header with value "bytes" is returned + +CORS rules/headers were modified accordingly: +* The "Range" header is added to "Access-Control-Allow-Headers" +* The "Content-Range" and "Accept-Ranges" header are added to "Access-Control-Expose-Headers" + +### File types detection + +File types are now detected based on the filename when the file has no extension. + +The following filenames are now detected: + +- Makefile=text/x-makefile +- Snakemake=text/x-snakemake +- Dockerfile=application/x-docker-file +- Vagrantfile=application/x-vagrant-file + +These are defined in `MimeTypeDetectionByFileName.properties`. 
+ ## Major Use Cases and Infrastructure Enhancements Changes and fixes in this release include: @@ -66,13 +99,14 @@ Changes and fixes in this release include: - Dataset update API users are given more information when the dataset they are updating is out of compliance with Terms of Access requirements (Issue #8859) - Adds a new setting (:ControlledVocabularyCustomJavaScript) that allows a JavaScript file to be loaded into the dataset page for the purpose of showing controlled vocabulary as a list (Issue #8722) - Fixes an issue with the Redetect File Type API (Issue #7527) -- Updates the Import Dataset DDI API to allow the import of Terms of Use (Issue #8715) +- Terms of Use is now imported when using DDI format through harvesting or the native API. (Issue #8715, PR #8743) - Optimizes some code to improve application memory usage (Issue #8871) - Fixes sample data to reflect custom licenses. - Fixes the Archival Status Input API (available to superusers) (Issue #8924) - Small bugs have been fixed in the dataset export in the JSON and DDI formats; eliminating the export of "undefined" as a metadata language in the former, and a duplicate keyword tag in the latter. (Issue #8868) ## New DB Settings + The following DB settings have been added: - `:ShibAffiliationOrder` - Select the first or last entry in an Affiliation array - `:ShibAffiliationSeparator` (default: ";") - Set the separator for the Affiliation array @@ -86,6 +120,7 @@ The following DB settings have been added: - `:S3ArchiverConfig` - `:S3ArchiverProfile` - `:DRSArchiverConfig` +- `:ControlledVocabularyCustomJavaScript` See the [Database Settings](https://guides.dataverse.org/en/5.12/installation/config.html#database-settings) section of the Guides for more information. 
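As with other DB settings, these can be set via the admin Settings API. A sketch for the two Shibboleth affiliation settings follows; the values shown are illustrative, so check the Database Settings documentation for the accepted values.

```shell
# Sketch: setting the new Shibboleth affiliation options via the
# admin Settings API. The values used here are illustrative only.
SERVER_URL=http://localhost:8080

CMD_ORDER="curl -X PUT -d lastAffiliation $SERVER_URL/api/admin/settings/:ShibAffiliationOrder"
CMD_SEP="curl -X PUT -d ';' $SERVER_URL/api/admin/settings/:ShibAffiliationSeparator"

# Composed and printed here; run them against a live installation.
echo "$CMD_ORDER"
echo "$CMD_SEP"
```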
From f979e004e81ba7086898f36eea402013542619ca Mon Sep 17 00:00:00 2001 From: landreev Date: Wed, 28 Sep 2022 16:56:38 -0400 Subject: [PATCH 07/10] Update 5.12-release-notes.md adding the 5.12 version to the tsv download url --- doc/release-notes/5.12-release-notes.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/doc/release-notes/5.12-release-notes.md b/doc/release-notes/5.12-release-notes.md index df7b551ce83..100755d3588 100644 --- a/doc/release-notes/5.12-release-notes.md +++ b/doc/release-notes/5.12-release-notes.md @@ -229,9 +229,9 @@ export PAYARA=/usr/local/payara5 Update the Citation metadata block: -- `wget https://github.com/IQSS/dataverse/releases/download/v#.##/citation.tsv` +- `wget https://github.com/IQSS/dataverse/releases/download/v5.12/citation.tsv` - `curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"` - Re-export metadata files (OAI_ORE is affected by the PRs in these release notes). Optionally, for those using the Dataverse software's BagIt-based archiving, re-archive dataset versions archived using prior versions of the Dataverse software. This will be recommended/required in a future release. 
-- Run ReExportall to update Exports \ No newline at end of file +- Run ReExportall to update Exports From 83171d00ca2700f5718d31462bc00c0ddf5df3f3 Mon Sep 17 00:00:00 2001 From: Stephen Kraffmiller Date: Wed, 28 Sep 2022 17:00:29 -0400 Subject: [PATCH 08/10] #8979 Remove individual issue notes --- doc/release-notes/7000-mpconfig-support.md | 8 -- .../8127-citation-field-improvements.md | 16 ---- .../8535-metadata-types-static-facet.md | 6 -- .../8611-DataCommons-related-notes.md | 81 ------------------- .../8639-computational-workflow.md | 8 -- doc/release-notes/8715-importddi-termofuse.md | 1 - doc/release-notes/8722-custom-script.md | 3 - .../8727-better-http-range-request-support.md | 12 --- ...8740-file-recognition-based-on-filename.md | 12 --- ...59-add-computational-worflow-file-types.md | 10 --- doc/release-notes/8868-fix-json-import.md | 7 -- doc/release-notes/8882-shib-affiliation.md | 4 - doc/release-notes/8947-payara-update.md | 76 ----------------- 13 files changed, 244 deletions(-) delete mode 100644 doc/release-notes/7000-mpconfig-support.md delete mode 100644 doc/release-notes/8127-citation-field-improvements.md delete mode 100644 doc/release-notes/8535-metadata-types-static-facet.md delete mode 100644 doc/release-notes/8611-DataCommons-related-notes.md delete mode 100644 doc/release-notes/8639-computational-workflow.md delete mode 100644 doc/release-notes/8715-importddi-termofuse.md delete mode 100644 doc/release-notes/8722-custom-script.md delete mode 100644 doc/release-notes/8727-better-http-range-request-support.md delete mode 100644 doc/release-notes/8740-file-recognition-based-on-filename.md delete mode 100644 doc/release-notes/8759-add-computational-worflow-file-types.md delete mode 100644 doc/release-notes/8868-fix-json-import.md delete mode 100644 doc/release-notes/8882-shib-affiliation.md delete mode 100644 doc/release-notes/8947-payara-update.md diff --git a/doc/release-notes/7000-mpconfig-support.md 
b/doc/release-notes/7000-mpconfig-support.md deleted file mode 100644 index 01f21caf37a..00000000000 --- a/doc/release-notes/7000-mpconfig-support.md +++ /dev/null @@ -1,8 +0,0 @@ -# Broader MicroProfile Config Support for Developers - -As of this release, many [JVM options](https://guides.dataverse.org/en/latest/installation/config.html#jvm-options) -can be set using any [MicroProfile Config Source](https://docs.payara.fish/community/docs/Technical%20Documentation/MicroProfile/Config/Overview.html#config-sources). - -Currently this change is only relevant to developers but as settings are migrated to the new "lookup" pattern documented in the [Consuming Configuration](https://guides.dataverse.org/en/latest/developers/configuration.html) section of the Developer Guide, anyone installing the Dataverse software will have much greater flexibility when configuring those settings, especially within containers. These changes will be announced in future releases. - -Please note that an upgrade to Payara 5.2021.8 or higher is required to make use of this. Payara 5.2021.5 threw exceptions, as explained in PR #8823. diff --git a/doc/release-notes/8127-citation-field-improvements.md b/doc/release-notes/8127-citation-field-improvements.md deleted file mode 100644 index c589145d01b..00000000000 --- a/doc/release-notes/8127-citation-field-improvements.md +++ /dev/null @@ -1,16 +0,0 @@ -### Improvements to fields that appear in the Citation metadata block - -Grammar, style and consistency improvements have been made to the titles, tooltip description text, and watermarks of metadata fields that appear in the Citation metadata block. - -This includes fields that dataset depositors can edit in the Citation Metadata accordion (i.e. 
fields controlled by the citation.tsv and citation.properties files) and fields whose values are system-generated, such as the Dataset Persistent ID, Previous Dataset Persistent ID, and Publication Date fields whose titles and tooltips are configured in the bundles.properties file. - -The changes should provide clearer information to curators, depositors, and people looking for data about what the fields are for. - -A new page in the Style Guides called "Text" has also been added. The new page includes a section called "Metadata Text Guidelines" with a link to a Google Doc where the guidelines are being maintained for now since we expect them to be revised frequently. - -### Additional Upgrade Steps - -Update the Citation metadata block: - -- `wget https://github.com/IQSS/dataverse/releases/download/v#.##/citation.tsv` -- `curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"` \ No newline at end of file diff --git a/doc/release-notes/8535-metadata-types-static-facet.md b/doc/release-notes/8535-metadata-types-static-facet.md deleted file mode 100644 index 023000c0977..00000000000 --- a/doc/release-notes/8535-metadata-types-static-facet.md +++ /dev/null @@ -1,6 +0,0 @@ -## Adding new static search facet: Metadata Types -A new static search facet has been added to the search side panel. This new facet is called "Metadata Types" and is driven from metadata blocks. When a metadata field value is inserted into a dataset, an entry for the metadata block it belongs to is added to this new facet. - -This new facet needs to be configured for it to appear on the search side panel. The configuration assigns to a dataverse what metadata blocks to show. The configuration is inherited by child dataverses. 
- -To configure the new facet, use the Metadata Block Facet API: \ No newline at end of file diff --git a/doc/release-notes/8611-DataCommons-related-notes.md b/doc/release-notes/8611-DataCommons-related-notes.md deleted file mode 100644 index af222db5b9f..00000000000 --- a/doc/release-notes/8611-DataCommons-related-notes.md +++ /dev/null @@ -1,81 +0,0 @@ -# Dataverse Software 5.12 - -This release brings new features, enhancements, and bug fixes to the Dataverse Software. Thank you to all of the community members who contributed code, suggestions, bug reports, and other assistance across the project. - -## Release Highlights - -### Harvard Data Commons Additions - -As reported at the 2022 Dataverse Community Meeting, the [Harvard Data Commons](https://sites.harvard.edu/harvard-data-commons/) project has supported a wide range of additions to the Dataverse software that improve support for Big Data, Workflows, Archiving, and interaction with other repositories. In many cases, these additions build upon features developed within the Dataverse community by Borealis, DANS, QDR, TDL, and others. Highlights from this work include: - -- Initial support for Globus file transfer to upload to and download from a Dataverse managed S3 store. The current implementation disables file restriction and embargo on Globus-enabled stores. -- Initial support for Remote File Storage. This capability, enabled via a new RemoteOverlay store type, allows a file stored in a remote system to be added to a dataset (currently only via API) with download requests redirected to the remote system. Use cases include referencing public files hosted on external web servers as well as support for controlled access managed by Dataverse (e.g. via restricted and embargoed status) and/or by the remote store. -- Initial support for computational workflows, including a new metadata block and detected filetypes. 
-- Support for archiving to any S3 store using Dataverse's RDA-conformant BagIT file format (a BagPack). -- Improved error handling and performance in archival bag creation and new options such as only supporting archiving of one dataset version. -- Additions/corrections to the OAI-ORE metadata format (which is included in archival bags) such as referencing the name/mimetype/size/checksum/download URL of the original file for ingested files, the inclusion of metadata about the parent collection(s) of an archived dataset version, and use of the URL form of PIDs. -- Display of archival status within the dataset page versions table, richer status options including success, pending, and failure states, with a complete API for managing archival status. -- Support for batch archiving via API as an alternative to the current options of configuring archiving upon publication or archiving each dataset version manually. -- Initial support for sending and receiving Linked Data Notification messages indicating relationships between a dataset and external resources (e.g. papers or other dataset) that can be used to trigger additional actions, such as the creation of a back-link to provide, for example, bi-directional linking between a published paper and a Dataverse dataset. -- A new capability to provide custom per field instructions in dataset templates - -## Major Use Cases and Infrastructure Enhancements - -Changes and fixes in this release include: - -- Administrators can configure an S3 store used in Dataverse to support users uploading/downloading files via Globus File Transfer. (PR #8891) -- Administrators can configure a RemoteOverlay store to allow files that remain hosted by a remote system to be added to a dataset. (PR #7325) -- Administrators can configure the Dataverse software to send archival Bag copies of published dataset versions to any S3-compatible service. 
(PR #8751) -- Users can see information about a dataset's parent collection(s) in the OAI-ORE metadata export. (PR #8770) -- Users and administrators can now use the OAI-ORE metadata export to retrieve and assess the fixity of the original file (for ingested tabular files) via the included checksum. (PR #8901) -- Archiving via RDA-conformant Bags is more robust and is more configurable. (PR #8773, #8747, #8699, #8609, #8606, #8610) -- Users and administrators can see the archival status of the versions of the datasets they manage in the dataset page version table. (PR #8748, #8696) -- Administrators can configure messaging between their Dataverse installation and other repositories that may hold related resources or services interested in activity within that installation. (PR #8775) -- Collection managers can create templates that include custom instructions on how to fill out specific metadata fields. - -## Notes for Dataverse Installation Administrators - -### Enabling Experimental Capabilities - -Several of the capabilities introduced in v5.12 are "experimental" in the sense that further changes and enhancements to these capabilities should be expected and that these changes may involve additional work, for those who use the initial implementations, when upgrading to newer versions of the Dataverse software. Administrators wishing to use them are encouraged to stay in touch, e.g. via the Dataverse Community Slack space, to understand the limits of current capabilities and to plan for future upgrades. 
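One of the experimental capabilities above, archiving, is driven through the Admin API. A minimal sketch follows (the endpoint name /api/admin/submitDatasetVersionToArchive comes from the Backward Incompatibilities section of these notes; the id/version path-segment form and the placeholder values are assumptions for illustration, not confirmed by these notes):

```shell
# Sketch only: manually submit a single dataset version for archiving.
# As of 5.12 this Admin API call must be a POST (earlier releases used GET).
# DATASET_ID and VERSION are hypothetical placeholder values, and the
# id/version path-segment form is an assumption, not confirmed by these notes.
SERVER_URL="http://localhost:8080"
DATASET_ID=42
VERSION="1.0"

curl -X POST "$SERVER_URL/api/admin/submitDatasetVersionToArchive/$DATASET_ID/$VERSION"
```

Note that, depending on your installation's configuration, admin API endpoints may be blocked except from localhost.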
- -## New JVM Options and DB Settings - -The following DB settings have been added: - -- `:LDNMessageHosts` -- `:GlobusBasicToken` -- `:GlobusEndpoint` -- `:GlobusStores` -- `:GlobusAppUrl` -- `:GlobusPollingInterval` -- `:GlobusSingleFileTransfer` -- `:S3ArchiverConfig` -- `:S3ArchiverProfile` -- `:DRSArchiverConfig` - -See the [Database Settings](https://guides.dataverse.org/en/5.12/installation/config.html#database-settings) section of the Guides for more information. - -## Notes for Developers and Integrators - -See the "Backward Incompatibilities" section below. - -## Backward Incompatibilities - -### OAI-ORE and Archiving Changes - -The Admin API call to manually submit a dataset version for archiving has changed to require POST instead of GET and to have a name making it clearer that archiving is being done for a given dataset version: /api/admin/submitDatasetVersionToArchive. - -Earlier versions of the archival bags included the ingested (tab-separated-value) version of tabular files while providing the checksum of the original file (Issue #8449). This release fixes that by including the original file and its metadata in the archival bag. This means that archival bags created prior to this version do not include a way to validate ingested files. Further, it is likely that capabilities in development (i.e. as part of the [Dataverse Uploader](https://github.com/GlobalDataverseCommunityConsortium/dataverse-uploader)) to allow re-creation of a dataset version from an archival bag will only be fully compatible with archival bags generated by a Dataverse instance at a release > v5.12. (Specifically, at a minimum, since only the ingested file is included in earlier archival bags, an upload via DVUploader would not result in the same original file/ingested version as in the original dataset.) Administrators should be aware that re-creating archival bags, i.e.
via the new batch archiving API, may be advisable now and will be recommended at some point in the future (i.e. there will be a point where we will start versioning archival bags and will start maintaining backward compatibility for older versions as part of transitioning this from being an experimental capability). - -## Complete List of Changes - -## Installation - -If this is a new installation, please see our [Installation Guide](https://guides.dataverse.org/en/5.12/installation/). Please also contact us to get added to the [Dataverse Project Map](https://guides.dataverse.org/en/5.12/installation/config.html#putting-your-dataverse-installation-on-the-map-at-dataverse-org) if you have not done so already. - -## Upgrade Instructions - -8\. Re-export metadata files (OAI_ORE is affected by the PRs in these release notes). Optionally, for those using the Dataverse software's BagIt-based archiving, re-archive dataset versions archived using prior versions of the Dataverse software. This will be recommended/required in a future release. - -9\. Standard instructions for reinstalling the citation metadatablock. There are no new fields so Solr changes/reindex aren't needed. This PR just adds an option to the list of publicationIdTypes diff --git a/doc/release-notes/8639-computational-workflow.md b/doc/release-notes/8639-computational-workflow.md deleted file mode 100644 index efd5b26e538..00000000000 --- a/doc/release-notes/8639-computational-workflow.md +++ /dev/null @@ -1,8 +0,0 @@ -NOTE: These "workflow" changes should be folded into "Harvard Data Commons Additions" in 8611-DataCommons-related-notes.md - -## Adding Computational Workflow Metadata -The new Computational Workflow metadata block will allow depositors to effectively tag datasets as computational workflows. 
- -To add the new metadata block, follow the instructions in the user guide: - -The location of the new metadata block tsv file is: `dataverse/scripts/api/data/metadatablocks/computational_workflow.tsv` diff --git a/doc/release-notes/8715-importddi-termofuse.md b/doc/release-notes/8715-importddi-termofuse.md deleted file mode 100644 index 3c6479b8bf9..00000000000 --- a/doc/release-notes/8715-importddi-termofuse.md +++ /dev/null @@ -1 +0,0 @@ -Terms of Use is now imported when using DDI format through harvesting or the native API. (Issue #8715, PR #8743) diff --git a/doc/release-notes/8722-custom-script.md b/doc/release-notes/8722-custom-script.md deleted file mode 100644 index e38987fedca..00000000000 --- a/doc/release-notes/8722-custom-script.md +++ /dev/null @@ -1,3 +0,0 @@ -## New DB Settings - -- :ControlledVocabularyCustomJavaScript \ No newline at end of file diff --git a/doc/release-notes/8727-better-http-range-request-support.md b/doc/release-notes/8727-better-http-range-request-support.md deleted file mode 100644 index dc01ac57dfc..00000000000 --- a/doc/release-notes/8727-better-http-range-request-support.md +++ /dev/null @@ -1,12 +0,0 @@ -### HTTP Range Requests: New HTTP status codes and headers for Datafile Access API - -The Basic File Access resource for datafiles (/api/access/datafile/$id) was slightly modified in order to comply better with the HTTP specification for range requests. 
- -If the request contains a "Range" header: -* The returned HTTP status is now 206 (Partial Content) instead of 200 -* A "Content-Range" header is returned containing information about the returned bytes -* An "Accept-Ranges" header with value "bytes" is returned - -CORS rules/headers were modified accordingly: -* The "Range" header is added to "Access-Control-Allow-Headers" -* The "Content-Range" and "Accept-Ranges" header are added to "Access-Control-Expose-Headers" diff --git a/doc/release-notes/8740-file-recognition-based-on-filename.md b/doc/release-notes/8740-file-recognition-based-on-filename.md deleted file mode 100644 index 39e3bdc8795..00000000000 --- a/doc/release-notes/8740-file-recognition-based-on-filename.md +++ /dev/null @@ -1,12 +0,0 @@ -### File types detection - -File types are now detected based on the filename when the file has no extension. - -The following filenames are now detected: - -- Makefile=text/x-makefile -- Snakemake=text/x-snakemake -- Dockerfile=application/x-docker-file -- Vagrantfile=application/x-vagrant-file - -These are defined in `MimeTypeDetectionByFileName.properties`. 
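Collected into one place, the filename-based mappings listed above would appear in `MimeTypeDetectionByFileName.properties` roughly as follows (a sketch of the entries only, not a verbatim copy of the shipped file):

```properties
# Filename-to-MIME-type mappings used when a file has no extension
Makefile=text/x-makefile
Snakemake=text/x-snakemake
Dockerfile=application/x-docker-file
Vagrantfile=application/x-vagrant-file
```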
diff --git a/doc/release-notes/8759-add-computational-worflow-file-types.md b/doc/release-notes/8759-add-computational-worflow-file-types.md deleted file mode 100644 index d2db860fe5f..00000000000 --- a/doc/release-notes/8759-add-computational-worflow-file-types.md +++ /dev/null @@ -1,10 +0,0 @@ -NOTE: These "workflow" changes should be folded into "Harvard Data Commons Additions" in 8611-DataCommons-related-notes.md - -The following file extensions are now detected: - -wdl=text/x-workflow-description-language -cwl=text/x-computational-workflow-language -nf=text/x-nextflow -Rmd=text/x-r-notebook -rb=text/x-ruby-script -dag=text/x-dagman diff --git a/doc/release-notes/8868-fix-json-import.md b/doc/release-notes/8868-fix-json-import.md deleted file mode 100644 index de0366e395e..00000000000 --- a/doc/release-notes/8868-fix-json-import.md +++ /dev/null @@ -1,7 +0,0 @@ -Under "bug fixes": - -Small bugs have been fixed in the dataset export in the JSON and DDI formats; eliminating the export of "undefined" as a metadata language in the former, and a duplicate keyword tag in the latter. 
- -Run ReExportall to update Exports - -Following the directions in the [Admin Guide](http://guides.dataverse.org/en/5.12/admin/metadataexport.html#batch-exports-through-the-api) diff --git a/doc/release-notes/8882-shib-affiliation.md b/doc/release-notes/8882-shib-affiliation.md deleted file mode 100644 index 97d27aa22cc..00000000000 --- a/doc/release-notes/8882-shib-affiliation.md +++ /dev/null @@ -1,4 +0,0 @@ -## New DB Settings -The following DB settings have been added: -- `:ShibAffiliationOrder` - Select the first or last entry in an Affiliation array -- `:ShibAffiliationSeparator` (default: ";") - Set the separator for the Affiliation array diff --git a/doc/release-notes/8947-payara-update.md b/doc/release-notes/8947-payara-update.md deleted file mode 100644 index 28dc70bbeda..00000000000 --- a/doc/release-notes/8947-payara-update.md +++ /dev/null @@ -1,76 +0,0 @@ -# Update to Payara 5.2022.3 highly recommended - -*NOTE: this might be rephrased to "required" depending on https://github.com/IQSS/dataverse/pull/8915 being merged.* - -With lots of bug and security fixes included, we encourage everyone to update to Payara 5.2022.3 as soon as possible. - -**Note:** with the approaching EOL for the Payara 5 Community release train it's likely we will switch to a -yet-to-be-released Payara 6 in the not-so-far-away future. - -We recommend you ensure you followed all update instructions from the past releases regarding Payara. -(latest Payara update was for [v5.6](https://github.com/IQSS/dataverse/releases/tag/v5.6)) - -Upgrading requires a maintenance window and downtime. Please plan ahead, create backups of your database, etc. - -The steps below are a simple matter of reusing your existing domain directory with the new distribution. 
-But we also recommend that you review the Payara upgrade instructions as it could be helpful during any troubleshooting: -[Payara Release Notes](https://docs.payara.fish/community/docs/Release%20Notes/Release%20Notes%205.2022.3.html) - -If you are running Payara as a non-root user (and you should be!), remember not to execute the commands below as root. -Use `sudo` to change to that user first. For example, `sudo -i -u dataverse` if `dataverse` is your dedicated -application user. - -In the following commands we assume that Payara 5 is installed in `/usr/local/payara5`. If not, adjust as needed. - -Please note that the deletion of the `lib/databases` directory below is only required once, for this upgrade (see Issue #8230 for details). - -```shell -export PAYARA=/usr/local/payara5 -``` - -(or `setenv PAYARA /usr/local/payara5` if you are using a `csh`-like shell) - -1. Undeploy the previous version - -```shell - $PAYARA/bin/asadmin list-applications - $PAYARA/bin/asadmin undeploy dataverse<-version> -``` - -2. Stop Payara - -```shell - service payara stop - rm -rf $PAYARA/glassfish/domains/domain1/generated - rm -rf $PAYARA/glassfish/domains/domain1/osgi-cache - rm -rf $PAYARA/glassfish/domains/domain1/lib/databases -``` - -3. Move the current Payara directory out of the way - -```shell - mv $PAYARA $PAYARA.MOVED -``` - -4. Download the new Payara version (5.2022.3), and unzip it in its place - -5. Replace the brand new payara/glassfish/domains/domain1 with your old, preserved domain1 - -6. Start Payara - -```shell - service payara start -``` - -7. Deploy this version. - -```shell - $PAYARA/bin/asadmin deploy dataverse-5.12.war -``` - -8. 
Restart payara - -```shell - service payara stop - service payara start -``` From caa43e4ff515c9597a0ef3df0a00e18f7db31dc6 Mon Sep 17 00:00:00 2001 From: landreev Date: Wed, 28 Sep 2022 17:22:52 -0400 Subject: [PATCH 09/10] Update 5.12-release-notes.md Small change to the metadata reexport instructions at the end of the release note. --- doc/release-notes/5.12-release-notes.md | 4 +--- 1 file changed, 1 insertion(+), 3 deletions(-) diff --git a/doc/release-notes/5.12-release-notes.md b/doc/release-notes/5.12-release-notes.md index 100755d3588..4f69ca85133 100644 --- a/doc/release-notes/5.12-release-notes.md +++ b/doc/release-notes/5.12-release-notes.md @@ -232,6 +232,4 @@ Update the Citation metadata block: - `wget https://github.com/IQSS/dataverse/releases/download/v5.12/citation.tsv` - `curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"` -- Re-export metadata files (OAI_ORE is affected by the PRs in these release notes). Optionally, for those using the Dataverse software's BagIt-based archiving, re-archive dataset versions archived using prior versions of the Dataverse software. This will be recommended/required in a future release. - -- Run ReExportall to update Exports +- Run ReExportAll to update metadata files (OAI_ORE, JSON and DDI formats are affected by the changes and bug fixes in this release; PRs #8770 and #8868). Optionally, for those using the Dataverse software's BagIt-based archiving, re-archive dataset versions archived using prior versions of the Dataverse software. This will be recommended/required in a future release. From da4d835c9701aa5a325d7f0717be735c6b5ed3d3 Mon Sep 17 00:00:00 2001 From: Philip Durbin Date: Thu, 29 Sep 2022 10:43:19 -0400 Subject: [PATCH 10/10] various fixes and changes #8979 - highlight Globus and other big features at top - added missing snippet: New Computational Workflow Metadata Block - fix typos, etc. 
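The "Run ReExportAll" step introduced in the patch above is normally invoked through the Admin API. A hedged sketch, assuming a local installation; confirm the exact endpoint against the batch-exports section of the Admin Guide linked in these notes:

```shell
# Sketch only: ask the installation to re-export metadata for all published
# datasets (OAI_ORE, JSON, and DDI exports are affected by this release).
# The endpoint name should be verified against the Admin Guide for your
# version; the re-export runs asynchronously on the server.
curl "http://localhost:8080/api/admin/metadata/reExportAll"
```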
--- doc/release-notes/5.12-release-notes.md | 66 ++++++++++++++++--------- 1 file changed, 43 insertions(+), 23 deletions(-) diff --git a/doc/release-notes/5.12-release-notes.md b/doc/release-notes/5.12-release-notes.md index 4f69ca85133..7085f859046 100644 --- a/doc/release-notes/5.12-release-notes.md +++ b/doc/release-notes/5.12-release-notes.md @@ -4,6 +4,31 @@ This release brings new features, enhancements, and bug fixes to the Dataverse S ## Release Highlights +### Support for Globus + +[Globus][] can be used to transfer large files. Part of "Harvard Data Commons Additions" below. + +[Globus]: https://www.globus.org + +### Support for Remote File Storage + +Dataset files can be stored at remote URLs. Part of "Harvard Data Commons Additions" below. + +### New Computational Workflow Metadata Block + +The new Computational Workflow metadata block will allow depositors to effectively tag datasets as computational workflows. + +To add the new metadata block, follow the instructions in the Admin Guide: + +The location of the new metadata block tsv file is `scripts/api/data/metadatablocks/computational_workflow.tsv`. Part of "Harvard Data Commons Additions" below. + +### Support for Linked Data Notifications (LDN) + +[Linked Data Notifications][] (LDN) is a standard from the W3C. Part of "Harvard Data Commons Additions" below. + +[Linked Data Notifications]: https://www.w3.org/TR/ldn/ + + ### Harvard Data Commons Additions As reported at the 2022 Dataverse Community Meeting, the [Harvard Data Commons](https://sites.harvard.edu/harvard-data-commons/) project has supported a wide range of additions to the Dataverse software that improve support for Big Data, Workflows, Archiving, and interaction with other repositories. In many cases, these additions build upon features developed within the Dataverse community by Borealis, DANS, QDR, TDL, and others. 
Highlights from this work include: @@ -19,18 +44,14 @@ As reported at the 2022 Dataverse Community Meeting, the [Harvard Data Commons]( - Initial support for sending and receiving Linked Data Notification messages indicating relationships between a dataset and external resources (e.g. papers or other dataset) that can be used to trigger additional actions, such as the creation of a back-link to provide, for example, bi-directional linking between a published paper and a Dataverse dataset. - A new capability to provide custom per field instructions in dataset templates - The following file extensions are now detected: - wdl=text/x-workflow-description-language - cwl=text/x-computational-workflow-language - nf=text/x-nextflow - Rmd=text/x-r-notebook - rb=text/x-ruby-script - dag=text/x-dagman - -### Update to Payara 5.2022.3 highly recommended + - wdl=text/x-workflow-description-language + - cwl=text/x-computational-workflow-language + - nf=text/x-nextflow + - Rmd=text/x-r-notebook + - rb=text/x-ruby-script + - dag=text/x-dagman -With lots of bug and security fixes included, we encourage everyone to update to Payara 5.2022.3 as soon as possible. - -### Improvements to fields that appear in the Citation metadata block +### Improvements to Fields that Appear in the Citation Metadata Block Grammar, style and consistency improvements have been made to the titles, tooltip description text, and watermarks of metadata fields that appear in the Citation metadata block. @@ -40,24 +61,24 @@ The changes should provide clearer information to curators, depositors, and peop A new page in the Style Guides called "Text" has also been added. The new page includes a section called "Metadata Text Guidelines" with a link to a Google Doc where the guidelines are being maintained for now since we expect them to be revised frequently. 
-### Adding new static search facet: Metadata Types +### New Static Search Facet: Metadata Types A new static search facet has been added to the search side panel. This new facet is called "Metadata Types" and is driven from metadata blocks. When a metadata field value is inserted into a dataset, an entry for the metadata block it belongs to is added to this new facet. This new facet needs to be configured for it to appear on the search side panel. The configuration assigns to a dataverse what metadata blocks to show. The configuration is inherited by child dataverses. -To configure the new facet, use the Metadata Block Facet API: +To configure the new facet, use the Metadata Block Facet API: ### Broader MicroProfile Config Support for Developers -As of this release, many [JVM options](https://guides.dataverse.org/en/latest/installation/config.html#jvm-options) +As of this release, many [JVM options](https://guides.dataverse.org/en/5.12/installation/config.html#jvm-options) can be set using any [MicroProfile Config Source](https://docs.payara.fish/community/docs/Technical%20Documentation/MicroProfile/Config/Overview.html#config-sources). -Currently this change is only relevant to developers but as settings are migrated to the new "lookup" pattern documented in the [Consuming Configuration](https://guides.dataverse.org/en/latest/developers/configuration.html) section of the Developer Guide, anyone installing the Dataverse software will have much greater flexibility when configuring those settings, especially within containers. These changes will be announced in future releases. +Currently this change is only relevant to developers but as settings are migrated to the new "lookup" pattern documented in the [Consuming Configuration](https://guides.dataverse.org/en/5.12/developers/configuration.html) section of the Developer Guide, anyone installing the Dataverse software will have much greater flexibility when configuring those settings, especially within containers. 
These changes will be announced in future releases. Please note that an upgrade to Payara 5.2021.8 or higher is required to make use of this. Payara 5.2021.5 threw exceptions, as explained in PR #8823. -### HTTP Range Requests: New HTTP status codes and headers for Datafile Access API +### HTTP Range Requests: New HTTP Status Codes and Headers for Datafile Access API The Basic File Access resource for datafiles (/api/access/datafile/$id) was slightly modified in order to comply better with the HTTP specification for range requests. @@ -70,7 +91,7 @@ CORS rules/headers were modified accordingly: * The "Range" header is added to "Access-Control-Allow-Headers" * The "Content-Range" and "Accept-Ranges" header are added to "Access-Control-Expose-Headers" -### File types detection +### File Type Detection When File Has No Extension File types are now detected based on the filename when the file has no extension. @@ -83,6 +104,10 @@ The following filenames are now detected: These are defined in `MimeTypeDetectionByFileName.properties`. +### Upgrade to Payara 5.2022.3 Highly Recommended + +With lots of bug and security fixes included, we encourage everyone to upgrade to Payara 5.2022.3 as soon as possible. See below for details. + ## Major Use Cases and Infrastructure Enhancements Changes and fixes in this release include: @@ -142,7 +167,6 @@ The Admin API call to manually submit a dataset version for archiving has change Earlier versions of the archival bags included the ingested (tab-separated-value) version of tabular files while providing the checksum of the original file (Issue #8449). This release fixes that by including the original file and its metadata in the archival bag. This means that archival bags created prior to this version do not include a way to validate ingested files. Further, it is likely that capabilities in development (i.e. 
as part of the [Dataverse Uploader](https://github.com/GlobalDataverseCommunityConsortium/dataverse-uploader)) to allow re-creation of a dataset version from an archival bag will only be fully compatible with archival bags generated by a Dataverse instance at a release > v5.12. (Specifically, at a minimum, since only the ingested file is included in earlier archival bags, an upload via DVUploader would not result in the same original file/ingested version as in the original dataset.) Administrators should be aware that re-creating archival bags, i.e. via the new batch archiving API, may be advisable now and will be recommended at some point in the future (i.e. there will be a point where we will start versioning archival bags and will start maintaining backward compatibility for older versions as part of transitioning this from being an experimental capability). - ## Installation If this is a new installation, please see our [Installation Guide](https://guides.dataverse.org/en/5.12/installation/). Please also contact us to get added to the [Dataverse Project Map](https://guides.dataverse.org/en/5.12/installation/config.html#putting-your-dataverse-installation-on-the-map-at-dataverse-org) if you have not done so already. @@ -155,10 +179,7 @@ If you are running Payara as a non-root user (and you should be!), **remember no In the following commands we assume that Payara 5 is installed in `/usr/local/payara5`. If not, adjust as needed. - -# Update to Payara 5.2022.3 highly recommended - -With lots of bug and security fixes included, we encourage everyone to update to Payara 5.2022.3 as soon as possible. +### Instructions for Upgrading to Payara 5.2022.3 **Note:** with the approaching EOL for the Payara 5 Community release train it's likely we will switch to a yet-to-be-released Payara 6 in the not-so-far-away future.
@@ -172,7 +193,6 @@ The steps below are a simple matter of reusing your existing domain directory wi But we also recommend that you review the Payara upgrade instructions as it could be helpful during any troubleshooting: [Payara Release Notes](https://docs.payara.fish/community/docs/Release%20Notes/Release%20Notes%205.2022.3.html) - Please note that the deletion of the `lib/databases` directory below is only required once, for this upgrade (see Issue #8230 for details). ```shell