update #27

Merged: 118 commits, Jul 15, 2019

Commits

3d4caf0
update
lubitchv Mar 26, 2019
2c0e9f6
test
lubitchv Mar 26, 2019
6dd7431
Merge pull request #35 from lubitchv/4448-api-edit-var-meta
lubitchv Mar 26, 2019
972b2e9
dct_explore
lubitchv Mar 26, 2019
56b5def
Multiple files fix
lubitchv Apr 3, 2019
f3a0e50
multiple files
lubitchv Apr 10, 2019
dadbcbe
remove comments
lubitchv Apr 10, 2019
eee0be2
Merge branch '4448-api-edit-var-meta' into 4448-api-edit-var-meta
lubitchv Apr 10, 2019
530454c
Merge pull request #36 from lubitchv/4448-api-edit-var-meta
lubitchv Apr 10, 2019
38d496f
Testing mail
lubitchv Apr 10, 2019
af5f5fd
try catch for sending email API Token
lubitchv Apr 10, 2019
205ab1d
Postqtxt
lubitchv Apr 12, 2019
e105bdb
Merge branch 'develop' into 4448-api-edit-var-meta
lubitchv Apr 12, 2019
916928a
add postq
lubitchv Apr 12, 2019
120d5b3
Merge branch '4448-api-edit-var-meta' of https://github.com/lubitchv/…
lubitchv Apr 12, 2019
a71ea3f
Merge pull request #11 from IQSS/develop
lubitchv Apr 12, 2019
b4b50e1
postQ
lubitchv Apr 12, 2019
d4725e0
remove dct
lubitchv Apr 12, 2019
0eadfa8
Post question
lubitchv Apr 15, 2019
d725fcc
Merge branch '4448-api-edit-var-meta' of https://github.com/lubitchv/…
lubitchv Apr 15, 2019
31b088d
PostQ
lubitchv Apr 18, 2019
48bef68
VM
lubitchv Apr 18, 2019
f00e415
update from lubitchv
lubitchv Apr 22, 2019
7ed4cd7
Merge branch 'lubitchv-4448-api-edit-var-meta' into 4448-api-edit-var…
lubitchv Apr 22, 2019
ce8e720
add dct
lubitchv Apr 22, 2019
2f4e2c9
remove PostQ sql
lubitchv Apr 22, 2019
eb31e93
add postQ sql again
lubitchv Apr 22, 2019
3092d94
outoforder
lubitchv Apr 23, 2019
7960982
update 4.13
lubitchv Apr 23, 2019
b403560
Merge pull request #40 from IQSS/develop
lubitchv Apr 23, 2019
0b25615
outOfOrder
lubitchv Apr 23, 2019
5a18d33
Merge branch '4448-api-edit-var-meta' of https://github.com/scholarsp…
lubitchv Apr 23, 2019
73aff97
4.12.02
lubitchv Apr 23, 2019
a018012
rename migration sql
lubitchv Apr 23, 2019
d6ba0af
upgrade 4.13
lubitchv Apr 23, 2019
20550f4
Merge branch 'IQSS-develop' into 4448-api-edit-var-meta
lubitchv Apr 23, 2019
2448780
Show changes variable metadata
lubitchv May 2, 2019
929b7bf
Merge branch '4448-api-edit-var-meta' into 4448-api-edit-var-meta
lubitchv May 2, 2019
85d3374
VariableMetadata to Variable Metadata in Bundle
lubitchv May 2, 2019
c3a4835
Merge branch '4448-api-edit-var-meta' of https://github.com/lubitchv/…
lubitchv May 2, 2019
00e44b7
merge
lubitchv May 3, 2019
9ca499e
Merge branch 'IQSS-develop' into 4448-api-edit-var-meta
lubitchv May 3, 2019
ed499ba
new dct angular 7
lubitchv May 3, 2019
532b69f
Post Q bug
lubitchv May 6, 2019
835be31
dct ui changes
lubitchv May 8, 2019
18a6a52
dct ui add desc header
lubitchv May 14, 2019
8f895ff
dct ui add filename and citation
lubitchv May 14, 2019
eca893f
dct ui Add grp edit
lubitchv May 15, 2019
a157843
dct ui add Grp focus
lubitchv May 15, 2019
721c338
merge
lubitchv May 21, 2019
2c0db82
Merge branch 'IQSS-develop' into 4448-api-edit-var-meta
lubitchv May 21, 2019
d7d8154
Clean up
lubitchv May 21, 2019
976af5f
Clean up
lubitchv May 21, 2019
b4c3d64
Update last version
lubitchv Jun 3, 2019
ed511c2
add dct
lubitchv Jun 4, 2019
98f6c76
clean up
lubitchv Jun 4, 2019
8e6a25b
dct ui frequency format
lubitchv Jun 5, 2019
940fa14
Local BagIt Archive Command Added
andreyodum Jun 7, 2019
678928d
UI dct Kaitlin
lubitchv Jun 11, 2019
ee3eafa
Fix integration test
lubitchv Jun 14, 2019
49bd552
update from develop
lubitchv Jun 14, 2019
99596dd
update
lubitchv Jun 14, 2019
302d779
Merge branch 'IQSS-develop' into 4448-api-edit-var-meta
lubitchv Jun 14, 2019
923e5f3
BagIt Docs update
andreyodum Jun 19, 2019
79ed5fb
Merge pull request #1 from andreyodum/5850_bagit_export2_local_fs
donsizemore Jun 19, 2019
e259a5d
sphinx doc var edit api
lubitchv Jun 25, 2019
62ae753
remove dct
lubitchv Jun 25, 2019
9d004b1
update headers
lubitchv Jun 25, 2019
0b4c5b5
update headers and out of order
lubitchv Jun 25, 2019
097f74a
Merge pull request #22 from IQSS/develop
lubitchv Jun 25, 2019
c93da2e
Updating + clarifying metadata fields when creating a dataverse [#5965]
dlmurphy Jun 25, 2019
aac4c9b
hederse and fly 4.12.02 del
lubitchv Jun 25, 2019
10f80be
fly 4.12.01__4.13
lubitchv Jun 25, 2019
9fddfc0
Filter tags and File Types appropriately when spaces are present
djbrooke Jun 26, 2019
55f22c9
Update documentation for Dataverse content and file size info retriev…
j-n-c Jun 26, 2019
e86d907
My Data page - fixing inaccuracies, explaining confusion [#5965]
dlmurphy Jun 26, 2019
ba83948
Adding documentation about editing variable metadata
djbrooke Jun 26, 2019
c9cca81
Added links to docs [#4448]
dlmurphy Jun 26, 2019
6be708c
Merge branch 'develop' into 5972-respecting-spaces
sekmiller Jun 27, 2019
458ac91
restore query (accidentally?) removed PR #5863 #5978
pdurbin Jun 27, 2019
c344b3b
#5981 check for empty value array
sekmiller Jun 27, 2019
7672620
Update file handling section [#5965]
dlmurphy Jun 27, 2019
a6e0b4a
Merge branch 'develop' into 4448-api-edit-var-meta-pdurbin-code-revie…
pdurbin Jun 28, 2019
212f080
add EditDDIIT to API test suite #4448
pdurbin Jun 28, 2019
3f19614
link to dct.xml example from docs #4448
pdurbin Jun 28, 2019
ed94780
pretty print XML with `xmlllint -format` #4448
pdurbin Jun 28, 2019
b1d8b8a
add Data Curation Tool to list of external tools #4448
pdurbin Jun 28, 2019
2619abd
Merge pull request #23 from IQSS/4448-api-edit-var-meta-pdurbin-code-…
lubitchv Jun 28, 2019
5a4ac8f
#5850 tidy up RST
donsizemore Jul 1, 2019
14c20f8
#5850 tidy up RST
donsizemore Jul 1, 2019
f7fe5e4
add sub headings to table of contents #5850
pdurbin Jul 1, 2019
67b8dbe
Add LimitNPROC in solr systemd sample config (#5993)
jri-sp Jul 4, 2019
6fcaeb9
Merge pull request #25 from lubitchv/4448-api-edit-var-meta
lubitchv Jul 5, 2019
fe5f49b
Add export DDI update
lubitchv Jul 8, 2019
b67eb84
4.15.1 Patch - get solr server from search service bean
sekmiller Jul 10, 2019
0f8974c
Merge branch 'develop' of https://github.com/IQSS/dataverse into 5850…
donsizemore Jul 10, 2019
794d091
Merge pull request #5973 from IQSS/5972-respecting-spaces
kcondon Jul 10, 2019
11eba29
Merge pull request #5980 from IQSS/5978-merge-accounts-broken
kcondon Jul 10, 2019
ca4d545
Merge pull request #5983 from IQSS/5965-doc-curation-cleanup
kcondon Jul 10, 2019
13533e0
Merge pull request #5984 from OdumInstitute/5850_bagit_export2_local_fs
kcondon Jul 10, 2019
375587e
Merge pull request #5995 from jri-sp/5993-systemd-LimitNPROC
kcondon Jul 10, 2019
dcb6ca1
Merge pull request #5971 from lubitchv/4448-api-edit-var-meta
kcondon Jul 10, 2019
1b073e3
Merge pull request #5982 from IQSS/5981-index-fails-for-empty-date-fi…
kcondon Jul 10, 2019
2d18d18
Merge pull request #6004 from IQSS/4.15.1-patch-for-SolrServer
kcondon Jul 10, 2019
d6c2b72
Update conf.py
kcondon Jul 10, 2019
68af48d
Update versions.rst
kcondon Jul 10, 2019
224606b
Update pom.xml
kcondon Jul 10, 2019
17ad1ee
Merge pull request #6009 from IQSS/4.15.1
kcondon Jul 10, 2019
3d34067
remove dct
lubitchv Jul 11, 2019
5720b5d
Merge pull request #26 from IQSS/develop
lubitchv Jul 11, 2019
884947e
Merge pull request #6011 from j-n-c/j-n-c-patch-1
kcondon Jul 11, 2019
4940e6b
Merge pull request #6013 from lubitchv/6001-update-export-ddi-dataset
kcondon Jul 11, 2019
1231607
Issue #6017: Rework Quick Fix procedure for documentation contributions
j-n-c Jul 12, 2019
6ad29ea
Issue #6017: Rework Quick Fix procedure for documentation contributio…
j-n-c Jul 12, 2019
6c15482
simplify the quick fix steps #6017
pdurbin Jul 12, 2019
bd1fd16
Merge pull request #2 from pdurbin/6017-docs
j-n-c Jul 15, 2019
ac9cd79
Fix typo
j-n-c Jul 15, 2019
e0a8a7a
Merge pull request #6019 from j-n-c/6017-documentation_quick_fix_proc…
kcondon Jul 15, 2019

Files changed

2 changes: 1 addition & 1 deletion conf/docker-aio/run-test-suite.sh
@@ -8,4 +8,4 @@ fi

# Please note the "dataverse.test.baseurl" is set to run for "all-in-one" Docker environment.
# TODO: Rather than hard-coding the list of "IT" classes here, add a profile to pom.xml.
mvn test -Dtest=DataversesIT,DatasetsIT,SwordIT,AdminIT,BuiltinUsersIT,UsersIT,UtilIT,ConfirmEmailIT,FileMetadataIT,FilesIT,SearchIT,InReviewWorkflowIT,HarvestingServerIT,MoveIT,MakeDataCountApiIT,FileTypeDetectionIT -Ddataverse.test.baseurl=$dvurl
mvn test -Dtest=DataversesIT,DatasetsIT,SwordIT,AdminIT,BuiltinUsersIT,UsersIT,UtilIT,ConfirmEmailIT,FileMetadataIT,FilesIT,SearchIT,InReviewWorkflowIT,HarvestingServerIT,MoveIT,MakeDataCountApiIT,FileTypeDetectionIT,EditDDIIT -Ddataverse.test.baseurl=$dvurl
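
As a side note, when iterating on just the new integration test locally, the same flag pattern can presumably be used with a single class (the base URL below is an assumed local default, not part of this PR)::

    mvn test -Dtest=EditDDIIT -Ddataverse.test.baseurl=http://localhost:8080
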
1 change: 1 addition & 0 deletions Solr systemd sample config (solr.service)
@@ -9,6 +9,7 @@ WorkingDirectory = /usr/local/solr/solr-7.3.1
ExecStart = /usr/local/solr/solr-7.3.1/bin/solr start -m 1g
ExecStop = /usr/local/solr/solr-7.3.1/bin/solr stop
LimitNOFILE=65000
LimitNPROC=65000
Restart=on-failure

[Install]
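
If this LimitNPROC addition is applied to a live installation, the edited unit file only takes effect after a reload; on a typical systemd host (assuming the service is named ``solr``) that would be something like::

    sudo systemctl daemon-reload
    sudo systemctl restart solr
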
19 changes: 17 additions & 2 deletions doc/sphinx-guides/source/api/native-api.rst
@@ -64,15 +64,15 @@ Show Contents of a Dataverse

|CORS| Lists all the DvObjects under dataverse ``id``. ::

GET http://$SERVER/api/dataverses/$id/contents
``curl -H "X-Dataverse-key:$API_TOKEN" http://$SERVER_URL/api/dataverses/$id/contents``


Report the data (file) size of a Dataverse
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Shows the combined size in bytes of all the files uploaded into the dataverse ``id``. ::

GET http://$SERVER/api/dataverses/$id/storagesize
``curl -H "X-Dataverse-key:$API_TOKEN" http://$SERVER_URL/api/dataverses/$id/storagesize``

Both published and unpublished files will be counted, in the dataverse specified, and in all its sub-dataverses, recursively.
By default, only the archival files are counted - i.e., the files uploaded by users (plus the tab-delimited versions generated for tabular data files on ingest). If the optional argument ``includeCached=true`` is specified, the API will also add the sizes of all the extra files generated and cached by Dataverse - the resized thumbnail versions for image files, the metadata exports for published datasets, etc.
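
For instance, a sketch of the cached-files variant described above (same endpoint, with the optional query parameter added)::

    curl -H "X-Dataverse-key:$API_TOKEN" "http://$SERVER_URL/api/dataverses/$id/storagesize?includeCached=true"
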
@@ -812,6 +812,19 @@ Example::

Also note that dataFileTags are not versioned and changes to these will update the published version of the file.

.. _EditingVariableMetadata:

Editing Variable Level Metadata
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Updates variable level metadata using the DDI XML ``$file``, where ``$id`` is the file id::

PUT https://$SERVER/api/edit/$id --upload-file $file

Example: ``curl -H "X-Dataverse-key:$API_TOKEN" -X PUT http://localhost:8080/api/edit/95 --upload-file dct.xml``

You can download :download:`dct.xml <../../../../src/test/resources/xml/dct.xml>` from the example above to see what the XML looks like.
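
For orientation, a hypothetical minimal DDI fragment in the spirit of dct.xml might look like the sketch below; element names follow DDI Codebook conventions, but the IDs and values are invented, so treat the downloadable dct.xml above as the authoritative example::

    <codeBook xmlns="ddi:codebook:2_5" version="2.5">
      <dataDscr>
        <var ID="v123" name="age">
          <labl level="variable">Age of respondent</labl>
        </var>
      </dataDscr>
    </codeBook>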

Provenance
~~~~~~~~~~
Get Provenance JSON for an uploaded file::
@@ -1472,3 +1485,5 @@ Recursively applies the role assignments of the specified dataverse, for the rol…
GET http://$SERVER/api/admin/dataverse/{dataverse alias}/addRoleAssignmentsToChildren

Note: setting ``:InheritParentRoleAssignments`` will automatically trigger inheritance of the parent dataverse's role assignments for a newly created dataverse. Hence this API call is intended as a way to update existing child dataverses or to update children after a change in role assignments has been made on a parent dataverse.
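
A sketch of invoking the admin call above (the superuser token and the ``root`` alias are illustrative assumptions)::

    curl -H "X-Dataverse-key:$API_TOKEN" http://localhost:8080/api/admin/dataverse/root/addRoleAssignmentsToChildren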


4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/conf.py
@@ -65,9 +65,9 @@
# built documents.
#
# The short X.Y version.
version = '4.15'
version = '4.15.1'
# The full version, including alpha/beta/rc tags.
release = '4.15'
release = '4.15.1'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
43 changes: 34 additions & 9 deletions doc/sphinx-guides/source/developers/documentation.rst
@@ -8,20 +8,41 @@ Documentation
Quick Fix
-----------

If you find a typo or a small error in the documentation you can easily fix it using GitHub.
If you find a typo or a small error in the documentation you can fix it using GitHub's online web editor. Generally speaking, we will be following https://help.github.com/en/articles/editing-files-in-another-users-repository

- Fork the repository
- Go to [your GitHub username]/dataverse/doc/sphinx-guides/source and access the file you would like to fix
- Switch to the branch that is currently under development
- Click the Edit button in the upper-right corner and fix the error
- Submit a pull request
- Navigate to https://github.com/IQSS/dataverse/tree/develop/doc/sphinx-guides/source where you will see folders for each of the guides:

- `admin`_
- `api`_
- `developers`_
- `installation`_
- `user`_

- Find the file you want to edit under one of the folders above.
- Click the pencil icon in the upper-right corner. If this is your first contribution to Dataverse, the hover text over the pencil icon will say "Fork this project and edit this file".
- Make changes to the file and preview them.
- In the **Commit changes** box, enter a description of the changes you have made and click **Propose file change**.
- Under the **Write** tab, delete the long welcome message and write a few words about what you fixed.
- Click **Create Pull Request**.

That's it! Thank you for your contribution! Your pull request will be added manually to the main Dataverse project board at https://github.com/orgs/IQSS/projects/2 and will go through code review and QA before it is merged into the "develop" branch. Along the way, developers might suggest changes or make them on your behalf. Once your pull request has been merged you will be listed as a contributor at https://github.com/IQSS/dataverse/graphs/contributors

Please see https://github.com/IQSS/dataverse/pull/5857 for an example of a quick fix that was merged (the "Files changed" tab shows how a typo was fixed).

If you would like to read more about the Dataverse project's use of GitHub, please see the :doc:`version-control` section. For bug fixes and features we request that you create an issue before making a pull request but this is not at all necessary for quick fixes to the documentation.

.. _admin: https://github.com/IQSS/dataverse/tree/develop/doc/sphinx-guides/source/admin
.. _api: https://github.com/IQSS/dataverse/tree/develop/doc/sphinx-guides/source/api
.. _developers: https://github.com/IQSS/dataverse/tree/develop/doc/sphinx-guides/source/developers
.. _installation: https://github.com/IQSS/dataverse/tree/develop/doc/sphinx-guides/source/installation
.. _user: https://github.com/IQSS/dataverse/tree/develop/doc/sphinx-guides/source/user

Other Changes (Sphinx)
----------------------

The documentation for Dataverse was written using Sphinx (http://sphinx-doc.org/).
If you are interested in suggesting changes or updates we recommend that you create
the html files using Sphinx locally and the submit a pull request through GitHub. Here are the instructions on how to proceed:
the html files using Sphinx locally and then submit a pull request through GitHub. Here are the instructions on how to proceed:


Installing Sphinx
@@ -50,7 +71,11 @@ Using Sphinx

First, you will need to make a fork of the dataverse repository in GitHub. Then, you will need to make a clone of your fork so you can manipulate the files outside GitHub.

To edit the existing documentation go to ~/dataverse/doc/sphinx-guides/source directory inside your clone. There, you will find the .rst files that correspond to the guides in the dataverse page (http://guides.dataverse.org/en/latest/user/index.html). Now, using your preferred text editor, open and edit these files, or create new .rst files and edit the others accordingly.
To edit the existing documentation:

- Create a branch (refer to http://guides.dataverse.org/en/latest/developers/version-control.html > *Create a New Branch off the develop Branch*) to record the changes you are about to perform; a typical command sequence is sketched after this list.
- Go to ~/dataverse/doc/sphinx-guides/source directory inside your clone. There, you will find the .rst files that correspond to the guides in the dataverse page (http://guides.dataverse.org/en/latest/).
- Using your preferred text editor, open and edit the necessary files, or create new ones.
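
As referenced in the first step above, a plausible git sequence for creating that branch (the remote and branch names are illustrative assumptions, not text from this PR)::

    git checkout develop
    git pull upstream develop    # assumes 'upstream' points at IQSS/dataverse
    git checkout -b 1234-my-doc-fix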

Once you are done, open a terminal and change directories to ~/dataverse/doc/sphinx-guides . Then, run the following commands:
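
The commands themselves are collapsed in this view; for a typical Sphinx layout they are presumably along the lines of the following (an assumption from standard Sphinx usage, consistent with the build/html output described below)::

    make clean
    make html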

@@ -61,7 +86,7 @@
After sphinx is done processing the files you should notice that the html folder in ~/dataverse/doc/sphinx-guides/build directory has been updated.
You can click on the files in the html folder to preview the changes.

Now you can make a commit with the changes to your own fork in GitHub and submit a pull request to the dataverse repository.
Now you can make a commit with the changes to your own fork in GitHub and submit a pull request to the original (upstream) dataverse repository.

Table of Contents
-----------------
40 changes: 32 additions & 8 deletions doc/sphinx-guides/source/installation/config.rst
@@ -658,16 +658,19 @@ For Google Analytics, the example script at :download:`analytics-code.html </_st…

Once this script is running, you can look in the Google Analytics console (Realtime/Events or Behavior/Events) and view events by type and/or the Dataset or File the event involves.

DuraCloud/Chronopolis Integration
---------------------------------
BagIt Export
------------

It's completely optional to integrate your installation of Dataverse with DuraCloud/Chronopolis but the details are listed here to keep the :doc:`/admin/integrations` section of the Admin Guide shorter.
Dataverse may be configured to submit a copy of published Datasets, packaged as `Research Data Alliance conformant <https://www.rd-alliance.org/system/files/Research%20Data%20Repository%20Interoperability%20WG%20-%20Final%20Recommendations_reviewed_0.pdf>`_ zipped `BagIt <https://tools.ietf.org/html/draft-kunze-bagit-17>`_ bags to `Chronopolis <https://libraries.ucsd.edu/chronopolis/>`_ via `DuraCloud <https://duraspace.org/duracloud/>`_ or alternately to any folder on the local filesystem.

Dataverse can be configured to submit a copy of published Datasets, packaged as `Research Data Alliance conformant <https://www.rd-alliance.org/system/files/Research%20Data%20Repository%20Interoperability%20WG%20-%20Final%20Recommendations_reviewed_0.pdf>`_ zipped `BagIt <https://tools.ietf.org/html/draft-kunze-bagit-17>`_ bags to the `Chronopolis <https://libraries.ucsd.edu/chronopolis/>`_ via `DuraCloud <https://duraspace.org/duracloud/>`_
Dataverse offers an internal archive workflow which may be configured as a PostPublication workflow via an admin API call to manually submit previously published Datasets and prior versions to a configured archive such as Chronopolis. The workflow creates a `JSON-LD <http://www.openarchives.org/ore/0.9/jsonld>`_ serialized `OAI-ORE <https://www.openarchives.org/ore/>`_ map file, which is also available as a metadata export format in the Dataverse web interface.

This integration is occurs through customization of an internal Dataverse archiver workflow that can be configured as a PostPublication workflow to submit the bag to Chronopolis' Duracloud interface using your organization's credentials. An admin API call exists that can manually submit previously published Datasets, and prior versions, to a configured archive such as Chronopolis. The workflow leverages new functionality in Dataverse to create a `JSON-LD <http://www.openarchives.org/ore/0.9/jsonld>`_ serialized `OAI-ORE <https://www.openarchives.org/ore/>`_ map file, which is also available as a metadata export format in the Dataverse web interface.
At present, the DPNSubmitToArchiveCommand and LocalSubmitToArchiveCommand are the only implementations extending the AbstractSubmitToArchiveCommand and using the configurable mechanisms discussed below.

At present, the DPNSubmitToArchiveCommand is the only implementation extending the AbstractSubmitToArchiveCommand and using the configurable mechanisms discussed below.
.. _Duracloud Configuration:

Duracloud Configuration
+++++++++++++++++++++++

Also note that while the current Chronopolis implementation generates the bag and submits it to the archive's DuraCloud interface, the steps to make a 'snapshot' of the space containing the Bag (and verify its successful submission) are actions a curator must take in the DuraCloud interface.

@@ -695,7 +698,27 @@ Archivers may require glassfish settings as well. For the Chronopolis archiver,

``./asadmin create-jvm-options '-Dduracloud.password=YOUR_PASSWORD_HERE'``

**API Call**
.. _Local Path Configuration:

Local Path Configuration
++++++++++++++++++++++++

\:ArchiverClassName - the fully qualified class to be used for archiving. For example\:

``curl -X PUT -d "edu.harvard.iq.dataverse.engine.command.impl.LocalSubmitToArchiveCommand" http://localhost:8080/api/admin/settings/:ArchiverClassName``

\:BagItLocalPath - the path to where you want to store BagIt. For example\:

``curl -X PUT -d /home/path/to/storage http://localhost:8080/api/admin/settings/:BagItLocalPath``

\:ArchiverSettings - the archiver class can access required settings including existing Dataverse settings and dynamically defined ones specific to the class. This setting is a comma-separated list of those settings. For example\:

``curl http://localhost:8080/api/admin/settings/:ArchiverSettings -X PUT -d ":BagItLocalPath"``

\:BagItLocalPath is the local path setting you defined above; listing its name in \:ArchiverSettings makes it available to the archiver class.

API Call
++++++++

Once this configuration is complete, you, as a user with the *PublishDataset* permission, should be able to use the API call to manually submit a DatasetVersion for processing:
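
The exact request is collapsed in this view; based on the ``submitDataVersionToArchive`` name mentioned below, a hypothetical sketch (the path shape, identifiers, and port are assumptions) would be::

    curl -X POST -H "X-Dataverse-key:$API_TOKEN" "http://localhost:8080/api/admin/submitDataVersionToArchive/$DATASET_ID/$VERSION"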

@@ -711,7 +734,8 @@ The submitDataVersionToArchive API (and the workflow discussed below) attempt to…

In the Chronopolis case, since the transfer from the DuraCloud front-end to archival storage in Chronopolis can take significant time, it is currently up to the admin/curator to submit a 'snapshot' of the space within DuraCloud and to monitor its successful transfer. Once transfer is complete the space should be deleted, at which point the Dataverse API call can be used to submit a Bag for other versions of the same Dataset. (The space is reused, so that archival copies of different Dataset versions correspond to different snapshots of the same DuraCloud space.)

**PostPublication Workflow**
PostPublication Workflow
++++++++++++++++++++++++

To automate the submission of archival copies to an archive as part of publication, one can set up a Dataverse Workflow using the "archiver" workflow step - see the :doc:`/developers/workflows` guide. The archiver step uses the configuration information discussed above, including the :ArchiverClassName setting. The workflow step definition should include the set of properties defined in \:ArchiverSettings.
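
For context, a workflow definition using such a step might be shaped roughly as follows; the field names mirror the workflow JSON described in that guide, but this exact snippet is an illustration rather than text from this PR::

    {
      "name": "Archive on publication",
      "steps": [
        {
          "provider": ":internal",
          "stepType": "archiver",
          "parameters": {
            "stepName": "archive submission"
          }
        }
      ]
    }
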
2 changes: 2 additions & 0 deletions doc/sphinx-guides/source/installation/external-tools.rst
@@ -19,6 +19,8 @@ Support for external tools is just getting off the ground but the following tool…

- `File Previewers <https://github.com/QualitativeDataRepository/dataverse-previewers>`_: A set of tools that display the content of files - including audio, html, `Hypothes.is <https://hypothes.is/>`_ annotations, images, PDF, text, video - allowing them to be viewed without downloading. The previewers can be run directly from github.io, so the only required step is using the Dataverse API to register the ones you want to use. Documentation, including how to optionally brand the previewers, and an invitation to contribute through github are in the README.md file.

- Data Curation Tool: a GUI for curating data by adding labels, groups, weights and other details to assist with informed reuse. See the README.md file at https://github.com/scholarsportal/Dataverse-Data-Curation-Tool for the installation instructions.

- [Your tool here! Please get in touch! :) ]


6 changes: 5 additions & 1 deletion doc/sphinx-guides/source/user/account.rst
@@ -142,7 +142,11 @@ You can also convert your Dataverse account to use authentication provided by Gi…
My Data
-------

The My Data section of your account page displays a listing of all the dataverses, datasets, and files you have either created, uploaded or that you have access to edit. You are able to filter through all the dataverses, datasets, and files listed there using the filter box. You may also use the facets on the left side to only view a specific Publication Status or Role.
The My Data section of your account page displays a listing of all the dataverses, datasets, and files you have created or uploaded, or that you have a role assigned on. If you see unexpected dataverses or datasets in your My Data page, it might be because someone has assigned your account a role on those dataverses or datasets. For example, some institutions automatically assign the "File Downloader" role on their datasets to all accounts using their institutional login.


You are able to filter through all the dataverses, datasets, and files listed on your My Data page using the filter box. You may also use the facets on the left side to only view a specific Publication Status or Role.


Notifications
-------------
7 changes: 6 additions & 1 deletion doc/sphinx-guides/source/user/dataset-management.rst
@@ -66,7 +66,7 @@ If there are multiple upload options available, then you must choose which one t…

You can upload files to a dataset while first creating that dataset. You can also upload files after creating a dataset by clicking the "Edit" button at the top of the dataset page and from the dropdown list selecting "Files (Upload)" or clicking the "Upload Files" button above the files table in the Files tab. From either option you will be brought to the Upload Files page for that dataset.

Certain file types in Dataverse are supported by additional functionality, which can include downloading in different formats, subsets, file-level metadata preservation, file-level data citation; and exploration through data visualization and analysis. See the File Handling section of this page for more information.
Certain file types in Dataverse are supported by additional functionality, which can include downloading in different formats, subsets, file-level metadata preservation, file-level data citation with UNFs, and exploration through data visualization and analysis. See the File Handling section of this page for more information.


HTTP Upload
Expand Down Expand Up @@ -229,6 +229,11 @@ You will not have to leave the dataset page to complete these action, except for

If you restrict files, you will also be prompted with a popup asking you to fill out the Terms of Access for the files. If Terms of Access already exist, you will be asked to confirm them. Note that some Dataverse installations do not allow for file restrictions.

Edit File Variable Metadata
---------------------------

Variable Metadata can be edited directly through an API call (:ref:`API Guide: Editing Variable Level Metadata <EditingVariableMetadata>`) or by using the `Dataverse Data Curation Tool <https://github.com/scholarsportal/Dataverse-Data-Curation-Tool>`_.

File Path
---------
