Merge branch 'develop' into 3637-navbar #3637
Conflicts:
src/main/java/edu/harvard/iq/dataverse/DatasetPage.java

Just some Inject's at the top were in conflict.
pdurbin committed Jun 2, 2017
2 parents 32a90dd + a9c5354 commit 1b763c6
Showing 18 changed files with 703 additions and 279 deletions.
47 changes: 37 additions & 10 deletions doc/sphinx-guides/source/installation/config.rst
@@ -110,31 +110,58 @@ Publishing the Root Dataverse

Non-superusers who are not "Admin" on the root dataverse will not be able to do anything useful until the root dataverse has been published.

Customizing the Root Dataverse
++++++++++++++++++++++++++++++

As the person installing Dataverse you may or may not be a local metadata expert. You may want to have others sign up for accounts and grant them the "Admin" role at the root dataverse to configure metadata fields, templates, browse/search facets, guestbooks, etc. For more on these topics, consult the :doc:`/user/dataverse-management` section of the User Guide.

Once this configuration is complete, your Dataverse installation should be ready for users to start playing with it. That said, there are many more configuration options available, which will be explained below.

Persistent Identifiers and Publishing Datasets
----------------------------------------------

Persistent identifiers are a required and integral part of the Dataverse platform. They provide a URL that is guaranteed to resolve to the datasets they represent. Dataverse currently supports creating identifiers using DOI and Handle.

By default, and for testing convenience, the installer configures a temporary DOI test namespace through EZID. This is sufficient to create and publish datasets, but they are neither citable nor guaranteed to be preserved. Note that any datasets created using the test configuration cannot be directly migrated and would need to be created again once a valid DOI namespace is configured.

To properly configure persistent identifiers for a production installation, an account and associated namespace must be acquired for a fee from a DOI or Handle provider: **EZID** (http://ezid.cdlib.org), **DataCite** (https://www.datacite.org), **Handle.Net** (https://www.handle.net).

Once you have your DOI or Handle account credentials and a namespace, configure Dataverse to use them using the JVM options and database settings below.

Configuring Dataverse for DOIs
++++++++++++++++++++++++++++++

Out of the box, Dataverse is configured for DOIs. Here are the configuration options for DOIs:

**JVM Options:**

- :ref:`doi.baseurlstring`
- :ref:`doi.username`
- :ref:`doi.password`

**Database Settings:**

- :ref:`:DoiProvider <:DoiProvider>`
- :ref:`:Protocol <:Protocol>`
- :ref:`:Authority <:Authority>`
- :ref:`:DoiSeparator <:DoiSeparator>`
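As a minimal sketch of how these options might be applied on a standard Glassfish-based installation (assuming `asadmin` is on your path and the Dataverse admin API is listening on localhost:8080; the endpoint, credentials, and test authority shown are placeholders, not production values):

```shell
# JVM options (colons must be escaped in asadmin arguments); restart Glassfish afterwards.
./asadmin create-jvm-options '-Ddoi.baseurlstring=https\://ezid.cdlib.org'
./asadmin create-jvm-options '-Ddoi.username=YOUR_ACCOUNT_NAME'
./asadmin create-jvm-options '-Ddoi.password=YOUR_PASSWORD'

# Database settings, set through the admin API:
curl -X PUT -d EZID http://localhost:8080/api/admin/settings/:DoiProvider
curl -X PUT -d doi http://localhost:8080/api/admin/settings/:Protocol
curl -X PUT -d 10.5072 http://localhost:8080/api/admin/settings/:Authority
curl -X PUT -d '/' http://localhost:8080/api/admin/settings/:DoiSeparator
```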

Configuring Dataverse for Handles
+++++++++++++++++++++++++++++++++

Here are the configuration options for handles:

**JVM Options:**

- :ref:`dataverse.handlenet.admcredfile`
- :ref:`dataverse.handlenet.admprivphrase`

**Database Settings:**

- :ref:`:Protocol <:Protocol>`
- :ref:`:Authority <:Authority>`

Note: If you are **minting your own handles** and plan to set up your own handle service, please refer to `Handle.Net documentation <http://handle.net/hnr_documentation.html>`_.

Auth Modes: Local vs. Remote vs. Both
-------------------------------------
4 changes: 3 additions & 1 deletion doc/sphinx-guides/source/installation/intro.rst
@@ -39,4 +39,6 @@ To get help installing or configuring Dataverse, please try one or more of:
Improving this Guide
--------------------

If you spot a typo in this guide or would like to suggest an improvement, please find the appropriate file in https://github.com/IQSS/dataverse/tree/develop/doc/sphinx-guides/source/installation and send a pull request as explained in the :doc:`/developers/documentation` section of the Developer Guide. You are also welcome to simply open an issue at https://github.com/IQSS/dataverse/issues to describe the problem with this guide.

Next is the :doc:`prep` section.
7 changes: 4 additions & 3 deletions doc/sphinx-guides/source/installation/prep.rst
@@ -37,10 +37,11 @@ There are some community-led projects to use configuration management tools such

The Dataverse development team is happy to "bless" additional community efforts along these lines (i.e. Docker, Chef, Salt, etc.) by creating a repo under https://github.com/IQSS and managing team access.

Dataverse permits a fair amount of flexibility in where you choose to install the various components. The diagram below shows a load balancer, multiple proxies and web servers, redundant database servers, and offloading of potentially resource intensive work to a separate server.

|3webservers|

A setup such as this is advanced enough to be considered out of scope for this guide but you are welcome to ask questions about similar configurations via the support channels listed in the :doc:`intro`.

.. _architecture:

@@ -56,7 +57,7 @@ When planning your installation you should be aware of the following components
- PostgreSQL: a relational database.
- Solr: a search engine. A Dataverse-specific schema is provided.
- SMTP server: for sending mail for password resets and other notifications.
- Persistent identifier service: DOI and Handle support are provided. Production use requires a registered DOI or Handle.net authority.

There are a number of optional components you may choose to install or configure, including:

@@ -98,7 +99,7 @@ Here are some questions to keep in the back of your mind as you test and move in
- How do I want my users to log in to Dataverse? With local accounts? With Shibboleth/SAML? With OAuth providers such as ORCID, GitHub, or Google?
- Do I want to run Glassfish on the standard web ports (80 and 443) or do I want to "front" Glassfish with a proxy such as Apache or nginx? See "Network Ports" in the :doc:`config` section.
- How many points of failure am I willing to tolerate? How much complexity do I want?
- How much does it cost to subscribe to a service to create persistent identifiers such as DOIs or handles?

Next Steps
----------
@@ -61,10 +61,15 @@ public boolean registerWhenPublished() {
public boolean alreadyExists(Dataset dataset) throws Exception {
logger.log(Level.FINE,"alreadyExists");
try {
HashMap<String, String> result = ezidService.getMetadata(getIdentifierFromDataset(dataset));
return result != null && !result.isEmpty();
// TODO just check for HTTP status code 200/404, sadly the status code is swept under the carpet
} catch (EZIDException e ){
//No such identifier is treated as an exception
//but if that is the case then we want to just return false
if (e.getLocalizedMessage().contains("no such identifier")){
return false;
}
logger.log(Level.WARNING, "alreadyExists failed");
logger.log(Level.WARNING, "String {0}", e.toString());
logger.log(Level.WARNING, "localized message {0}", e.getLocalizedMessage());
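The catch block above keys off the exception message: EZID reports a missing identifier by throwing an exception whose message contains "no such identifier", and that one failure mode is translated to "does not exist" rather than an error. A standalone sketch of the pattern (the class and method names here are hypothetical):

```java
import java.util.Collections;
import java.util.Map;

public class EzidLookupSketch {

    // Mirrors: result != null && !result.isEmpty()
    static boolean identifierExists(Map<String, String> metadata) {
        return metadata != null && !metadata.isEmpty();
    }

    // Mirrors the catch block's message check; any other failure should
    // still be treated as a real error by the caller.
    static boolean isNoSuchIdentifier(Exception e) {
        String msg = e.getLocalizedMessage();
        return msg != null && msg.contains("no such identifier");
    }

    public static void main(String[] args) {
        if (identifierExists(null)) throw new AssertionError();
        if (identifierExists(Collections.emptyMap())) throw new AssertionError();
        if (!identifierExists(Map.of("datacite.title", "Example"))) throw new AssertionError();
        if (!isNoSuchIdentifier(new Exception("bad request - no such identifier"))) throw new AssertionError();
        if (isNoSuchIdentifier(new Exception("network timeout"))) throw new AssertionError();
        System.out.println("ok");
    }
}
```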
27 changes: 24 additions & 3 deletions src/main/java/edu/harvard/iq/dataverse/DataFileServiceBean.java
@@ -446,7 +446,8 @@ public DataFile findCheapAndEasy(Long id) {

dataFile.setOwner(owner);

// If content type indicates it's tabular data, spend 2 extra queries
// looking up the data table and tabular tags objects:

if (MIME_TYPE_TAB.equalsIgnoreCase(contentType)) {
Object[] dtResult = null;
@@ -471,6 +472,28 @@ public DataFile findCheapAndEasy(Long id) {

dataTable.setDataFile(dataFile);
dataFile.setDataTable(dataTable);

// tabular tags:

List<Object[]> tagResults = null;
try {
tagResults = em.createNativeQuery("SELECT t.TYPE, t.DATAFILE_ID FROM DATAFILETAG t WHERE t.DATAFILE_ID = " + id).getResultList();
} catch (Exception ex) {
logger.info("EXCEPTION looking up tags.");
tagResults = null;
}

if (tagResults != null) {
List<String> fileTagLabels = DataFileTag.listTags();

for (Object[] tagResult : tagResults) {
Integer tagId = (Integer)tagResult[0];
DataFileTag tag = new DataFileTag();
tag.setTypeByLabel(fileTagLabels.get(tagId));
tag.setDataFile(dataFile);
dataFile.addTag(tag);
}
}
}
}
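The loop above resolves each stored integer TYPE code back to a label by indexing into the list returned by `DataFileTag.listTags()`. A standalone sketch of that index-to-label step (the label list here is a hypothetical stand-in, not the actual Dataverse tag vocabulary):

```java
import java.util.Arrays;
import java.util.List;

public class TagLookupSketch {

    // Hypothetical stand-in for DataFileTag.listTags()
    static final List<String> FILE_TAG_LABELS =
            Arrays.asList("Survey", "Time Series", "Panel", "Event");

    // Same index-based lookup as fileTagLabels.get(tagId) in the loop above
    static String labelForType(int typeCode) {
        return FILE_TAG_LABELS.get(typeCode);
    }

    public static void main(String[] args) {
        if (!"Survey".equals(labelForType(0))) throw new AssertionError();
        if (!"Panel".equals(labelForType(2))) throw new AssertionError();
        System.out.println("ok");
    }
}
```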

@@ -1011,8 +1034,6 @@ public boolean isThumbnailAvailable (DataFile file) {
queries; checking if the thumbnail is available may cost cpu time, if
it has to be generated on the fly - so you have to figure out which
is more important...
TODO: adding a boolean flag isImageAlreadyGenerated to the DataFile
db table should help with this. -- L.A. 4.2.1 DONE: 4.2.2
*/

12 changes: 12 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/Dataset.java
@@ -723,4 +723,16 @@ public DatasetThumbnail getDatasetThumbnail() {
return DatasetUtil.getThumbnail(this);
}

/**
 * Handle the case where we already have the DatasetVersion in hand.
 * This saves trying to find the latestDatasetVersion, and
 * other costly queries, etc.
 *
 * @param datasetVersion the version whose thumbnail should be looked up
 * @return the thumbnail for this dataset and version
 */
public DatasetThumbnail getDatasetThumbnail(DatasetVersion datasetVersion) {
return DatasetUtil.getThumbnail(this, datasetVersion);
}

}
34 changes: 27 additions & 7 deletions src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
@@ -160,10 +160,16 @@ public enum DisplayMode {
DataverseRequestServiceBean dvRequestService;
@Inject
DatasetVersionUI datasetVersionUI;
@Inject
PermissionsWrapper permissionsWrapper;
@Inject
FileDownloadHelper fileDownloadHelper;
@Inject
TwoRavensHelper twoRavensHelper;
@Inject
WorldMapPermissionHelper worldMapPermissionHelper;
@Inject
ThumbnailServiceWrapper thumbnailServiceWrapper;
@Inject
SettingsWrapper settingsWrapper;

@@ -242,13 +248,30 @@ public String getThumbnailString() {
return thumbnailString;
}

if (!readOnly) {
DatasetThumbnail datasetThumbnail = dataset.getDatasetThumbnail();
if (datasetThumbnail == null) {
thumbnailString = "";
return null;
}

if (datasetThumbnail.isFromDataFile()) {
if (!datasetThumbnail.getDataFile().equals(dataset.getThumbnailFile())) {
datasetService.assignDatasetThumbnailByNativeQuery(dataset, datasetThumbnail.getDataFile());
dataset = datasetService.find(dataset.getId());
}
}

thumbnailString = datasetThumbnail.getBase64image();
} else {
thumbnailString = thumbnailServiceWrapper.getDatasetCardImageAsBase64Url(dataset, workingVersion.getId());
if (thumbnailString == null) {
thumbnailString = "";
return null;
}


}
return thumbnailString;
}

@@ -2294,9 +2317,6 @@ public String save() {
//FacesContext.getCurrentInstance().addMessage(null, new FacesMessage(FacesMessage.SEVERITY_ERROR, "Validation Error", "See below for details."));
return "";
}

// Finally, save the files permanently:
ingestService.addFiles(workingVersion, newFiles);

// Use the API to save the dataset:
Command<Dataset> cmd;
18 changes: 18 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/DatasetServiceBean.java
@@ -848,5 +848,23 @@ public Dataset removeDatasetThumbnail(Dataset dataset) {
dataset.setUseGenericThumbnail(true);
return merge(dataset);
}

// persist assigned thumbnail in a single one-field-update query:
// (the point is to avoid doing an em.merge() on an entire dataset object...)
public void assignDatasetThumbnailByNativeQuery(Long datasetId, Long dataFileId) {
try {
em.createNativeQuery("UPDATE dataset SET thumbnailfile_id=" + dataFileId + " WHERE id=" + datasetId).executeUpdate();
} catch (Exception ex) {
// it's ok to just ignore...
}
}

public void assignDatasetThumbnailByNativeQuery(Dataset dataset, DataFile dataFile) {
try {
em.createNativeQuery("UPDATE dataset SET thumbnailfile_id=" + dataFile.getId() + " WHERE id=" + dataset.getId()).executeUpdate();
} catch (Exception ex) {
// it's ok to just ignore...
}
}

}
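The concatenated SQL above is safe only because the interpolated values are numeric IDs. A sketch of the same single-column update using JPA positional parameters (same method shape, assuming the surrounding `em` EntityManager) keeps the SQL text free of interpolated values entirely:

```java
// Sketch: identical behavior, including the deliberately ignored failure,
// but with bind parameters instead of string concatenation.
public void assignDatasetThumbnailByNativeQuery(Long datasetId, Long dataFileId) {
    try {
        em.createNativeQuery("UPDATE dataset SET thumbnailfile_id = ?1 WHERE id = ?2")
                .setParameter(1, dataFileId)
                .setParameter(2, datasetId)
                .executeUpdate();
    } catch (Exception ex) {
        // it's ok to just ignore... (matching the original behavior)
    }
}
```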