diff --git a/doc/release-notes/4813-allow-duplicate-files.md b/doc/release-notes/4813-allow-duplicate-files.md new file mode 100644 index 00000000000..a11af77c72b --- /dev/null +++ b/doc/release-notes/4813-allow-duplicate-files.md @@ -0,0 +1 @@ +We should note that duplicate files are now allowed, and installations may want to notify their users now that this is available. Point to the rules in the Guides. \ No newline at end of file diff --git a/doc/release-notes/6918-publishing-lock.md b/doc/release-notes/6918-publishing-lock.md new file mode 100644 index 00000000000..c7b73fffee2 --- /dev/null +++ b/doc/release-notes/6918-publishing-lock.md @@ -0,0 +1,6 @@ +The setting :PIDAsynchRegFileCount is deprecated as of v5.0. + +It used to specify the minimum number of datafiles in a dataset +that would warrant adding a lock during publishing. As of v5.0, all datasets are +locked for the duration of the publishing process. The setting will be +ignored if present. diff --git a/doc/release-notes/6961-payara-upgrade.md b/doc/release-notes/6961-payara-upgrade.md new file mode 100644 index 00000000000..f590cdebc4e --- /dev/null +++ b/doc/release-notes/6961-payara-upgrade.md @@ -0,0 +1,82 @@ +Upgrade Dataverse from Glassfish 4.1 to Payara 5 +================================================ + +The instructions below describe the upgrade procedure, based on moving an existing glassfish4 domain directory under Payara. We recommend this method over setting up a brand-new Payara domain using the installer, because it is the easiest way to recreate your current configuration and preserve all your data. + +Download Payara, v5.2020.2 as of this writing:

    # curl -L -O https://github.com/payara/Payara/releases/download/payara-server-5.2020.2/payara-5.2020.2.zip
    # sha256sum payara-5.2020.2.zip
    1f5f7ea30901b1b4c7bcdfa5591881a700c9b7e2022ae3894192ba97eb83cc3e

Unzip it somewhere (/usr/local is a safe bet):

    # sudo unzip payara-5.2020.2.zip -d /usr/local/

Copy the Postgres driver to /usr/local/payara5/glassfish/lib:

    # sudo cp /usr/local/glassfish4/glassfish/lib/postgresql-42.2.9.jar /usr/local/payara5/glassfish/lib/

Move payara5/glassfish/domains/domain1 out of the way:

    # sudo mv /usr/local/payara5/glassfish/domains/domain1 /usr/local/payara5/glassfish/domains/domain1.orig

Undeploy the Dataverse web application (if deployed; version 4.20 is assumed in the example below):

    # sudo /usr/local/glassfish4/bin/asadmin list-applications
    # sudo /usr/local/glassfish4/bin/asadmin undeploy dataverse-4.20

Stop Glassfish; copy domain1 to Payara:

    # sudo /usr/local/glassfish4/bin/asadmin stop-domain
    # sudo cp -ar /usr/local/glassfish4/glassfish/domains/domain1 /usr/local/payara5/glassfish/domains/

Remove the Glassfish cache directories:

    # sudo rm -rf /usr/local/payara5/glassfish/domains/domain1/generated/
    # sudo rm -rf /usr/local/payara5/glassfish/domains/domain1/osgi-cache/

In domain.xml:
=============

Replace the -XX:PermSize and -XX:MaxPermSize JVM options with -XX:MetaspaceSize and -XX:MaxMetaspaceSize.

    -XX:MetaspaceSize=256m
    -XX:MaxMetaspaceSize=512m

Set both Xmx and Xms at startup to avoid runtime re-allocation. Your production Xmx value should likely be higher than the example below:

    -Xmx2048m
    -Xms2048m

Add the following JVM options beneath the -Ddataverse settings:

    -Dfish.payara.classloading.delegate=false
    -XX:+UseG1GC
    -XX:+UseStringDeduplication
    -XX:+DisableExplicitGC

Change any full pathnames /usr/local/glassfish4/... to /usr/local/payara5/..., or whatever they are in your case. (Specifically, check the -Ddataverse.files.directory and -Ddataverse.files.file.directory JVM options.)
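Once the pathnames are updated (including jhove.conf, covered next, which lives in the same config directory), something like the following may help as a quick sanity check for stragglers; this is a suggestion, not a required step:

    # grep -r glassfish4 /usr/local/payara5/glassfish/domains/domain1/config/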
In domain1/config/jhove.conf, change the hard-coded /usr/local/glassfish4 path, as above. + +(Optional) If you renamed your service account from glassfish to payara or appserver, update the ownership permissions. The Installation Guide recommends a service account of `dataverse`:

    # sudo chown -R dataverse /usr/local/payara5/glassfish/domains/domain1
    # sudo chown -R dataverse /usr/local/payara5/glassfish/lib

You will also need to check that the service account has write permission on the files directory, if it is located outside the old Glassfish domain. Likewise, make sure the service account has the correct AWS credentials if you are using S3 for storage. + +Finally, start Payara:

    # sudo -u dataverse /usr/local/payara5/bin/asadmin start-domain

Deploy the Dataverse 5 war file:

    # sudo -u dataverse /usr/local/payara5/bin/asadmin deploy /path/to/dataverse-5.0.war

Then restart Payara:

    # sudo -u dataverse /usr/local/payara5/bin/asadmin stop-domain
    # sudo -u dataverse /usr/local/payara5/bin/asadmin start-domain

diff --git a/doc/sphinx-guides/source/installation/config.rst b/doc/sphinx-guides/source/installation/config.rst index 2e22ee77571..48b4b6608b7 100644 --- a/doc/sphinx-guides/source/installation/config.rst +++ b/doc/sphinx-guides/source/installation/config.rst @@ -186,7 +186,6 @@ Here are the configuration options for DOIs: - :ref:`:IdentifierGenerationStyle <:IdentifierGenerationStyle>` (optional) - :ref:`:DataFilePIDFormat <:DataFilePIDFormat>` (optional) - :ref:`:FilePIDsEnabled <:FilePIDsEnabled>` (optional, defaults to true) -- :ref:`:PIDAsynchRegFileCount <:PIDAsynchRegFileCount>` (optional, defaults to 10) Configuring Dataverse for Handles +++++++++++++++++++++++++++++++++ @@ -1446,24 +1445,13 @@ Note that in either case, when using the ``sequentialNumber`` option, datasets a :FilePIDsEnabled ++++++++++++++++ -Toggles publishing of file-based PIDs for the entire installation. By default this setting is absent and Dataverse assumes it to be true. +Toggles publishing of file-based PIDs for the entire installation. By default this setting is absent and Dataverse assumes it to be true. If enabled, the registration will be performed asynchronously (in the background) during the publishing of a dataset. If you don't want to register file-based PIDs for your installation, set: ``curl -X PUT -d 'false' http://localhost:8080/api/admin/settings/:FilePIDsEnabled`` -Note: File-level PID registration was added in 4.9 and is required until version 4.9.3. - -Note: The dataset will be locked, and the registration will be performed asynchronously, when there are more than N files in the dataset, where N is configured by the database setting ``:PIDAsynchRegFileCount`` (default: 10). - -.. _:PIDAsynchRegFileCount: - -:PIDAsynchRegFileCount -++++++++++++++++++++++ - -Configures the number of files in the dataset to warrant performing the registration of persistent identifiers (section above) and/or file validation asynchronously during publishing. The setting is optional, and the default value is 10. - -``curl -X PUT -d '100' http://localhost:8080/api/admin/settings/:PIDAsynchRegFileCount`` +Note: File-level PID registration was added in 4.9; it could not be disabled until version 4.9.3. .. _:IndependentHandleService: @@ -1480,14 +1468,12 @@ By default this setting is absent and Dataverse assumes it to be false.
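Installations that had previously set the now-deprecated ``:PIDAsynchRegFileCount`` may simply remove it, since it is ignored as of 5.0. A hedged example, using the standard settings API:

``curl -X DELETE http://localhost:8080/api/admin/settings/:PIDAsynchRegFileCount``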
:FileValidationOnPublishEnabled +++++++++++++++++++++++++++++++ -Toggles validation of the physical files in the dataset when it's published, by recalculating the checksums and comparing against the values stored in the DataFile table. By default this setting is absent and Dataverse assumes it to be true. +Toggles validation of the physical files in the dataset when it's published, by recalculating the checksums and comparing against the values stored in the DataFile table. By default this setting is absent and Dataverse assumes it to be true. If enabled, the validation will be performed asynchronously, similarly to how we handle assigning persistent identifiers to datafiles, with the dataset locked for the duration of the publishing process. If you don't want the datafiles to be validated on publish, set: ``curl -X PUT -d 'false' http://localhost:8080/api/admin/settings/:FileValidationOnPublishEnabled`` -Note: The dataset will be locked, and the validation will be performed asynchronously, similarly to how we handle assigning persistend identifiers to datafiles, when there are more than N files in the dataset, where N is configured by the database setting ``:PIDAsynchRegFileCount`` (default: 10). - :ApplicationTermsOfUse ++++++++++++++++++++++ diff --git a/doc/sphinx-guides/source/user/dataset-management.rst b/doc/sphinx-guides/source/user/dataset-management.rst index f9ce457f5c0..e377d3d9855 100755 --- a/doc/sphinx-guides/source/user/dataset-management.rst +++ b/doc/sphinx-guides/source/user/dataset-management.rst @@ -73,7 +73,6 @@ You can upload files to a dataset while first creating that dataset. You can als Certain file types in Dataverse are supported by additional functionality, which can include downloading in different formats, previews, file-level metadata preservation, file-level data citation with UNFs, and exploration through data visualization and analysis. See the :ref:`File Handling <file-handling>` section of this page for more information. - HTTP Upload ----------- @@ -147,6 +146,20 @@ File Handling Certain file types in Dataverse are supported by additional functionality, which can include downloading in different formats, previews, file-level metadata preservation, file-level data citation; and exploration through data visualization and analysis. See the sections below for information about special functionality for specific file types. +.. _duplicate-files: + +Duplicate Files +=============== + +Beginning with Dataverse 5.0, the way Dataverse handles duplicate files (filenames and checksums) has changed to be more flexible. Specifically: + +- Files with the same checksum can be included in a dataset, even if the files are in the same directory. +- Files with the same filename can be included in a dataset as long as the files are in different directories. +- If a user uploads a file to a directory where a file already exists with that directory/filename combination, Dataverse will adjust the file path and name by adding "-1" or "-2" as applicable. This change will be visible in the list of files being uploaded. +- If the directory or name of an existing or newly uploaded file is edited in a way that would create a directory/filename combination that already exists, Dataverse will display an error. +- If a user attempts to replace a file with another file that has the same checksum, an error message will be displayed and the file cannot be replaced.
+- If a user attempts to replace a file with a file that has the same checksum as a different file in the dataset, a warning will be displayed. + File Previews ------------- @@ -268,7 +281,7 @@ Variable Metadata can be edited directly through an API call (:ref:`API Guide: E File Path --------- -The File Path metadata field is Dataverse's way of representing a file's location in a folder structure. When a user uploads a .zip file containing a folder structure, Dataverse automatically fills in the File Path information for each file contained in the .zip. If a user downloads the full dataset or a selection of files from it, they will receive a folder structure with each file positioned according to its File Path. +The File Path metadata field is Dataverse's way of representing a file's location in a folder structure. When a user uploads a .zip file containing a folder structure, Dataverse automatically fills in the File Path information for each file contained in the .zip. If a user downloads the full dataset or a selection of files from it, they will receive a folder structure with each file positioned according to its File Path. Only one file with a given path and name may exist in a dataset. Editing a file to give it the same path and name as another file already existing in the dataset will cause an error. A file's File Path can be manually added or edited on the Edit Files page. Changing a file's File Path will change its location in the folder structure that is created when a user downloads the full dataset or a selection of files from it. diff --git a/src/main/java/edu/harvard/iq/dataverse/DataFile.java b/src/main/java/edu/harvard/iq/dataverse/DataFile.java index 6218629549c..560048db9ca 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DataFile.java +++ b/src/main/java/edu/harvard/iq/dataverse/DataFile.java @@ -254,6 +254,37 @@ public boolean isDeleted() { public void setDeleted(boolean deleted) { this.deleted = deleted; } + + /* + For use during file upload so that the user may delete + files that have already been uploaded to the current dataset version + */ + + @Transient + private boolean markedAsDuplicate; + + public boolean isMarkedAsDuplicate() { + return markedAsDuplicate; + } + + public void setMarkedAsDuplicate(boolean markedAsDuplicate) { + this.markedAsDuplicate = markedAsDuplicate; + } + + @Transient + private String duplicateFilename; + + public String getDuplicateFilename() { + return duplicateFilename; + } + + public void setDuplicateFilename(String duplicateFilename) { + this.duplicateFilename = duplicateFilename; + } + + + + /** * All constructors should use this method diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java b/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java index 8ec2289d38e..c3a76a53a16 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java +++ b/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java @@ -2128,6 +2128,15 @@ private void displayLockInfo(Dataset dataset) { BundleUtil.getStringFromBundle("dataset.locked.ingest.message")); lockedDueToIngestVar = true; } + + // With DataCite, we try to reserve the DOI when the dataset is created. Sometimes this + // fails because DataCite is down. We show the message below to set expectations that the + // "Publish" button won't work until the DOI has been reserved using the "Reserve PID" API. 
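+ // (A hedged aside, not from this PR: the "Reserve PID" call is part of the native API,
+ // something along the lines of
+ //   curl -H "X-Dataverse-key: $API_TOKEN" -X POST "$SERVER_URL/api/pids/:persistentId/reserve?persistentId=$PID"
+ // - the exact path and availability should be confirmed against the API Guide.)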
+ if (settingsWrapper.isDataCiteInstallation() && dataset.getGlobalIdCreateTime() == null && editMode != EditMode.CREATE) { + JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("dataset.locked.pidNotReserved.message"), + BundleUtil.getStringFromBundle("dataset.locked.pidNotReserved.message.details")); + } + } private Boolean fileTreeViewRequired = null; @@ -2649,7 +2658,7 @@ private String releaseDataset(boolean minor) { } else { JsfHelper.addErrorMessage(BundleUtil.getStringFromBundle("dataset.message.only.authenticatedUsers")); } - return returnToDatasetOnly(); + return returnToDraftVersion(); } @Deprecated diff --git a/src/main/java/edu/harvard/iq/dataverse/DataverseSession.java b/src/main/java/edu/harvard/iq/dataverse/DataverseSession.java index 27ee982653a..d99937a3a09 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DataverseSession.java +++ b/src/main/java/edu/harvard/iq/dataverse/DataverseSession.java @@ -6,6 +6,7 @@ import edu.harvard.iq.dataverse.actionlogging.ActionLogServiceBean; import edu.harvard.iq.dataverse.authorization.users.GuestUser; import edu.harvard.iq.dataverse.authorization.users.User; +import edu.harvard.iq.dataverse.util.SessionUtil; import edu.harvard.iq.dataverse.util.SystemConfig; import java.io.IOException; import java.io.Serializable; @@ -61,7 +62,8 @@ public void setUser(User aUser) { logSvc.log( new ActionLogRecord(ActionLogRecord.ActionType.SessionManagement,(aUser==null) ? "logout" : "login") .setUserIdentifier((aUser!=null) ? aUser.getIdentifier() : (user!=null ? user.getIdentifier() : "") )); - + //#3254 - change session id when user changes + SessionUtil.changeSessionId((HttpServletRequest) FacesContext.getCurrentInstance().getExternalContext().getRequest()); this.user = aUser; } diff --git a/src/main/java/edu/harvard/iq/dataverse/EditDatafilesPage.java b/src/main/java/edu/harvard/iq/dataverse/EditDatafilesPage.java index f8663087219..5ca7e4df502 100644 --- a/src/main/java/edu/harvard/iq/dataverse/EditDatafilesPage.java +++ b/src/main/java/edu/harvard/iq/dataverse/EditDatafilesPage.java @@ -84,7 +84,6 @@ public class EditDatafilesPage implements java.io.Serializable { private static final Logger logger = Logger.getLogger(EditDatafilesPage.class.getCanonicalName()); - private boolean uploadWarningMessageIsNotAnError; public enum FileEditMode { @@ -142,6 +141,8 @@ public enum FileEditMode { private Long versionId; private List newFiles = new ArrayList<>(); private List uploadedFiles = new ArrayList<>(); + private List uploadedInThisProcess = new ArrayList<>(); + private DatasetVersion workingVersion; private DatasetVersion clone; private String dropBoxSelection = ""; @@ -869,10 +870,25 @@ private String getBundleString(String msgName){ public void deleteFilesCompleted(){ } - - public void deleteFiles() { + + public void deleteFiles(){ + deleteFiles(this.selectedFiles); + } + + public void deleteDuplicateFiles(){ + List filesForDelete = new ArrayList(); + for(DataFile df : newFiles ){ + if (df.isMarkedAsDuplicate()){ + filesForDelete.add(df.getFileMetadata()); + } + } + deleteFiles(filesForDelete); + } + + + private void deleteFiles(List filesForDelete) { logger.fine("entering bulk file delete (EditDataFilesPage)"); - if (isFileReplaceOperation()){ + if (isFileReplaceOperation()) { try { deleteReplacementFile(); } catch (FileReplaceException ex) { @@ -880,10 +896,17 @@ public void deleteFiles() { } return; } - + + /* + If selected files are empty it means that we are dealing + with a duplicate files delete situation + so we are 
adding the marked as dup files as selected + and moving on accordingly. + */ + String fileNames = null; - for (FileMetadata fmd : this.getSelectedFiles()) { - // collect the names of the files, + for (FileMetadata fmd : filesForDelete) { + // collect the names of the files, // to show in the success message: if (fileNames == null) { fileNames = fmd.getLabel(); @@ -892,29 +915,29 @@ public void deleteFiles() { } } - for (FileMetadata markedForDelete : this.getSelectedFiles()) { - logger.fine("delete requested on file "+markedForDelete.getLabel()); - logger.fine("file metadata id: "+markedForDelete.getId()); - logger.fine("datafile id: "+markedForDelete.getDataFile().getId()); - logger.fine("page is in edit mode "+mode.name()); - - // has this filemetadata been saved already? (or is it a brand new - // filemetadata, created as part of a brand new version, created when - // the user clicked 'delete', that hasn't been saved in the db yet?) - if (markedForDelete.getId() != null) { - logger.fine("this is a filemetadata from an existing draft version"); + for (FileMetadata markedForDelete : filesForDelete) { + logger.fine("delete requested on file " + markedForDelete.getLabel()); + logger.fine("file metadata id: " + markedForDelete.getId()); + logger.fine("datafile id: " + markedForDelete.getDataFile().getId()); + logger.fine("page is in edit mode " + mode.name()); + + // has this filemetadata been saved already? (or is it a brand new + // filemetadata, created as part of a brand new version, created when + // the user clicked 'delete', that hasn't been saved in the db yet?) + if (markedForDelete.getId() != null) { + logger.fine("this is a filemetadata from an existing draft version"); // so all we remove is the file from the fileMetadatas (from the // file metadatas attached to the editVersion, and from the // display list of file metadatas that are being edited) // and let the delete be handled in the command (by adding it to the // filesToBeDeleted list): - dataset.getEditVersion().getFileMetadatas().remove(markedForDelete); - fileMetadatas.remove(markedForDelete); - filesToBeDeleted.add(markedForDelete); - } else { - logger.fine("this is a brand-new (unsaved) filemetadata"); - // ok, this is a brand-new DRAFT version. + dataset.getEditVersion().getFileMetadatas().remove(markedForDelete); + fileMetadatas.remove(markedForDelete); + filesToBeDeleted.add(markedForDelete); + } else { + logger.fine("this is a brand-new (unsaved) filemetadata"); + // ok, this is a brand-new DRAFT version. // if (mode != FileEditMode.CREATE) { // If the bean is in the 'CREATE' mode, the page is using @@ -922,38 +945,48 @@ public void deleteFiles() { // so there's no need to delete this meta from the local // fileMetadatas list. (but doing both just adds a no-op and won't cause an // error) - - // 1. delete the filemetadata from the local display list: + // 1. delete the filemetadata from the local display list: removeFileMetadataFromList(fileMetadatas, markedForDelete); - // 2. delete the filemetadata from the version: + // 2. delete the filemetadata from the version: removeFileMetadataFromList(dataset.getEditVersion().getFileMetadatas(), markedForDelete); - } - + } if (markedForDelete.getDataFile().getId() == null) { logger.fine("this is a brand new file."); // the file was just added during this step, so in addition to // removing it from the fileMetadatas lists (above), we also remove it from // the newFiles list and the dataset's files, so it never gets saved. 
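// The physical temp file is cleaned up as well, via FileUtil.deleteTempFile below,
// so nothing is left behind on disk.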
- + removeDataFileFromList(dataset.getFiles(), markedForDelete.getDataFile()); removeDataFileFromList(newFiles, markedForDelete.getDataFile()); FileUtil.deleteTempFile(markedForDelete.getDataFile(), dataset, ingestService); // Also remove checksum from the list of newly uploaded checksums (perhaps odd // to delete and then try uploading the same file again, but it seems like it // should be allowed/the checksum list is part of the state to clean-up - checksumMapNew.remove(markedForDelete.getDataFile().getChecksumValue()); - - } - } + if(checksumMapNew != null && markedForDelete.getDataFile().getChecksumValue() != null) + checksumMapNew.remove(markedForDelete.getDataFile().getChecksumValue()); + + } + } + if (fileNames != null) { - String successMessage = getBundleString("file.deleted.success"); + String successMessage; + if (mode == FileEditMode.UPLOAD) { + if (fileNames.contains(", ")) { + successMessage = getBundleString("file.deleted.upload.success.multiple"); + } else { + successMessage = getBundleString("file.deleted.upload.success.single"); + } + } else { + successMessage = getBundleString("file.deleted.success"); + successMessage = successMessage.replace("{0}", fileNames); + } logger.fine(successMessage); - successMessage = successMessage.replace("{0}", fileNames); JsfHelper.addFlashMessage(successMessage); - } - } - + } + } + + private void removeFileMetadataFromList(List fmds, FileMetadata fmToDelete) { Iterator fmit = fmds.iterator(); while (fmit.hasNext()) { @@ -1032,7 +1065,8 @@ public String saveReplacementFile() throws FileReplaceException{ } public String save() { - Collection duplicates = IngestUtil.findDuplicateFilenames(workingVersion); + + Collection duplicates = IngestUtil.findDuplicateFilenames(workingVersion, newFiles); if (!duplicates.isEmpty()) { JH.addMessage(FacesMessage.SEVERITY_ERROR, BundleUtil.getStringFromBundle("dataset.message.filesFailure"), BundleUtil.getStringFromBundle("dataset.message.editMetadata.duplicateFilenames", new ArrayList<>(duplicates))); return null; @@ -1737,6 +1771,7 @@ public void uploadFinished() { newFiles.add(dataFile); } + if(uploadInProgress.isTrue()) { uploadedFiles.clear(); @@ -1745,44 +1780,104 @@ public void uploadFinished() { // refresh the warning message below the upload component, if exists: if (uploadComponentId != null) { if (uploadWarningMessage != null) { - if (uploadWarningMessageIsNotAnError) { - FacesContext.getCurrentInstance().addMessage(uploadComponentId, new FacesMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("dataset.file.uploadWarning"), uploadWarningMessage)); - } else { - FacesContext.getCurrentInstance().addMessage(uploadComponentId, new FacesMessage(FacesMessage.SEVERITY_ERROR, BundleUtil.getStringFromBundle("dataset.file.uploadWarning"), uploadWarningMessage)); + if (existingFilesWithDupeContent != null || newlyUploadedFilesWithDupeContent != null) { + setWarningMessageForAlreadyExistsPopUp(uploadWarningMessage); + setHeaderForAlreadyExistsPopUp(); + setLabelForDeleteFilesPopup(); + PrimeFaces.current().ajax().update("datasetForm:fileAlreadyExistsPopup"); + PrimeFaces.current().executeScript("PF('fileAlreadyExistsPopup').show();"); } + + + //taking this out for now based on design feedback 7/8/2020 + // FacesContext.getCurrentInstance().addMessage(uploadComponentId, new FacesMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("dataset.file.uploadWarning"), uploadWarningMessage)); + } else if (uploadSuccessMessage != null) { 
FacesContext.getCurrentInstance().addMessage(uploadComponentId, new FacesMessage(FacesMessage.SEVERITY_INFO, BundleUtil.getStringFromBundle("dataset.file.uploadWorked"), uploadSuccessMessage)); } } - if(isFileReplaceOperation() && fileReplacePageHelper.hasContentTypeWarning()){ + if(isFileReplaceOperation() && fileReplacePageHelper.wasPhase1Successful() && fileReplacePageHelper.hasContentTypeWarning()){ //RequestContext context = RequestContext.getCurrentInstance(); //RequestContext.getCurrentInstance().update("datasetForm:fileTypeDifferentPopup"); PrimeFaces.current().ajax().update("datasetForm:fileTypeDifferentPopup"); //context.execute("PF('fileTypeDifferentPopup').show();"); PrimeFaces.current().executeScript("PF('fileTypeDifferentPopup').show();"); } - + + if(isFileReplaceOperation() && fileReplacePageHelper.getAddReplaceFileHelper().isDuplicateFileErrorFound() ) { + FacesContext.getCurrentInstance().addMessage(uploadComponentId, new FacesMessage(FacesMessage.SEVERITY_ERROR, fileReplacePageHelper.getAddReplaceFileHelper().getDuplicateFileErrorString(), fileReplacePageHelper.getAddReplaceFileHelper().getDuplicateFileErrorString())); + } + + if (isFileReplaceOperation() && !fileReplacePageHelper.getAddReplaceFileHelper().isDuplicateFileErrorFound() && fileReplacePageHelper.getAddReplaceFileHelper().isDuplicateFileWarningFound()) { + setWarningMessageForAlreadyExistsPopUp(fileReplacePageHelper.getAddReplaceFileHelper().getDuplicateFileWarningString()); + setHeaderForAlreadyExistsPopUp(); + setLabelForDeleteFilesPopup(); + PrimeFaces.current().ajax().update("datasetForm:fileAlreadyExistsPopup"); + PrimeFaces.current().executeScript("PF('fileAlreadyExistsPopup').show();"); + } // We clear the following duplicate warning labels, because we want to // only inform the user of the duplicates dropped in the current upload // attempt - for ex., one batch of drag-and-dropped files, or a single // file uploaded through the file chooser. 
- dupeFileNamesExisting = null; - dupeFileNamesNew = null; + newlyUploadedFilesWithDupeContent = null; + existingFilesWithDupeContent = null; multipleDupesExisting = false; multipleDupesNew = false; uploadWarningMessage = null; uploadSuccessMessage = null; } - private String warningMessageForPopUp; + private String warningMessageForFileTypeDifferentPopUp; - public String getWarningMessageForPopUp() { - return warningMessageForPopUp; + public String getWarningMessageForFileTypeDifferentPopUp() { + return warningMessageForFileTypeDifferentPopUp; } - public void setWarningMessageForPopUp(String warningMessageForPopUp) { - this.warningMessageForPopUp = warningMessageForPopUp; + public void setWarningMessageForFileTypeDifferentPopUp(String warningMessageForPopUp) { + this.warningMessageForFileTypeDifferentPopUp = warningMessageForPopUp; + } + + private String warningMessageForAlreadyExistsPopUp; + + public String getWarningMessageForAlreadyExistsPopUp() { + return warningMessageForAlreadyExistsPopUp; + } + + public void setWarningMessageForAlreadyExistsPopUp(String warningMessageForAlreadyExistsPopUp) { + this.warningMessageForAlreadyExistsPopUp = warningMessageForAlreadyExistsPopUp; + } + + private String headerForAlreadyExistsPopUp; + + public String getHeaderForAlreadyExistsPopUp() { + return headerForAlreadyExistsPopUp; + } + + public void setHeaderForAlreadyExistsPopUp(String headerForAlreadyExistsPopUp) { + this.headerForAlreadyExistsPopUp = headerForAlreadyExistsPopUp; + } + + private String labelForDeleteFilesPopup; + + public String getLabelForDeleteFilesPopup() { + return labelForDeleteFilesPopup; + } + + public void setLabelForDeleteFilesPopup(String labelForDeleteFilesPopup) { + this.labelForDeleteFilesPopup = labelForDeleteFilesPopup; + } + + public void setLabelForDeleteFilesPopup() { + this.labelForDeleteFilesPopup = ((multipleDupesExisting|| multipleDupesNew) ? BundleUtil.getStringFromBundle("file.delete.duplicate.multiple") : + BundleUtil.getStringFromBundle("file.delete.duplicate.single")); + } + + //((multipleDupesExisting|| multipleDupesNew) ? BundleUtil.getStringFromBundle("file.addreplace.already_exists.header.multiple"): BundleUtil.getStringFromBundle("file.addreplace.already_exists.header")); + + public void setHeaderForAlreadyExistsPopUp() { + + this.headerForAlreadyExistsPopUp = ((multipleDupesExisting|| multipleDupesNew) ? 
BundleUtil.getStringFromBundle("file.addreplace.already_exists.header.multiple"): BundleUtil.getStringFromBundle("file.addreplace.already_exists.header")); } private void handleReplaceFileUpload(FacesEvent event, InputStream inputStream, @@ -1810,7 +1905,7 @@ private void handleReplaceFileUpload(FacesEvent event, InputStream inputStream, */ if (fileReplacePageHelper.hasContentTypeWarning()){ //Add warning to popup instead of page for Content Type Difference - setWarningMessageForPopUp(fileReplacePageHelper.getContentTypeWarning()); + setWarningMessageForFileTypeDifferentPopUp(fileReplacePageHelper.getContentTypeWarning()); /* Note on the info messages - upload errors, warnings and success messages: Instead of trying to display the message here (commented out code below), @@ -1874,7 +1969,7 @@ private void handleReplaceFileUpload(String fullStorageLocation, */ if (fileReplacePageHelper.hasContentTypeWarning()){ //Add warning to popup instead of page for Content Type Difference - setWarningMessageForPopUp(fileReplacePageHelper.getContentTypeWarning()); + setWarningMessageForFileTypeDifferentPopUp(fileReplacePageHelper.getContentTypeWarning()); } } else { uploadWarningMessage = fileReplacePageHelper.getErrorMessages(); @@ -1895,6 +1990,14 @@ public void handleFileUpload(FileUploadEvent event) throws IOException { if (uploadInProgress.isFalse()) { uploadInProgress.setValue(true); } + + //resetting marked as dup in case there are multiple uploads + //we only want to delete as dupes those that we uploaded in this + //session + + newFiles.forEach((df) -> { + df.setMarkedAsDuplicate(false); + }); if (event == null){ throw new NullPointerException("event cannot be null"); @@ -1916,13 +2019,24 @@ public void handleFileUpload(FileUploadEvent event) throws IOException { uFile.getContentType(), event, null); - if(fileReplacePageHelper.hasContentTypeWarning()){ + if( fileReplacePageHelper.wasPhase1Successful() && fileReplacePageHelper.hasContentTypeWarning()){ + //RequestContext context = RequestContext.getCurrentInstance(); + //RequestContext.getCurrentInstance().update("datasetForm:fileTypeDifferentPopup"); + //context.execute("PF('fileTypeDifferentPopup').show();"); + PrimeFaces.current().ajax().update("datasetForm:fileTypeDifferentPopup"); + PrimeFaces.current().executeScript("PF('fileTypeDifferentPopup').show();"); + } + /* + + + if(fileReplacePageHelper.){ //RequestContext context = RequestContext.getCurrentInstance(); //RequestContext.getCurrentInstance().update("datasetForm:fileTypeDifferentPopup"); //context.execute("PF('fileTypeDifferentPopup').show();"); PrimeFaces.current().ajax().update("datasetForm:fileTypeDifferentPopup"); PrimeFaces.current().executeScript("PF('fileTypeDifferentPopup').show();"); } + */ return; } @@ -2079,15 +2193,69 @@ public void handleExternalUpload() { * @param dFileList */ - private String dupeFileNamesExisting = null; - private String dupeFileNamesNew = null; + private String existingFilesWithDupeContent = null; + private String uploadedFilesWithDupeContentToExisting = null; + private String uploadedFilesWithDupeContentToNewlyUploaded = null; + private String newlyUploadedFilesWithDupeContent = null; + private boolean multipleDupesExisting = false; private boolean multipleDupesNew = false; + public String getExistingFilesWithDupeContent() { + return existingFilesWithDupeContent; + } + + public void setExistingFilesWithDupeContent(String existingFilesWithDupeContent) { + this.existingFilesWithDupeContent = existingFilesWithDupeContent; + } + + public String 
getUploadedFilesWithDupeContentToExisting() { + return uploadedFilesWithDupeContentToExisting; + } + + public void setUploadedFilesWithDupeContentToExisting(String uploadedFilesWithDupeContentToExisting) { + this.uploadedFilesWithDupeContentToExisting = uploadedFilesWithDupeContentToExisting; + } + + public String getUploadedFilesWithDupeContentToNewlyUploaded() { + return uploadedFilesWithDupeContentToNewlyUploaded; + } + + public void setUploadedFilesWithDupeContentToNewlyUploaded(String uploadedFilesWithDupeContentToNewlyUploaded) { + this.uploadedFilesWithDupeContentToNewlyUploaded = uploadedFilesWithDupeContentToNewlyUploaded; + } + + public String getNewlyUploadedFilesWithDupeContent() { + return newlyUploadedFilesWithDupeContent; + } + + public void setNewlyUploadedFilesWithDupeContent(String newlyUploadedFilesWithDupeContent) { + this.newlyUploadedFilesWithDupeContent = newlyUploadedFilesWithDupeContent; + } + + + public boolean isMultipleDupesExisting() { + return multipleDupesExisting; + } + + public void setMultipleDupesExisting(boolean multipleDupesExisting) { + this.multipleDupesExisting = multipleDupesExisting; + } + + public boolean isMultipleDupesNew() { + return multipleDupesNew; + } + + public void setMultipleDupesNew(boolean multipleDupesNew) { + this.multipleDupesNew = multipleDupesNew; + } + private String processUploadedFileList(List dFileList) { if (dFileList == null) { return null; } + + uploadedInThisProcess = new ArrayList(); DataFile dataFile; String warningMessage = null; @@ -2119,39 +2287,71 @@ private String processUploadedFileList(List dFileList) { // or if another file with the same checksum has already been // uploaded. // ----------------------------------------------------------- + if (isFileAlreadyInDataset(dataFile)) { - if (dupeFileNamesExisting == null) { - dupeFileNamesExisting = dataFile.getFileMetadata().getLabel(); + DataFile existingFile = fileAlreadyExists.get(dataFile); + + // String alreadyExists = dataFile.getFileMetadata().getLabel() + " at " + existingFile.getDirectoryLabel() != null ? 
existingFile.getDirectoryLabel() + "/" + existingFile.getDisplayName() : existingFile.getDisplayName(); + String uploadedDuplicateFileName = dataFile.getFileMetadata().getLabel(); + String existingFileName = existingFile.getDisplayName(); + List args = Arrays.asList(existingFileName); + String inLineMessage = BundleUtil.getStringFromBundle("dataset.file.inline.message", args); + + if (existingFilesWithDupeContent == null) { + existingFilesWithDupeContent = existingFileName; + uploadedFilesWithDupeContentToExisting = uploadedDuplicateFileName; } else { - dupeFileNamesExisting = dupeFileNamesExisting.concat(", " + dataFile.getFileMetadata().getLabel()); + existingFilesWithDupeContent = existingFilesWithDupeContent.concat(", " + existingFileName); + uploadedFilesWithDupeContentToExisting = uploadedFilesWithDupeContentToExisting.concat(", " + uploadedDuplicateFileName); multipleDupesExisting = true; } - // remove temp file - FileUtil.deleteTempFile(dataFile, dataset, ingestService); + //now we are marking as duplicate and + //allowing the user to decide whether to delete + // deleteTempFile(dataFile); + dataFile.setMarkedAsDuplicate(true); + dataFile.setDuplicateFilename(inLineMessage); + } else if (isFileAlreadyUploaded(dataFile)) { - if (dupeFileNamesNew == null) { - dupeFileNamesNew = dataFile.getFileMetadata().getLabel(); + DataFile existingFile = checksumMapNew.get(dataFile.getChecksumValue()); + String alreadyUploadedWithSame = existingFile.getDisplayName(); + String newlyUploadedDupe = dataFile.getFileMetadata().getLabel(); + if (newlyUploadedFilesWithDupeContent == null) { + newlyUploadedFilesWithDupeContent = newlyUploadedDupe; + uploadedFilesWithDupeContentToNewlyUploaded = alreadyUploadedWithSame; } else { - dupeFileNamesNew = dupeFileNamesNew.concat(", " + dataFile.getFileMetadata().getLabel()); + newlyUploadedFilesWithDupeContent = newlyUploadedFilesWithDupeContent.concat(", " + newlyUploadedDupe); + uploadedFilesWithDupeContentToNewlyUploaded = uploadedFilesWithDupeContentToNewlyUploaded.concat(", " + alreadyUploadedWithSame); multipleDupesNew = true; } - // remove temp file - FileUtil.deleteTempFile(dataFile, dataset, ingestService); + //now we are marking as duplicate and + //allowing the user to decide whether to delete + dataFile.setMarkedAsDuplicate(true); + List args = Arrays.asList(existingFile.getDisplayName()); + String inLineMessage = BundleUtil.getStringFromBundle("dataset.file.inline.message", args); + dataFile.setDuplicateFilename(inLineMessage); } else { // OK, this one is not a duplicate, we want it. // But let's check if its filename is a duplicate of another // file already uploaded, or already in the dataset: + /* dataFile.getFileMetadata().setLabel(duplicateFilenameCheck(dataFile.getFileMetadata())); if (isTemporaryPreviewAvailable(dataFile.getStorageIdentifier(), dataFile.getContentType())) { dataFile.setPreviewImageAvailable(true); } uploadedFiles.add(dataFile); + */ // We are NOT adding the fileMetadata to the list that is being used // to render the page; we'll do that once we know that all the individual uploads // in this batch (as in, a bunch of drag-and-dropped files) have finished. 
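// Note that the duplicateFilenameCheck() and uploadedFiles bookkeeping below now run
// for every uploaded file, duplicates included, since duplicates are kept (and merely
// flagged) until the user decides whether to delete them.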
//fileMetadatas.add(dataFile.getFileMetadata()); } - + + dataFile.getFileMetadata().setLabel(duplicateFilenameCheck(dataFile.getFileMetadata())); + if (isTemporaryPreviewAvailable(dataFile.getStorageIdentifier(), dataFile.getContentType())) { + dataFile.setPreviewImageAvailable(true); + } + uploadedFiles.add(dataFile); + uploadedInThisProcess.add(dataFile); /* preserved old, pre 4.6 code - mainly as an illustration of how we used to do this. @@ -2194,32 +2394,35 @@ private String processUploadedFileList(List dFileList) { // (note the separate messages for the files already in the dataset, // and the newly uploaded ones) // ----------------------------------------------------------- - if (dupeFileNamesExisting != null) { + if (existingFilesWithDupeContent != null) { String duplicateFilesErrorMessage = null; - if (multipleDupesExisting) { - duplicateFilesErrorMessage = getBundleString("dataset.files.exist") + dupeFileNamesExisting + getBundleString("dataset.file.skip"); + List args = Arrays.asList(uploadedFilesWithDupeContentToExisting, existingFilesWithDupeContent); + + if (multipleDupesExisting) { + duplicateFilesErrorMessage = BundleUtil.getStringFromBundle("dataset.files.exist", args); } else { - duplicateFilesErrorMessage = getBundleString("dataset.file.exist") + dupeFileNamesExisting; + duplicateFilesErrorMessage = BundleUtil.getStringFromBundle("dataset.file.exist", args); } if (warningMessage == null) { warningMessage = duplicateFilesErrorMessage; } else { - warningMessage = warningMessage.concat("; " + duplicateFilesErrorMessage); + warningMessage = warningMessage.concat(" " + duplicateFilesErrorMessage); } } - if (dupeFileNamesNew != null) { + if (newlyUploadedFilesWithDupeContent != null) { String duplicateFilesErrorMessage = null; - if (multipleDupesNew) { - duplicateFilesErrorMessage = getBundleString("dataset.files.duplicate") + dupeFileNamesNew + getBundleString("dataset.file.skip"); + List args = Arrays.asList(newlyUploadedFilesWithDupeContent, uploadedFilesWithDupeContentToNewlyUploaded); + + if (multipleDupesNew) { + duplicateFilesErrorMessage = BundleUtil.getStringFromBundle("dataset.files.duplicate", args); } else { - duplicateFilesErrorMessage = getBundleString("dataset.file.duplicate") + dupeFileNamesNew + getBundleString("dataset.file.skip"); + duplicateFilesErrorMessage = BundleUtil.getStringFromBundle("dataset.file.duplicate", args); } - if (warningMessage == null) { warningMessage = duplicateFilesErrorMessage; } else { - warningMessage = warningMessage.concat("; " + duplicateFilesErrorMessage); + warningMessage = warningMessage.concat(" " + duplicateFilesErrorMessage); } } @@ -2289,8 +2492,9 @@ private String duplicateFilenameCheck(FileMetadata fileMetadata) { return IngestUtil.duplicateFilenameCheck(fileMetadata, fileLabelsExisting); } - private Map checksumMapOld = null; // checksums of the files already in the dataset - private Map checksumMapNew = null; // checksums of the new files already uploaded + private Map checksumMapOld = null; // checksums of the files already in the dataset + private Map checksumMapNew = null; // checksums of the new files already uploaded + private Map fileAlreadyExists = null; private void initChecksumMap() { checksumMapOld = new HashMap<>(); @@ -2302,7 +2506,7 @@ private void initChecksumMap() { if (fm.getDataFile() != null && fm.getDataFile().getId() != null) { String chksum = fm.getDataFile().getChecksumValue(); if (chksum != null) { - checksumMapOld.put(chksum, 1); + checksumMapOld.put(chksum, fm.getDataFile()); } } @@ -2315,28 
+2519,28 @@ private boolean isFileAlreadyInDataset(DataFile dataFile) { initChecksumMap(); } + if (fileAlreadyExists == null) { + fileAlreadyExists = new HashMap<>(); + } + + String chksum = dataFile.getChecksumValue(); + if(checksumMapOld.get(chksum) != null){ + fileAlreadyExists.put(dataFile, checksumMapOld.get(chksum)); + } + return chksum == null ? false : checksumMapOld.get(chksum) != null; } private boolean isFileAlreadyUploaded(DataFile dataFile) { + if (checksumMapNew == null) { checksumMapNew = new HashMap<>(); } + + return FileUtil.isFileAlreadyUploaded(dataFile, checksumMapNew, fileAlreadyExists); - String chksum = dataFile.getChecksumValue(); - - if (chksum == null) { - return false; - } - - if (checksumMapNew.get(chksum) != null) { - return true; - } - - checksumMapNew.put(chksum, 1); - return false; } diff --git a/src/main/java/edu/harvard/iq/dataverse/FileMetadata.java b/src/main/java/edu/harvard/iq/dataverse/FileMetadata.java index 90b5562e8fd..99730a3a024 100644 --- a/src/main/java/edu/harvard/iq/dataverse/FileMetadata.java +++ b/src/main/java/edu/harvard/iq/dataverse/FileMetadata.java @@ -472,6 +472,7 @@ public void setSelected(boolean selected) { this.selected = selected; } + @Transient private boolean restrictedUI; diff --git a/src/main/java/edu/harvard/iq/dataverse/LoginPage.java b/src/main/java/edu/harvard/iq/dataverse/LoginPage.java index 8067e1f150f..38376fa84c0 100644 --- a/src/main/java/edu/harvard/iq/dataverse/LoginPage.java +++ b/src/main/java/edu/harvard/iq/dataverse/LoginPage.java @@ -13,6 +13,8 @@ import edu.harvard.iq.dataverse.settings.SettingsServiceBean; import edu.harvard.iq.dataverse.util.BundleUtil; import edu.harvard.iq.dataverse.util.JsfHelper; +import edu.harvard.iq.dataverse.util.SessionUtil; + import static edu.harvard.iq.dataverse.util.JsfHelper.JH; import edu.harvard.iq.dataverse.util.SystemConfig; import java.io.UnsupportedEncodingException; @@ -29,6 +31,7 @@ import javax.faces.view.ViewScoped; import javax.inject.Inject; import javax.inject.Named; +import javax.servlet.http.HttpServletRequest; /** * @@ -169,7 +172,6 @@ public String login() { logger.log(Level.FINE, "User authenticated: {0}", r.getEmail()); session.setUser(r); session.configureSessionTimeout(); - if ("dataverse.xhtml".equals(redirectPage)) { redirectPage = redirectToRoot(); } diff --git a/src/main/java/edu/harvard/iq/dataverse/MailServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/MailServiceBean.java index 0397d795798..0ea36dfdc7a 100644 --- a/src/main/java/edu/harvard/iq/dataverse/MailServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/MailServiceBean.java @@ -481,6 +481,13 @@ public String getMessageTextBasedOnNotification(UserNotification userNotificatio version.getDataset().getOwner().getDisplayName(), getDataverseLink(version.getDataset().getOwner())}; messageText += MessageFormat.format(pattern, paramArrayPublishedDataset); return messageText; + case PUBLISHFAILED_PIDREG: + version = (DatasetVersion) targetObject; + pattern = BundleUtil.getStringFromBundle("notification.email.publishFailedPidReg"); + String[] paramArrayPublishFailedDatasetPidReg = {version.getDataset().getDisplayName(), getDatasetLink(version.getDataset()), + version.getDataset().getOwner().getDisplayName(), getDataverseLink(version.getDataset().getOwner())}; + messageText += MessageFormat.format(pattern, paramArrayPublishFailedDatasetPidReg); + return messageText; case RETURNEDDS: version = (DatasetVersion) targetObject; pattern = 
BundleUtil.getStringFromBundle("notification.email.wasReturnedByReviewer"); @@ -597,6 +604,7 @@ private Object getObjectOfNotification (UserNotification userNotification){ case CREATEDS: case SUBMITTEDDS: case PUBLISHEDDS: + case PUBLISHFAILED_PIDREG: case RETURNEDDS: return versionService.find(userNotification.getObjectId()); case CREATEACC: diff --git a/src/main/java/edu/harvard/iq/dataverse/UserNotification.java b/src/main/java/edu/harvard/iq/dataverse/UserNotification.java index 70b9cacf4e3..d404d82c950 100644 --- a/src/main/java/edu/harvard/iq/dataverse/UserNotification.java +++ b/src/main/java/edu/harvard/iq/dataverse/UserNotification.java @@ -27,7 +27,7 @@ public class UserNotification implements Serializable { public enum Type { - ASSIGNROLE, REVOKEROLE, CREATEDV, CREATEDS, CREATEACC, MAPLAYERUPDATED, MAPLAYERDELETEFAILED, SUBMITTEDDS, RETURNEDDS, PUBLISHEDDS, REQUESTFILEACCESS, GRANTFILEACCESS, REJECTFILEACCESS, FILESYSTEMIMPORT, CHECKSUMIMPORT, CHECKSUMFAIL, CONFIRMEMAIL, APIGENERATED, INGESTCOMPLETED, INGESTCOMPLETEDWITHERRORS + ASSIGNROLE, REVOKEROLE, CREATEDV, CREATEDS, CREATEACC, MAPLAYERUPDATED, MAPLAYERDELETEFAILED, SUBMITTEDDS, RETURNEDDS, PUBLISHEDDS, REQUESTFILEACCESS, GRANTFILEACCESS, REJECTFILEACCESS, FILESYSTEMIMPORT, CHECKSUMIMPORT, CHECKSUMFAIL, CONFIRMEMAIL, APIGENERATED, INGESTCOMPLETED, INGESTCOMPLETEDWITHERRORS, PUBLISHFAILED_PIDREG }; private static final long serialVersionUID = 1L; diff --git a/src/main/java/edu/harvard/iq/dataverse/UserNotificationServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/UserNotificationServiceBean.java index 76e79af3049..071805d3d26 100644 --- a/src/main/java/edu/harvard/iq/dataverse/UserNotificationServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/UserNotificationServiceBean.java @@ -13,6 +13,8 @@ import java.util.logging.Logger; import javax.ejb.EJB; import javax.ejb.Stateless; +import javax.ejb.TransactionAttribute; +import javax.ejb.TransactionAttributeType; import javax.inject.Named; import javax.persistence.EntityManager; import javax.persistence.PersistenceContext; @@ -83,6 +85,11 @@ public void delete(UserNotification userNotification) { em.remove(em.merge(userNotification)); } + @TransactionAttribute(TransactionAttributeType.REQUIRES_NEW) + public void sendNotificationInNewTransaction(AuthenticatedUser dataverseUser, Timestamp sendDate, Type type, Long objectId) { + sendNotification(dataverseUser, sendDate, type, objectId, ""); + } + public void sendNotification(AuthenticatedUser dataverseUser, Timestamp sendDate, Type type, Long objectId) { sendNotification(dataverseUser, sendDate, type, objectId, ""); } @@ -106,10 +113,9 @@ public void sendNotification(AuthenticatedUser dataverseUser, Timestamp sendDate if (mailService.sendNotificationEmail(userNotification, comment, requestor, isHtmlContent)) { logger.fine("email was sent"); userNotification.setEmailed(true); - save(userNotification); } else { logger.fine("email was not sent"); - save(userNotification); } + save(userNotification); } } diff --git a/src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java b/src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java index 94613100b16..2a82acc7622 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java @@ -670,6 +670,15 @@ protected Response ok( String msg ) { .type(MediaType.APPLICATION_JSON) .build(); } + + protected Response ok( String msg, JsonObjectBuilder bld ) { + return 
Response.ok().entity(Json.createObjectBuilder() + .add("status", STATUS_OK) + .add("message", Json.createObjectBuilder().add("message",msg)) + .add("data", bld).build()) + .type(MediaType.APPLICATION_JSON) + .build(); + } protected Response ok( boolean value ) { return Response.ok().entity(Json.createObjectBuilder() diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Admin.java b/src/main/java/edu/harvard/iq/dataverse/api/Admin.java index 9eb5989962c..11947876cc3 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Admin.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Admin.java @@ -72,6 +72,7 @@ import java.util.List; import edu.harvard.iq.dataverse.authorization.AuthTestDataServiceBean; +import edu.harvard.iq.dataverse.authorization.AuthenticationProvidersRegistrationServiceBean; import edu.harvard.iq.dataverse.authorization.RoleAssignee; import edu.harvard.iq.dataverse.authorization.UserRecordIdentifier; import edu.harvard.iq.dataverse.authorization.groups.impl.explicit.ExplicitGroupServiceBean; @@ -114,6 +115,8 @@ public class Admin extends AbstractApiBean { private static final Logger logger = Logger.getLogger(Admin.class.getName()); + @EJB + AuthenticationProvidersRegistrationServiceBean authProvidersRegistrationSvc; @EJB BuiltinUserServiceBean builtinUserService; @EJB @@ -223,9 +226,9 @@ public Response addProvider(AuthenticationProviderRow row) { managed = row; } if (managed.isEnabled()) { - AuthenticationProvider provider = authSvc.loadProvider(managed); - authSvc.deregisterProvider(provider.getId()); - authSvc.registerProvider(provider); + AuthenticationProvider provider = authProvidersRegistrationSvc.loadProvider(managed); + authProvidersRegistrationSvc.deregisterProvider(provider.getId()); + authProvidersRegistrationSvc.registerProvider(provider); } return created("/api/admin/authenticationProviders/" + managed.getId(), json(managed)); } catch (AuthorizationSetupException e) { @@ -271,7 +274,7 @@ public Response enableAuthenticationProvider(@PathParam("id") String id, String return ok(String.format("Authentication provider '%s' already enabled", id)); } try { - authSvc.registerProvider(authSvc.loadProvider(row)); + authProvidersRegistrationSvc.registerProvider(authProvidersRegistrationSvc.loadProvider(row)); return ok(String.format("Authentication Provider %s enabled", row.getId())); } catch (AuthenticationProviderFactoryNotFoundException ex) { @@ -285,7 +288,7 @@ public Response enableAuthenticationProvider(@PathParam("id") String id, String } else { // disable a provider - authSvc.deregisterProvider(id); + authProvidersRegistrationSvc.deregisterProvider(id); return ok("Authentication Provider '" + id + "' disabled. " + (authSvc.getAuthenticationProviderIds().isEmpty() ? "WARNING: no enabled authentication providers left." 
@@ -309,7 +312,7 @@ public Response checkAuthenticationProviderEnabled(@PathParam("id") String id) { @DELETE @Path("authenticationProviders/{id}/") public Response deleteAuthenticationProvider(@PathParam("id") String id) { - authSvc.deregisterProvider(id); + authProvidersRegistrationSvc.deregisterProvider(id); AuthenticationProviderRow row = em.find(AuthenticationProviderRow.class, id); if (row != null) { em.remove(row); diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java b/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java index 61844ab6b9c..0c25fc4403f 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java @@ -1659,7 +1659,13 @@ public Response addFileToDataset(@PathParam("id") String idSupplied, * user. Human readable. */ logger.fine("successMsg: " + successMsg); - return ok(addFileHelper.getSuccessResultAsJsonObjectBuilder()); + String duplicateWarning = addFileHelper.getDuplicateFileWarning(); + if (duplicateWarning != null && !duplicateWarning.isEmpty()) { + return ok(addFileHelper.getDuplicateFileWarning(), addFileHelper.getSuccessResultAsJsonObjectBuilder()); + } else { + return ok(addFileHelper.getSuccessResultAsJsonObjectBuilder()); + } + //"Look at that! You added a file! (hey hey, it may have worked)"); } catch (NoFilesException ex) { Logger.getLogger(Files.class.getName()).log(Level.SEVERE, null, ex); diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Files.java b/src/main/java/edu/harvard/iq/dataverse/api/Files.java index 7759761f35e..81db7f9dec1 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Files.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Files.java @@ -372,15 +372,9 @@ public Response updateFileMetadata(@FormDataParam("jsonData") String jsonData, List fmdList = editVersion.getFileMetadatas(); for(FileMetadata testFmd : fmdList) { DataFile daf = testFmd.getDataFile(); - // Not sure I understand why we are comparing the checksum values here, - // and not the DataFile ids. (probably because this code was - // copy-and-pasted from somewhere where it was potentially operating - // on *new* datafiles, that haven't been saved in the database yet; - // but it should never be the case in the context of this API) - // -- L.A. Mar. 2020 - if(daf.getChecksumType().equals(df.getChecksumType()) - && daf.getChecksumValue().equals(df.getChecksumValue())) { - upFmd = testFmd; + if(daf.equals(df)){ + upFmd = testFmd; + break; } } @@ -413,7 +407,7 @@ public Response updateFileMetadata(@FormDataParam("jsonData") String jsonData, } catch (Exception e) { logger.log(Level.WARNING, "Dataset publication finalization: exception while exporting:{0}", e); - return error(Response.Status.INTERNAL_SERVER_ERROR, "Error adding metadata to DataFile" + e); + return error(Response.Status.INTERNAL_SERVER_ERROR, "Error adding metadata to DataFile: " + e); } } catch (WrappedResponse wr) { diff --git a/src/main/java/edu/harvard/iq/dataverse/authorization/AuthenticationProvidersRegistrationServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/authorization/AuthenticationProvidersRegistrationServiceBean.java new file mode 100644 index 00000000000..6289865baf0 --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/authorization/AuthenticationProvidersRegistrationServiceBean.java @@ -0,0 +1,269 @@ +/* + * To change this license header, choose License Headers in Project Properties. + * To change this template file, choose Tools | Templates + * and open the template in the editor. 
+ */ +package edu.harvard.iq.dataverse.authorization; + +import edu.harvard.iq.dataverse.actionlogging.ActionLogRecord; +import edu.harvard.iq.dataverse.actionlogging.ActionLogServiceBean; +import edu.harvard.iq.dataverse.authorization.exceptions.AuthenticationProviderFactoryNotFoundException; +import edu.harvard.iq.dataverse.authorization.exceptions.AuthorizationSetupException; +import edu.harvard.iq.dataverse.authorization.providers.AuthenticationProviderFactory; +import edu.harvard.iq.dataverse.authorization.providers.AuthenticationProviderRow; +import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinAuthenticationProviderFactory; +import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUserServiceBean; +import edu.harvard.iq.dataverse.authorization.providers.oauth2.AbstractOAuth2AuthenticationProvider; +import edu.harvard.iq.dataverse.authorization.providers.oauth2.OAuth2AuthenticationProviderFactory; +import edu.harvard.iq.dataverse.authorization.providers.oauth2.oidc.OIDCAuthenticationProviderFactory; +import edu.harvard.iq.dataverse.authorization.providers.shib.ShibAuthenticationProviderFactory; +import edu.harvard.iq.dataverse.validation.PasswordValidatorServiceBean; +import java.util.HashMap; +import java.util.Map; +import java.util.logging.Level; +import java.util.logging.Logger; +import javax.annotation.PostConstruct; +import javax.ejb.EJB; +import javax.ejb.Lock; +import static javax.ejb.LockType.READ; +import static javax.ejb.LockType.WRITE; +import javax.ejb.Singleton; +import javax.inject.Named; +import javax.persistence.EntityManager; +import javax.persistence.PersistenceContext; + +/** + * + * @author Leonid Andreev + */ +/** + * The AuthenticationProvidersRegistrationService is responsible for registering and listing + * AuthenticationProviders. There's a single instance per application. + * + * Register the providers in the {@link #startup()} method. + */ +@Named +@Lock(READ) +@Singleton +public class AuthenticationProvidersRegistrationServiceBean { + + private static final Logger logger = Logger.getLogger(AuthenticationProvidersRegistrationServiceBean.class.getName()); + + @EJB + BuiltinUserServiceBean builtinUserServiceBean; + + @EJB + PasswordValidatorServiceBean passwordValidatorService; + + @EJB + protected ActionLogServiceBean actionLogSvc; + + @EJB + AuthenticationServiceBean authenticationService; + + /** + * The maps below (the objects themselves) are "final", but the + * values will be populated in @PostConstruct (see below) during + * the initialization and in later calls to the service. + * This is a @Singleton, so we are guaranteed that there is only + * one application-wide copy of each of these maps. + */ + + /** + * Authentication Provider Factories: + */ + final Map providerFactories = new HashMap<>(); + + /** + * Where all registered authentication providers live. + */ + final Map authenticationProviders = new HashMap<>(); + + /** + * Index of all OAuth2 providers mapped to {@link #authenticationProviders}. + */ + final Map oAuth2authenticationProviders = new HashMap<>(); + + @PersistenceContext(unitName = "VDCNet-ejbPU") + private EntityManager em; + + // does this method also need an explicit @Lock(WRITE)? + // - I'm assuming not; since it's guaranteed to only be called once, + // via @PostConstruct in this @Singleton. -- L.A. 
+ @PostConstruct + public void startup() { + + // First, set up the factories + try { + // @todo: Instead of hard-coding the factories here, consider + // using @AutoService - similarly to how we are using it with the + // metadata Exporter classes. (may not necessarily be possible, or + // easy; hence "consider" -- L.A.) + registerProviderFactory( new BuiltinAuthenticationProviderFactory(builtinUserServiceBean, passwordValidatorService, authenticationService) ); + registerProviderFactory( new ShibAuthenticationProviderFactory() ); + registerProviderFactory( new OAuth2AuthenticationProviderFactory() ); + registerProviderFactory( new OIDCAuthenticationProviderFactory() ); + + } catch (AuthorizationSetupException ex) { + logger.log(Level.SEVERE, "Exception setting up the authentication provider factories: " + ex.getMessage(), ex); + } + + // Now, load the providers. + em.createNamedQuery("AuthenticationProviderRow.findAllEnabled", AuthenticationProviderRow.class) + .getResultList().forEach((row) -> { + try { + registerProvider( loadProvider(row) ); + + } catch ( AuthenticationProviderFactoryNotFoundException e ) { + logger.log(Level.SEVERE, "Cannot find authentication provider factory with alias '" + e.getFactoryAlias() + "'",e); + + } catch (AuthorizationSetupException ex) { + logger.log(Level.SEVERE, "Exception setting up the authentication provider '" + row.getId() + "': " + ex.getMessage(), ex); + } + }); + } + + private void registerProviderFactory(AuthenticationProviderFactory aFactory) + throws AuthorizationSetupException + { + if ( providerFactories.containsKey(aFactory.getAlias()) ) { + throw new AuthorizationSetupException( + "Duplicate alias " + aFactory.getAlias() + " for authentication provider factory."); + } + providerFactories.put( aFactory.getAlias(), aFactory); + logger.log( Level.FINE, "Registered Authentication Provider Factory {0} as {1}", + new Object[]{aFactory.getInfo(), aFactory.getAlias()}); + } + + /** + * Tries to load an {@link AuthenticationProvider} using the passed {@link AuthenticationProviderRow}. + * @param aRow The row to load the provider from. + * @return The provider, if successful + * @throws AuthenticationProviderFactoryNotFoundException If the row specifies a non-existent factory + * @throws AuthorizationSetupException If the factory failed to instantiate a provider from the row.
+ */ + @Lock(WRITE) + public AuthenticationProvider loadProvider( AuthenticationProviderRow aRow ) + throws AuthenticationProviderFactoryNotFoundException, AuthorizationSetupException { + AuthenticationProviderFactory fact = providerFactories.get((aRow.getFactoryAlias())); + + if ( fact == null ) throw new AuthenticationProviderFactoryNotFoundException(aRow.getFactoryAlias()); + + return fact.buildProvider(aRow); + } + + @Lock(WRITE) + public void registerProvider(AuthenticationProvider aProvider) throws AuthorizationSetupException { + if ( authenticationProviders.containsKey(aProvider.getId()) ) { + throw new AuthorizationSetupException( + "Duplicate id " + aProvider.getId() + " for authentication provider."); + } + authenticationProviders.put( aProvider.getId(), aProvider); + actionLogSvc.log( new ActionLogRecord(ActionLogRecord.ActionType.Auth, "registerProvider") + .setInfo(aProvider.getId() + ":" + aProvider.getInfo().getTitle())); + if ( aProvider instanceof AbstractOAuth2AuthenticationProvider ) { + oAuth2authenticationProviders.put(aProvider.getId(), (AbstractOAuth2AuthenticationProvider) aProvider); + } + } + + @Lock(READ) + public Map<String, AbstractOAuth2AuthenticationProvider> getOAuth2AuthProvidersMap() { + return oAuth2authenticationProviders; + } + + /* + the commented-out methods below were moved into this service in + the quick patch produced for 4.20; but have been modified and moved + back into AuthenticationServiceBean again for v5.0. -- L.A. + + @Lock(READ) + public AbstractOAuth2AuthenticationProvider getOAuth2Provider( String id ) { + return oAuth2authenticationProviders.get(id); + } + + @Lock(READ) + public Set<AbstractOAuth2AuthenticationProvider> getOAuth2Providers() { + return new HashSet<>(oAuth2authenticationProviders.values()); + }*/ + + @Lock(READ) + public Map<String, AuthenticationProvider> getAuthenticationProvidersMap() { + return authenticationProviders; + } + + @Lock(WRITE) + public void deregisterProvider( String id ) { + oAuth2authenticationProviders.remove( id ); + if ( authenticationProviders.remove(id) != null ) { + actionLogSvc.log( new ActionLogRecord(ActionLogRecord.ActionType.Auth, "deregisterProvider") + .setInfo(id)); + logger.log(Level.INFO,"Deregistered provider {0}", new Object[]{id}); + logger.log(Level.INFO,"Providers left {0}", new Object[]{authenticationProviders.values()}); + } + } + + /* + @Lock(READ) + public Set<String> getAuthenticationProviderIds() { + return authenticationProviders.keySet(); + } + + @Lock(READ) + public Collection<AuthenticationProvider> getAuthenticationProviders() { + return authenticationProviders.values(); + } + + @Lock(READ) + public Set<String> getAuthenticationProviderIdsOfType( Class aClass ) { + // @todo: remove this!
+ //logger.info("inside getAuthenticationProviderIdsOfType and sleeping for 20 seconds"); + //try { + // Thread.sleep(20000); + //} catch (Exception ex) { + // logger.warning("Failed to sleep for 20 seconds."); + //} + Set retVal = new TreeSet<>(); + for ( Map.Entry p : authenticationProviders.entrySet() ) { + if ( aClass.isAssignableFrom( p.getValue().getClass() ) ) { + retVal.add( p.getKey() ); + } + } + //logger.info("done with getAuthenticationProviderIdsOfType."); + return retVal; + } + */ + + @Lock(READ) + public Map getProviderFactoriesMap() { + return providerFactories; + } + + /* + @Lock(READ) + public AuthenticationProviderFactory getProviderFactory( String alias ) { + return providerFactories.get(alias); + } + + @Lock(READ) + public AuthenticationProvider getAuthenticationProvider( String id ) { + //logger.info("inside getAuthenticationProvider()"); + return authenticationProviders.get( id ); + } + + @Lock(READ) + public AuthenticationProvider lookupProvider( AuthenticatedUser user ) { + return authenticationProviders.get(user.getAuthenticatedUserLookup().getAuthenticationProviderId()); + } + + @Lock(READ) + public Set listProviderFactories() { + return new HashSet<>( providerFactories.values() ); + } + + @Lock(READ) + public boolean isOrcidEnabled() { + return oAuth2authenticationProviders.values().stream().anyMatch( s -> s.getId().toLowerCase().contains("orcid") ); + } + */ + +} diff --git a/src/main/java/edu/harvard/iq/dataverse/authorization/AuthenticationServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/authorization/AuthenticationServiceBean.java index 5e12eab54e8..593e2ef1e0e 100644 --- a/src/main/java/edu/harvard/iq/dataverse/authorization/AuthenticationServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/authorization/AuthenticationServiceBean.java @@ -53,7 +53,7 @@ import javax.annotation.PostConstruct; import javax.ejb.EJB; import javax.ejb.EJBException; -import javax.ejb.Singleton; +import javax.ejb.Stateless; import javax.inject.Named; import javax.persistence.EntityManager; import javax.persistence.NoResultException; @@ -67,27 +67,20 @@ import javax.validation.ValidatorFactory; /** - * The AuthenticationManager is responsible for registering and listing - * AuthenticationProviders. There's a single instance per application. + * AuthenticationService is for general authentication-related operations. + * It's no longer responsible for registering and listing + * AuthenticationProviders! A dedicated singleton has been created for that + * purpose - AuthenticationProvidersRegistrationServiceBean - and all the + * related code has been moved there. * - * Register the providers in the {@link #startup()} method. */ @Named -@Singleton +@Stateless public class AuthenticationServiceBean { private static final Logger logger = Logger.getLogger(AuthenticationServiceBean.class.getName()); - /** - * Where all registered authentication providers live. - */ - final Map authenticationProviders = new HashMap<>(); - - /** - * Index of all OAuth2 providers. They also live in {@link #authenticationProviders}. 
- */ - final Map oAuth2authenticationProviders = new HashMap<>(); - - final Map providerFactories = new HashMap<>(); + @EJB + AuthenticationProvidersRegistrationServiceBean authProvidersRegistrationService; @EJB BuiltinUserServiceBean builtinUserServiceBean; @@ -130,107 +123,27 @@ public class AuthenticationServiceBean { @PersistenceContext(unitName = "VDCNet-ejbPU") private EntityManager em; - - @PostConstruct - public void startup() { - // First, set up the factories - try { - registerProviderFactory( new BuiltinAuthenticationProviderFactory(builtinUserServiceBean, passwordValidatorService, this) ); - registerProviderFactory( new ShibAuthenticationProviderFactory() ); - registerProviderFactory( new OAuth2AuthenticationProviderFactory() ); - registerProviderFactory( new OIDCAuthenticationProviderFactory() ); - - } catch (AuthorizationSetupException ex) { - logger.log(Level.SEVERE, "Exception setting up the authentication provider factories: " + ex.getMessage(), ex); - } - // Now, load the providers. - em.createNamedQuery("AuthenticationProviderRow.findAllEnabled", AuthenticationProviderRow.class) - .getResultList().forEach((row) -> { - try { - registerProvider( loadProvider(row) ); - - } catch ( AuthenticationProviderFactoryNotFoundException e ) { - logger.log(Level.SEVERE, "Cannot find authentication provider factory with alias '" + e.getFactoryAlias() + "'",e); - - } catch (AuthorizationSetupException ex) { - logger.log(Level.SEVERE, "Exception setting up the authentication provider '" + row.getId() + "': " + ex.getMessage(), ex); - } - }); - } - - public void registerProviderFactory(AuthenticationProviderFactory aFactory) - throws AuthorizationSetupException - { - if ( providerFactories.containsKey(aFactory.getAlias()) ) { - throw new AuthorizationSetupException( - "Duplicate alias " + aFactory.getAlias() + " for authentication provider factory."); - } - providerFactories.put( aFactory.getAlias(), aFactory); - logger.log( Level.FINE, "Registered Authentication Provider Factory {0} as {1}", - new Object[]{aFactory.getInfo(), aFactory.getAlias()}); - } - - /** - * Tries to load and {@link AuthenticationProvider} using the passed {@link AuthenticationProviderRow}. - * @param aRow The row to load the provider from. - * @return The provider, if successful - * @throws AuthenticationProviderFactoryNotFoundException If the row specifies a non-existent factory - * @throws AuthorizationSetupException If the factory failed to instantiate a provider from the row. 
- */ - public AuthenticationProvider loadProvider( AuthenticationProviderRow aRow ) - throws AuthenticationProviderFactoryNotFoundException, AuthorizationSetupException { - AuthenticationProviderFactory fact = getProviderFactory(aRow.getFactoryAlias()); - - if ( fact == null ) throw new AuthenticationProviderFactoryNotFoundException(aRow.getFactoryAlias()); - - return fact.buildProvider(aRow); - } - - public void registerProvider(AuthenticationProvider aProvider) throws AuthorizationSetupException { - if ( authenticationProviders.containsKey(aProvider.getId()) ) { - throw new AuthorizationSetupException( - "Duplicate id " + aProvider.getId() + " for authentication provider."); - } - authenticationProviders.put( aProvider.getId(), aProvider); - actionLogSvc.log( new ActionLogRecord(ActionLogRecord.ActionType.Auth, "registerProvider") - .setInfo(aProvider.getId() + ":" + aProvider.getInfo().getTitle())); - if ( aProvider instanceof AbstractOAuth2AuthenticationProvider ) { - oAuth2authenticationProviders.put(aProvider.getId(), (AbstractOAuth2AuthenticationProvider) aProvider); - } - - } - public AbstractOAuth2AuthenticationProvider getOAuth2Provider( String id ) { - return oAuth2authenticationProviders.get(id); + return authProvidersRegistrationService.getOAuth2AuthProvidersMap().get(id); } public Set getOAuth2Providers() { - return new HashSet<>(oAuth2authenticationProviders.values()); - } - - public void deregisterProvider( String id ) { - oAuth2authenticationProviders.remove( id ); - if ( authenticationProviders.remove(id) != null ) { - actionLogSvc.log( new ActionLogRecord(ActionLogRecord.ActionType.Auth, "deregisterProvider") - .setInfo(id)); - logger.log(Level.INFO,"Deregistered provider {0}", new Object[]{id}); - logger.log(Level.INFO,"Providers left {0}", new Object[]{getAuthenticationProviderIds()}); - } + return new HashSet<>(authProvidersRegistrationService.getOAuth2AuthProvidersMap().values()); } public Set getAuthenticationProviderIds() { - return authenticationProviders.keySet(); + return authProvidersRegistrationService.getAuthenticationProvidersMap().keySet(); } public Collection getAuthenticationProviders() { - return authenticationProviders.values(); + return authProvidersRegistrationService.getAuthenticationProvidersMap().values(); } public Set getAuthenticationProviderIdsOfType( Class aClass ) { Set retVal = new TreeSet<>(); - for ( Map.Entry p : authenticationProviders.entrySet() ) { + for ( Map.Entry p : authProvidersRegistrationService.getAuthenticationProvidersMap().entrySet() ) { if ( aClass.isAssignableFrom( p.getValue().getClass() ) ) { retVal.add( p.getKey() ); } @@ -239,11 +152,11 @@ public Set getAuthenticationProviderI } public AuthenticationProviderFactory getProviderFactory( String alias ) { - return providerFactories.get(alias); + return authProvidersRegistrationService.getProviderFactoriesMap().get(alias); } public AuthenticationProvider getAuthenticationProvider( String id ) { - return authenticationProviders.get( id ); + return authProvidersRegistrationService.getAuthenticationProvidersMap().get( id ); } public AuthenticatedUser findByID(Object pk){ @@ -263,7 +176,7 @@ public void removeApiToken(AuthenticatedUser user){ } public boolean isOrcidEnabled() { - return oAuth2authenticationProviders.values().stream().anyMatch( s -> s.getId().toLowerCase().contains("orcid") ); + return authProvidersRegistrationService.getOAuth2AuthProvidersMap().values().stream().anyMatch( s -> s.getId().toLowerCase().contains("orcid") ); } /** @@ -437,7 +350,7 @@ public 
AuthenticatedUser lookupUser(String authPrvId, String userPersistentId) { } public AuthenticationProvider lookupProvider( AuthenticatedUser user ) { - return authenticationProviders.get(user.getAuthenticatedUserLookup().getAuthenticationProviderId()); + return authProvidersRegistrationService.getAuthenticationProvidersMap().get(user.getAuthenticatedUserLookup().getAuthenticationProviderId()); } public ApiToken findApiToken(String token) { @@ -739,7 +652,7 @@ public List findSuperUsers() { public Set listProviderFactories() { - return new HashSet<>( providerFactories.values() ); + return new HashSet<>( authProvidersRegistrationService.getProviderFactoriesMap().values() ); } public Timestamp getCurrentTimestamp() { diff --git a/src/main/java/edu/harvard/iq/dataverse/authorization/providers/builtin/DataverseUserPage.java b/src/main/java/edu/harvard/iq/dataverse/authorization/providers/builtin/DataverseUserPage.java index c66085f9976..e50d38131c9 100644 --- a/src/main/java/edu/harvard/iq/dataverse/authorization/providers/builtin/DataverseUserPage.java +++ b/src/main/java/edu/harvard/iq/dataverse/authorization/providers/builtin/DataverseUserPage.java @@ -486,6 +486,7 @@ public void displayNotification() { case CREATEDS: case SUBMITTEDDS: case PUBLISHEDDS: + case PUBLISHFAILED_PIDREG: case RETURNEDDS: userNotification.setTheObject(datasetVersionService.find(userNotification.getObjectId())); break; diff --git a/src/main/java/edu/harvard/iq/dataverse/datasetutility/AddReplaceFileHelper.java b/src/main/java/edu/harvard/iq/dataverse/datasetutility/AddReplaceFileHelper.java index 18bf172f5d3..4928100dfff 100644 --- a/src/main/java/edu/harvard/iq/dataverse/datasetutility/AddReplaceFileHelper.java +++ b/src/main/java/edu/harvard/iq/dataverse/datasetutility/AddReplaceFileHelper.java @@ -153,6 +153,55 @@ public class AddReplaceFileHelper{ private boolean contentTypeWarningFound; private String contentTypeWarningString; + private boolean duplicateFileErrorFound; + + private String duplicateFileErrorString; + + private boolean duplicateFileWarningFound; + private String duplicateFileWarningString; + + private String duplicateFileComponentMessage; + + public String getDuplicateFileComponentMessage() { + return duplicateFileComponentMessage; + } + + public void setDuplicateFileComponentMessage(String duplicateFileComponentMessage) { + this.duplicateFileComponentMessage = duplicateFileComponentMessage; + } + + public boolean isDuplicateFileErrorFound() { + return duplicateFileErrorFound; + } + + public void setDuplicateFileErrorFound(boolean duplicateFileErrorFound) { + this.duplicateFileErrorFound = duplicateFileErrorFound; + } + + public String getDuplicateFileErrorString() { + return duplicateFileErrorString; + } + + public void setDuplicateFileErrorString(String duplicateFileErrorString) { + this.duplicateFileErrorString = duplicateFileErrorString; + } + + public boolean isDuplicateFileWarningFound() { + return duplicateFileWarningFound; + } + + public void setDuplicateFileWarningFound(boolean duplicateFileWarningFound) { + this.duplicateFileWarningFound = duplicateFileWarningFound; + } + + public String getDuplicateFileWarningString() { + return duplicateFileWarningString; + } + + public void setDuplicateFileWarningString(String duplicateFileWarningString) { + this.duplicateFileWarningString = duplicateFileWarningString; + } + public void resetFileHelper(){ initErrorHandling(); @@ -761,6 +810,17 @@ private void addError(Response.Status badHttpResponse, String errMsg){ } + private void 
addErrorWarning(String errMsg){ + if (errMsg == null){ + throw new NullPointerException("errMsg cannot be null"); + } + + logger.severe(errMsg); + this.setDuplicateFileWarning(errMsg); + this.errorMessages.add(errMsg); + + } + private void addErrorSevere(String errMsg){ @@ -1134,6 +1194,8 @@ private boolean step_030_createNewFilesViaIngest(){ * @return */ private boolean step_040_auto_checkForDuplicates(){ + this.duplicateFileErrorString = ""; + this.duplicateFileErrorFound = false; msgt("step_040_auto_checkForDuplicates"); if (this.hasError()){ @@ -1179,20 +1241,24 @@ private boolean step_040_auto_checkForDuplicates(){ // ----------------------------------------------------------- // (2) Check for duplicates + // Only a warning now // ----------------------------------------------------------- if (isFileReplaceOperation() && Objects.equals(df.getChecksumValue(), fileToReplace.getChecksumValue())){ - this.addErrorSevere(getBundleErr("replace.new_file_same_as_replacement")); + this.addError(getBundleErr("replace.new_file_same_as_replacement")); + this.duplicateFileErrorFound = true; + this.duplicateFileErrorString = getBundleErr("replace.new_file_same_as_replacement"); break; - } else if (DuplicateFileChecker.isDuplicateOriginalWay(workingVersion, df.getFileMetadata())){ + } + + if (DuplicateFileChecker.isDuplicateOriginalWay(workingVersion, df.getFileMetadata())){ String dupeName = df.getFileMetadata().getLabel(); - //removeUnSavedFilesFromWorkingVersion(); - //removeLinkedFileFromDataset(dataset, df); - //abandonOperationRemoveAllNewFilesFromDataset(); - this.addErrorSevere(getBundleErr("duplicate_file") + " " + dupeName); - //return false; - } else { - finalFileList.add(df); - } + this.duplicateFileWarningFound = true; + this.duplicateFileWarningString = BundleUtil.getStringFromBundle("file.addreplace.warning.duplicate_file", + Arrays.asList(dupeName)); + this.addErrorWarning(this.duplicateFileWarningString); + + } + finalFileList.add(df); } if (this.hasError()){ @@ -1913,6 +1979,16 @@ public String getContentTypeWarningString(){ return contentTypeWarningString; } + private String duplicateFileWarning; + + public String getDuplicateFileWarning() { + return duplicateFileWarning; + } + + public void setDuplicateFileWarning(String duplicateFileWarning) { + this.duplicateFileWarning = duplicateFileWarning; + } + } // end class /* DatasetPage sequence: diff --git a/src/main/java/edu/harvard/iq/dataverse/datasetutility/DuplicateFileChecker.java b/src/main/java/edu/harvard/iq/dataverse/datasetutility/DuplicateFileChecker.java index a709f07b5b7..dd55ec72213 100644 --- a/src/main/java/edu/harvard/iq/dataverse/datasetutility/DuplicateFileChecker.java +++ b/src/main/java/edu/harvard/iq/dataverse/datasetutility/DuplicateFileChecker.java @@ -9,7 +9,9 @@ import edu.harvard.iq.dataverse.DatasetVersion; import edu.harvard.iq.dataverse.DatasetVersionServiceBean; import edu.harvard.iq.dataverse.FileMetadata; +import edu.harvard.iq.dataverse.util.BundleUtil; import java.util.ArrayList; +import java.util.Arrays; import java.util.HashMap; import java.util.Iterator; import java.util.List; @@ -150,18 +152,27 @@ public static boolean isDuplicateOriginalWay(DatasetVersion workingVersion, File List wvCopy = new ArrayList<>(workingVersion.getFileMetadatas()); Iterator fmIt = wvCopy.iterator(); - while (fmIt.hasNext()) { + while (fmIt.hasNext()) { FileMetadata fm = fmIt.next(); - String currentCheckSum = fm.getDataFile().getChecksumValue(); + String currentCheckSum = fm.getDataFile().getChecksumValue(); if 
(currentCheckSum != null) { + if (currentCheckSum.equals(selectedCheckSum)) { + DataFile existingFile = fm.getDataFile(); + List args = Arrays.asList(existingFile.getDisplayName()); + String inLineMessage = BundleUtil.getStringFromBundle("dataset.file.inline.message", args); + fileMetadata.getDataFile().setDuplicateFilename(inLineMessage); + return true; + } + /* if (checkSumMap.get(currentCheckSum) != null) { checkSumMap.put(currentCheckSum, checkSumMap.get(currentCheckSum).intValue() + 1); } else { checkSumMap.put(currentCheckSum, 1); - } + }*/ } } - return checkSumMap.get(selectedCheckSum) != null; // && checkSumMap.get(selectedCheckSum).intValue() > 1; + return false; + // return checkSumMap.get(selectedCheckSum) != null; // && checkSumMap.get(selectedCheckSum).intValue() > 1; } diff --git a/src/main/java/edu/harvard/iq/dataverse/datasetutility/FileReplacePageHelper.java b/src/main/java/edu/harvard/iq/dataverse/datasetutility/FileReplacePageHelper.java index d79c4a48094..93bd903130c 100644 --- a/src/main/java/edu/harvard/iq/dataverse/datasetutility/FileReplacePageHelper.java +++ b/src/main/java/edu/harvard/iq/dataverse/datasetutility/FileReplacePageHelper.java @@ -199,6 +199,10 @@ public List getNewFileMetadatasBeforeSave(){ } + public AddReplaceFileHelper getAddReplaceFileHelper(){ + return replaceFileHelper; + } + /** * * Show file upload component if Phase 1 hasn't happened yet diff --git a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/AbstractDatasetCommand.java b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/AbstractDatasetCommand.java index 12d02370dde..9d9ad548457 100644 --- a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/AbstractDatasetCommand.java +++ b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/AbstractDatasetCommand.java @@ -155,32 +155,30 @@ protected void tidyUpFields(DatasetVersion dsv) { * @param ctxt * @throws CommandException */ - protected void registerExternalIdentifier(Dataset theDataset, CommandContext ctxt) throws CommandException { + protected void registerExternalIdentifier(Dataset theDataset, CommandContext ctxt, boolean retry) throws CommandException { if (!theDataset.isIdentifierRegistered()) { GlobalIdServiceBean globalIdServiceBean = GlobalIdServiceBean.getBean(theDataset.getProtocol(), ctxt); if ( globalIdServiceBean != null ) { if (globalIdServiceBean instanceof FakePidProviderServiceBean) { - try { - globalIdServiceBean.createIdentifier(theDataset); - } catch (Throwable ex) { - logger.warning("Problem running createIdentifier for FakePidProvider: " + ex); - } - theDataset.setGlobalIdCreateTime(getTimestamp()); - theDataset.setIdentifierRegistered(true); - return; + retry=false; //No reason to allow a retry with the FakeProvider, so set false for efficiency } try { if (globalIdServiceBean.alreadyExists(theDataset)) { int attempts = 0; - - while (globalIdServiceBean.alreadyExists(theDataset) && attempts < FOOLPROOF_RETRIAL_ATTEMPTS_LIMIT) { - theDataset.setIdentifier(ctxt.datasets().generateDatasetIdentifier(theDataset, globalIdServiceBean)); - logger.log(Level.INFO, "Attempting to register external identifier for dataset {0} (trying: {1}).", - new Object[]{theDataset.getId(), theDataset.getIdentifier()}); - attempts++; + if(retry) { + do { + theDataset.setIdentifier(ctxt.datasets().generateDatasetIdentifier(theDataset, globalIdServiceBean)); + logger.log(Level.INFO, "Attempting to register external identifier for dataset {0} (trying: {1}).", + new Object[]{theDataset.getId(), 
theDataset.getIdentifier()}); + attempts++; + } while (globalIdServiceBean.alreadyExists(theDataset) && attempts <= FOOLPROOF_RETRIAL_ATTEMPTS_LIMIT); } - - if (globalIdServiceBean.alreadyExists(theDataset)) { + if(!retry) { + logger.warning("Reserving PID for: " + getDataset().getId() + " during publication failed."); + throw new IllegalCommandException(BundleUtil.getStringFromBundle("publishDatasetCommand.pidNotReserved"), this); + } + if(attempts > FOOLPROOF_RETRIAL_ATTEMPTS_LIMIT) { + //Didn't work - we exited the loop with too many tries throw new CommandExecutionException("This dataset may not be published because its identifier is already in use by another dataset; " + "gave up after " + attempts + " attempts. Current (last requested) identifier: " + theDataset.getIdentifier(), this); } diff --git a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/CreateNewDatasetCommand.java b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/CreateNewDatasetCommand.java index e97eeb47ab3..722e0e6aba6 100644 --- a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/CreateNewDatasetCommand.java +++ b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/CreateNewDatasetCommand.java @@ -87,7 +87,7 @@ protected void handlePid(Dataset theDataset, CommandContext ctxt) throws Command GlobalIdServiceBean idServiceBean = GlobalIdServiceBean.getBean(ctxt); if ( !idServiceBean.registerWhenPublished() ) { // pre-register a persistent id - registerExternalIdentifier(theDataset, ctxt); + registerExternalIdentifier(theDataset, ctxt, true); } } diff --git a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/FinalizeDatasetPublicationCommand.java b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/FinalizeDatasetPublicationCommand.java index 1edb5b44de4..edf4259ad88 100644 --- a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/FinalizeDatasetPublicationCommand.java +++ b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/FinalizeDatasetPublicationCommand.java @@ -1,5 +1,6 @@ package edu.harvard.iq.dataverse.engine.command.impl; +import edu.harvard.iq.dataverse.ControlledVocabularyValue; import edu.harvard.iq.dataverse.DataFile; import edu.harvard.iq.dataverse.Dataset; import edu.harvard.iq.dataverse.DatasetField; @@ -16,7 +17,6 @@ import edu.harvard.iq.dataverse.engine.command.DataverseRequest; import edu.harvard.iq.dataverse.engine.command.RequiredPermissions; import edu.harvard.iq.dataverse.engine.command.exception.CommandException; -import edu.harvard.iq.dataverse.export.ExportException; import edu.harvard.iq.dataverse.export.ExportService; import edu.harvard.iq.dataverse.privateurl.PrivateUrl; import edu.harvard.iq.dataverse.settings.SettingsServiceBean; @@ -30,14 +30,9 @@ import java.util.logging.Logger; import edu.harvard.iq.dataverse.GlobalIdServiceBean; import edu.harvard.iq.dataverse.batch.util.LoggingUtil; -import edu.harvard.iq.dataverse.dataaccess.DataAccessOption; -import edu.harvard.iq.dataverse.dataaccess.StorageIO; import edu.harvard.iq.dataverse.engine.command.Command; import edu.harvard.iq.dataverse.util.FileUtil; -import java.io.InputStream; -import java.util.Arrays; import java.util.concurrent.Future; -import org.apache.commons.io.IOUtils; import org.apache.solr.client.solrj.SolrServerException; @@ -83,9 +78,26 @@ public Dataset execute(CommandContext ctxt) throws CommandException { validateDataFiles(theDataset, ctxt); // (this will throw a CommandException if it fails) } - + + /* + * Try to register the dataset identifier.
For PID providers that have registerWhenPublished == false (all except the FAKE provider at present) + * the registerExternalIdentifier command will make one try to create the identifier if needed (e.g. if reserving at dataset creation wasn't done/failed). + * For registerWhenPublished == true providers, if a PID conflict is found, the call will retry with new PIDs. + */ if ( theDataset.getGlobalIdCreateTime() == null ) { - registerExternalIdentifier(theDataset, ctxt); + try { + // This can potentially throw a CommandException, so let's make + // sure we exit cleanly: + + registerExternalIdentifier(theDataset, ctxt, false); + } catch (CommandException comEx) { + // Send failure notification to the user: + notifyUsersDatasetPublishStatus(ctxt, theDataset, UserNotification.Type.PUBLISHFAILED_PIDREG); + // Remove the dataset lock: + ctxt.datasets().removeDatasetLocks(theDataset, DatasetLock.Reason.finalizePublication); + // re-throw the exception: + throw comEx; + } } // is this the first publication of the dataset? @@ -152,7 +164,11 @@ public Dataset execute(CommandContext ctxt) throws CommandException { if (!datasetExternallyReleased) { publicizeExternalIdentifier(theDataset, ctxt); - // (will throw a CommandException, unless successful) + // Will throw a CommandException, unless successful. + // This will end the execution of the command, but the method + // above takes proper care to "clean up after itself" in case of + // a failure - it will remove any locks, and it will send a + // proper notification to the user(s). } theDataset.getLatestVersion().setVersionState(RELEASED); } @@ -179,7 +195,8 @@ public Dataset execute(CommandContext ctxt) throws CommandException { Dataset readyDataset = ctxt.em().merge(theDataset); if ( readyDataset != null ) { - notifyUsersDatasetPublish(ctxt, theDataset); + // Success! - send notification: + notifyUsersDatasetPublishStatus(ctxt, theDataset, UserNotification.Type.PUBLISHEDDS); } return readyDataset; @@ -219,7 +236,7 @@ private void exportMetadata(Dataset dataset, SettingsServiceBean settingsService ExportService instance = ExportService.getInstance(settingsServiceBean); instance.exportAllFormats(dataset); - } catch (ExportException ex) { + } catch (Exception ex) { // Something went wrong! // Just like with indexing, a failure to export is not a fatal // condition. 
We'll just log the error as a warning and keep @@ -236,10 +253,24 @@ private void updateParentDataversesSubjectsField(Dataset savedDataset, CommandCo if (dsf.getDatasetFieldType().getName().equals(DatasetFieldConstant.subject)) { Dataverse dv = savedDataset.getOwner(); while (dv != null) { - if (dv.getDataverseSubjects().addAll(dsf.getControlledVocabularyValues())) { + boolean newSubjectsAdded = false; + for (ControlledVocabularyValue cvv : dsf.getControlledVocabularyValues()) { + + if (!dv.getDataverseSubjects().contains(cvv)) { + logger.fine("dv "+dv.getAlias()+" does not have subject "+cvv.getStrValue()); + newSubjectsAdded = true; + dv.getDataverseSubjects().add(cvv); + } else { + logger.fine("dv "+dv.getAlias()+" already has subject "+cvv.getStrValue()); + } + } + if (newSubjectsAdded) { + logger.fine("new dataverse subjects added - saving and reindexing"); Dataverse dvWithSubjectJustAdded = ctxt.em().merge(dv); ctxt.em().flush(); ctxt.index().indexDataverse(dvWithSubjectJustAdded); // need to reindex to capture the new subjects + } else { + logger.fine("no new subjects added to the dataverse; skipping reindexing"); } dv = dv.getOwner(); } @@ -253,6 +284,9 @@ private void validateDataFiles(Dataset dataset, CommandContext ctxt) throws Comm for (DataFile dataFile : dataset.getFiles()) { // TODO: Should we validate all the files in the dataset, or only // the files that haven't been published previously? + // (the decision was made to validate all the files on every + // major release; we can revisit the decision if there's any + // indication that this makes publishing take significantly longer.) logger.log(Level.FINE, "validating DataFile {0}", dataFile.getId()); FileUtil.validateDataFileChecksum(dataFile); } @@ -280,24 +314,30 @@ private void validateDataFiles(Dataset dataset, CommandContext ctxt) throws Comm private void publicizeExternalIdentifier(Dataset dataset, CommandContext ctxt) throws CommandException { String protocol = getDataset().getProtocol(); + String authority = getDataset().getAuthority(); GlobalIdServiceBean idServiceBean = GlobalIdServiceBean.getBean(protocol, ctxt); + if (idServiceBean != null) { List<String> args = idServiceBean.getProviderInformation(); try { String currentGlobalIdProtocol = ctxt.settings().getValueForKey(SettingsServiceBean.Key.Protocol, ""); + String currentGlobalAuthority = ctxt.settings().getValueForKey(SettingsServiceBean.Key.Authority, ""); String dataFilePIDFormat = ctxt.settings().getValueForKey(SettingsServiceBean.Key.DataFilePIDFormat, "DEPENDENT"); boolean isFilePIDsEnabled = ctxt.systemConfig().isFilePIDsEnabled(); // We will skip trying to register the global identifiers for datafiles // if "dependent" file-level identifiers are requested, AND the naming - // protocol of the dataset global id is different from the - // one currently configured for the Dataverse. This is to specifically - // address the issue with the datasets with handle ids registered, - // that are currently configured to use DOI. - // ... + // protocol or the authority of the dataset global id is different from + // what's currently configured for the Dataverse. In other words, + // we can't get "dependent" DOIs assigned to files in a dataset + // with the registered id that is a handle; or even a DOI, but in + // an authority that's different from what's currently configured. // Additionaly in 4.9.3 we have added a system variable to disable // registering file PIDs on the installation level.
- if ((currentGlobalIdProtocol.equals(protocol) || dataFilePIDFormat.equals("INDEPENDENT"))//TODO(pm) - check authority too - && isFilePIDsEnabled) { + if (((currentGlobalIdProtocol.equals(protocol) && currentGlobalAuthority.equals(authority)) + || dataFilePIDFormat.equals("INDEPENDENT")) + && isFilePIDsEnabled + && dataset.getLatestVersion().getMinorVersionNumber() != null + && dataset.getLatestVersion().getMinorVersionNumber().equals((long) 0)) { //A false return value indicates a failure in calling the service for (DataFile df : dataset.getFiles()) { logger.log(Level.FINE, "registering global id for file {0}", df.getId()); @@ -315,14 +355,13 @@ private void publicizeExternalIdentifier(Dataset dataset, CommandContext ctxt) t dataset.setGlobalIdCreateTime(new Date()); // TODO these two methods should be in the responsibility of the idServiceBean. dataset.setIdentifierRegistered(true); } catch (Throwable e) { + // Send failure notification to the user: + notifyUsersDatasetPublishStatus(ctxt, dataset, UserNotification.Type.PUBLISHFAILED_PIDREG); + ctxt.datasets().removeDatasetLocks(dataset, DatasetLock.Reason.finalizePublication); throw new CommandException(BundleUtil.getStringFromBundle("dataset.publish.error", args), this); } } - /* - * for debugging only: (TODO: remove before making the final PR) - throw new CommandException(BundleUtil.getStringFromBundle("dataset.publish.error", idServiceBean.getProviderInformation()), this); - */ } private void updateFiles(Timestamp updateTime, CommandContext ctxt) throws CommandException { @@ -401,13 +440,14 @@ private void notifyUsersFileDownload(CommandContext ctxt, DvObject subject) { .forEach( au -> ctxt.notifications().sendNotification(au, getTimestamp(), UserNotification.Type.GRANTFILEACCESS, getDataset().getId()) ); } - private void notifyUsersDatasetPublish(CommandContext ctxt, DvObject subject) { + private void notifyUsersDatasetPublishStatus(CommandContext ctxt, DvObject subject, UserNotification.Type type) { + ctxt.roles().rolesAssignments(subject).stream() .filter( ra -> ra.getRole().permissions().contains(Permission.ViewUnpublishedDataset) || ra.getRole().permissions().contains(Permission.DownloadFile)) .flatMap( ra -> ctxt.roleAssignees().getExplicitUsers(ctxt.roleAssignees().getRoleAssignee(ra.getAssigneeIdentifier())).stream() ) .distinct() // prevent double-send //.forEach( au -> ctxt.notifications().sendNotification(au, timestamp, messageType, theDataset.getId()) ); //not sure why this line doesn't work instead - .forEach( au -> ctxt.notifications().sendNotification(au, getTimestamp(), UserNotification.Type.PUBLISHEDDS, getDataset().getLatestVersion().getId()) ); + .forEach( au -> ctxt.notifications().sendNotificationInNewTransaction(au, getTimestamp(), type, getDataset().getLatestVersion().getId()) ); } } diff --git a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/PublishDatasetCommand.java b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/PublishDatasetCommand.java index 84cb5436205..7d87c95c2d0 100644 --- a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/PublishDatasetCommand.java +++ b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/PublishDatasetCommand.java @@ -21,7 +21,6 @@ import java.util.Optional; import java.util.logging.Logger; import static java.util.stream.Collectors.joining; -import static java.util.stream.Collectors.joining; /** * Kick-off a dataset publication process. 
The process may complete immediately, @@ -69,16 +68,6 @@ public PublishDatasetResult execute(CommandContext ctxt) throws CommandException Dataset theDataset = getDataset(); - - // If PID can be reserved, only allow publishing if it is. - String protocol = getDataset().getProtocol(); - GlobalIdServiceBean idServiceBean = GlobalIdServiceBean.getBean(protocol, ctxt); - boolean reservingPidsSupported = !idServiceBean.registerWhenPublished(); - if (reservingPidsSupported) { - if (theDataset.getGlobalIdCreateTime() == null) { - throw new IllegalCommandException(BundleUtil.getStringFromBundle("publishDatasetCommand.pidNotReserved"), this); - } - } - // Set the version numbers: if (theDataset.getPublicationDate() == null) { @@ -130,28 +119,33 @@ public PublishDatasetResult execute(CommandContext ctxt) throws CommandException boolean validatePhysicalFiles = ctxt.systemConfig().isDatafileValidationOnPublishEnabled(); - if ((registerGlobalIdsForFiles || validatePhysicalFiles) - && theDataset.getFiles().size() > ctxt.systemConfig().getPIDAsynchRegFileCount()) { - // TODO? The time it takes to validate the physical files in the dataset - // is a function of the total file size, NOT the number of files; - // so that's what we should be checking. - String info = registerGlobalIdsForFiles ? "Registering PIDs for Datafiles and " : ""; - info += "Validating Datafiles Asynchronously"; - AuthenticatedUser user = request.getAuthenticatedUser(); - - DatasetLock lock = new DatasetLock(DatasetLock.Reason.finalizePublication, user); - lock.setDataset(theDataset); - lock.setInfo(info); - ctxt.datasets().addDatasetLock(theDataset, lock); - theDataset = ctxt.em().merge(theDataset); - ctxt.datasets().callFinalizePublishCommandAsynchronously(theDataset.getId(), ctxt, request, datasetExternallyReleased); - return new PublishDatasetResult(theDataset, false); + + // As of v5.0, publishing a dataset is always done asynchronously, + // with the dataset locked for the duration of the operation. + + //if ((registerGlobalIdsForFiles || validatePhysicalFiles) + // && theDataset.getFiles().size() > ctxt.systemConfig().getPIDAsynchRegFileCount()) { + String info = "Publishing the dataset; "; + info += registerGlobalIdsForFiles ? "Registering PIDs for Datafiles; " : ""; + info += validatePhysicalFiles ? "Validating Datafiles Asynchronously" : ""; + + AuthenticatedUser user = request.getAuthenticatedUser(); + DatasetLock lock = new DatasetLock(DatasetLock.Reason.finalizePublication, user); + lock.setDataset(theDataset); + lock.setInfo(info); + ctxt.datasets().addDatasetLock(theDataset, lock); + theDataset = ctxt.em().merge(theDataset); + ctxt.datasets().callFinalizePublishCommandAsynchronously(theDataset.getId(), ctxt, request, datasetExternallyReleased); + return new PublishDatasetResult(theDataset, false); + + /** + * Code for "synchronous" (while-you-wait) publishing + * is preserved below, commented out: } else { // Synchronous publishing (no workflow involved) theDataset = ctxt.engine().submit(new FinalizeDatasetPublicationCommand(theDataset, getRequest(),datasetExternallyReleased)); return new PublishDatasetResult(theDataset, true); - } + } */ } } diff --git a/src/main/java/edu/harvard/iq/dataverse/export/ddi/DdiExportUtil.java b/src/main/java/edu/harvard/iq/dataverse/export/ddi/DdiExportUtil.java index 04ec3056752..dd797a55539 100644 --- a/src/main/java/edu/harvard/iq/dataverse/export/ddi/DdiExportUtil.java +++ b/src/main/java/edu/harvard/iq/dataverse/export/ddi/DdiExportUtil.java @@ -328,7 +328,7 @@ private static void writeDocDescElement (XMLStreamWriter xmlw, DatasetDTO datase private static void writeVersionStatement(XMLStreamWriter xmlw, DatasetVersionDTO datasetVersionDTO) throws XMLStreamException{ xmlw.writeStartElement("verStmt"); - writeAttribute(xmlw,"source","DVN"); + writeAttribute(xmlw,"source","archive"); xmlw.writeStartElement("version"); writeAttribute(xmlw,"date", datasetVersionDTO.getReleaseTime().substring(0, 10)); writeAttribute(xmlw,"type", datasetVersionDTO.getVersionState().toString()); diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XdataProvider.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XdataProvider.java index 63b9fc1799f..9aea49357a7 100644 --- a/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XdataProvider.java +++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XdataProvider.java @@ -6,6 +6,7 @@ import com.lyncode.xoai.dataprovider.handlers.*; import com.lyncode.xoai.exceptions.InvalidResumptionTokenException; import com.lyncode.xoai.dataprovider.model.Context; +import com.lyncode.xoai.model.oaipmh.Identify; import com.lyncode.xoai.model.oaipmh.OAIPMH; import com.lyncode.xoai.model.oaipmh.Request; import com.lyncode.xoai.dataprovider.parameters.OAICompiledRequest; @@ -76,7 +77,9 @@ public OAIPMH handle (OAIRequest requestParameters) throws OAIException { switch (request.getVerbType()) { case Identify: - response.withVerb(identifyHandler.handle(parameters)); + Identify identify = identifyHandler.handle(parameters); + identify.getDescriptions().clear(); // We don't want to use the default description + response.withVerb(identify); break; case ListSets: response.withVerb(listSetsHandler.handle(parameters)); diff --git a/src/main/java/edu/harvard/iq/dataverse/ingest/IngestUtil.java b/src/main/java/edu/harvard/iq/dataverse/ingest/IngestUtil.java index 150d6cfb43c..7f01e217cfa 100644 --- a/src/main/java/edu/harvard/iq/dataverse/ingest/IngestUtil.java +++ b/src/main/java/edu/harvard/iq/dataverse/ingest/IngestUtil.java @@ -133,14 +133,23 @@ public static boolean conflictsWithExistingFilenames(String pathPlusFilename, Li } /** - * Given a DatasetVersion, iterate across all the files (including their + * Given a DatasetVersion and the newFiles about to be added to the + * version, iterate
across all the files (including their * paths) and return any duplicates. * * @param datasetVersion + * @param newFiles * @return A Collection of Strings in the form of path/to/file.txt */ - public static Collection<String> findDuplicateFilenames(DatasetVersion datasetVersion) { - List<String> allFileNamesWithPaths = getPathsAndFileNames(datasetVersion.getFileMetadatas()); + public static Collection<String> findDuplicateFilenames(DatasetVersion datasetVersion, List<DataFile> newFiles) { + List<FileMetadata> toTest = new ArrayList<>(); + datasetVersion.getFileMetadatas().forEach((fm) -> { + toTest.add(fm); + }); + newFiles.forEach((df) -> { + toTest.add(df.getFileMetadata()); + }); + List<String> allFileNamesWithPaths = getPathsAndFileNames(toTest); return findDuplicates(allFileNamesWithPaths); } diff --git a/src/main/java/edu/harvard/iq/dataverse/util/FileUtil.java b/src/main/java/edu/harvard/iq/dataverse/util/FileUtil.java index 59dca5bb2da..608566bcae1 100644 --- a/src/main/java/edu/harvard/iq/dataverse/util/FileUtil.java +++ b/src/main/java/edu/harvard/iq/dataverse/util/FileUtil.java @@ -1836,5 +1836,29 @@ public static void deleteTempFile(DataFile dataFile, Dataset dataset, IngestServ dataFile.setOwner(null); } } + + public static boolean isFileAlreadyUploaded(DataFile dataFile, Map<String, DataFile> checksumMapNew, Map<DataFile, DataFile> fileAlreadyExists) { + if (checksumMapNew == null) { + checksumMapNew = new HashMap<>(); + } + + if (fileAlreadyExists == null) { + fileAlreadyExists = new HashMap<>(); + } + + String chksum = dataFile.getChecksumValue(); + + if (chksum == null) { + return false; + } + + if (checksumMapNew.get(chksum) != null) { + fileAlreadyExists.put(dataFile, checksumMapNew.get(chksum)); + return true; + } + + checksumMapNew.put(chksum, dataFile); + return false; + } } diff --git a/src/main/java/edu/harvard/iq/dataverse/util/MailUtil.java b/src/main/java/edu/harvard/iq/dataverse/util/MailUtil.java index 7a144e65ee1..88f0b9a61c5 100644 --- a/src/main/java/edu/harvard/iq/dataverse/util/MailUtil.java +++ b/src/main/java/edu/harvard/iq/dataverse/util/MailUtil.java @@ -54,6 +54,8 @@ public static String getSubjectTextBasedOnNotification(UserNotification userNoti return BundleUtil.getStringFromBundle("notification.email.submit.dataset.subject", rootDvNameAsList); case PUBLISHEDDS: return BundleUtil.getStringFromBundle("notification.email.publish.dataset.subject", rootDvNameAsList); + case PUBLISHFAILED_PIDREG: + return BundleUtil.getStringFromBundle("notification.email.publishFailure.dataset.subject", rootDvNameAsList); case RETURNEDDS: return BundleUtil.getStringFromBundle("notification.email.returned.dataset.subject", rootDvNameAsList); case CREATEACC: diff --git a/src/main/java/edu/harvard/iq/dataverse/util/SessionUtil.java b/src/main/java/edu/harvard/iq/dataverse/util/SessionUtil.java new file mode 100644 index 00000000000..0539ea40cb8 --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/util/SessionUtil.java @@ -0,0 +1,32 @@ +package edu.harvard.iq.dataverse.util; + +import java.util.Enumeration; +import java.util.HashMap; +import java.util.Map.Entry; + +import javax.servlet.http.HttpServletRequest; +import javax.servlet.http.HttpSession; + +public class SessionUtil { + + /** + * Changes the session id (jsessionId) - for use when the session's authority increases (i.e. at login) + * Servlet 3.1 Note: This method is needed while using Servlet 2.0; Servlet 3.1 has an HttpServletRequest.changeSessionId() method that can be used instead. + * + * @param h the current HttpServletRequest + * e.g.
for pages you can get this from (HttpServletRequest) FacesContext.getCurrentInstance().getExternalContext().getRequest(); + */ + public static void changeSessionId(HttpServletRequest h) { + HttpSession session = h.getSession(false); + HashMap<String, Object> sessionAttributes = new HashMap<>(); + for (Enumeration<String> e = session.getAttributeNames(); e.hasMoreElements();) { + String name = e.nextElement(); + sessionAttributes.put(name, session.getAttribute(name)); + } + h.getSession().invalidate(); + session = h.getSession(true); + for (Entry<String, Object> entry : sessionAttributes.entrySet()) { + session.setAttribute(entry.getKey(), entry.getValue()); + } + } +} diff --git a/src/main/java/propertyFiles/Bundle.properties b/src/main/java/propertyFiles/Bundle.properties index ba6a5738796..8c70475953c 100755 --- a/src/main/java/propertyFiles/Bundle.properties +++ b/src/main/java/propertyFiles/Bundle.properties @@ -188,6 +188,7 @@ notification.dataset.management.title=Dataset Management - Dataset User Guide notification.wasSubmittedForReview={0} was submitted for review to be published in {1}. Don''t forget to publish it or send it back to the contributor, {2} ({3})\! notification.wasReturnedByReviewer={0} was returned by the curator of {1}. notification.wasPublished={0} was published in {1}. +notification.publishFailedPidReg={0} in {1} could not be published due to a failure to register or update the Global Identifier for the dataset or one of the files in it. Contact support if this continues to happen. notification.ingestCompleted=Dataset {1} ingest has successfully finished. notification.ingestCompletedWithErrors=Dataset {1} ingest has finished with errors. notification.worldMap.added={0}, dataset had WorldMap layer data added to it. @@ -645,6 +646,7 @@ notification.email.maplayer.deletefailed.subject={0}: Failed to delete WorldMap notification.email.maplayer.deletefailed.text=We failed to delete the WorldMap layer associated with the restricted file {0}, and any related data that may still be publicly available on the WorldMap site. Please try again, or contact WorldMap and/or Dataverse support. (Dataset: {1}) notification.email.submit.dataset.subject={0}: Your dataset has been submitted for review notification.email.publish.dataset.subject={0}: Your dataset has been published +notification.email.publishFailure.dataset.subject={0}: Failed to publish your dataset notification.email.returned.dataset.subject={0}: Your dataset has been returned notification.email.create.account.subject={0}: Your account has been created notification.email.assign.role.subject={0}: You have been assigned a role @@ -666,6 +668,7 @@ notification.email.createDataset=Your new dataset named {0} (view at {1} ) was c notification.email.wasSubmittedForReview={0} (view at {1}) was submitted for review to be published in {2} (view at {3}). Don''t forget to publish it or send it back to the contributor, {4} ({5})\! notification.email.wasReturnedByReviewer={0} (view at {1}) was returned by the curator of {2} (view at {3}). notification.email.wasPublished={0} (view at {1}) was published in {2} (view at {3}). +notification.email.publishFailedPidReg={0} (view at {1}) in {2} (view at {3}) could not be published due to a failure to register or update the Global Identifier for the dataset or one of the files in it. Contact support if this continues to happen. notification.email.worldMap.added={0} (view at {1}) had WorldMap layer data added to it.
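The new publish-failure subject line above is resolved through the same bundle lookup as the other notification types (see the MailUtil hunk earlier). A minimal sketch of that lookup, assuming Bundle.properties is on the classpath; the class name here is hypothetical:

    import java.text.MessageFormat;
    import java.util.ResourceBundle;

    public class PublishFailureSubjectSketch {
        public static void main(String[] args) {
            // Assumes the application's Bundle.properties is on the classpath.
            ResourceBundle bundle = ResourceBundle.getBundle("Bundle");
            // Key added above; {0} is the root Dataverse name, as in the other subject lines.
            String pattern = bundle.getString("notification.email.publishFailure.dataset.subject");
            System.out.println(MessageFormat.format(pattern, "Root Dataverse"));
            // Expected output: Root Dataverse: Failed to publish your dataset
        }
    }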
notification.email.closing=\n\nYou may contact us for support at {0}.\n\nThank you,\n{1} notification.email.assignRole=You are now {0} for the {1} "{2}" (view at {3}). @@ -1322,6 +1325,8 @@ dataset.locked.ingest.message=The tabular data files uploaded are being processe dataset.unlocked.ingest.message=The tabular files have been ingested. dataset.locked.editInProgress.message=Edit In Progress dataset.locked.editInProgress.message.details=Additional edits cannot be made at this time. Contact {0} if this status persists. +dataset.locked.pidNotReserved.message=Dataset DOI Not Reserved +dataset.locked.pidNotReserved.message.details=The DOI displayed in the citation for this dataset has not yet been reserved with DataCite. Please do not share this DOI until it has been reserved. dataset.publish.error=This dataset may not be published due to an error when contacting the {0} Service. Please try again. dataset.publish.error.doi=This dataset may not be published because the DOI update failed. dataset.publish.file.validation.error.message=Failed to Publish Dataset @@ -1346,7 +1351,7 @@ dataset.delete.error=Could not deaccession the dataset because the {0} update fa dataset.publish.worldMap.deleteConfirm=Please note that your data and map on WorldMap will be removed due to restricted file access changes in this dataset version which you are publishing. Do you want to continue? dataset.publish.workflow.message=Publish in Progress dataset.publish.workflow.inprogress=This dataset is locked until publication. -dataset.pidRegister.workflow.inprogress=This dataset is locked while the persistent identifiers for the datafiles are being registered and/or physical files are validated. +dataset.pidRegister.workflow.inprogress=The dataset is locked while the persistent identifiers are being registered or updated, and/or the physical files are being validated. dataset.versionUI.draft=Draft dataset.versionUI.inReview=In Review dataset.versionUI.unpublished=Unpublished @@ -1507,9 +1512,13 @@ file.replaced.warning.draft.warningMessage=You can not replace a file that has b file.replaced.warning.previous.warningMessage=You can not edit a file that has been replaced in a previous dataset version. In order to edit it you must go to the most recently published version of the file. file.alreadyDeleted.previous.warningMessage=This file has already been deleted in current version. It may not be edited. file.delete=Delete +file.delete.duplicate.multiple=Delete Duplicate Files +file.delete.duplicate.single=Delete Duplicate File file.metadata=Metadata file.deleted.success=Files "{0}" will be permanently deleted from this version of this dataset once you click on the Save Changes button. file.deleted.replacement.success=The replacement file has been deleted. +file.deleted.upload.success.single=File has been deleted and won\u2019t be included in this upload. +file.deleted.upload.success.multiple=Files have been deleted and won\u2019t be included in this upload. file.editAccess=Edit Access file.restrict=Restrict file.unrestrict=Unrestrict @@ -1913,15 +1922,18 @@ file.addreplace.error.no_edit_dataset_permission=You do not have permission to e file.addreplace.error.filename_undetermined=The file name cannot be determined. file.addreplace.error.file_content_type_undetermined=The file content type cannot be determined. file.addreplace.error.file_upload_failed=The file upload failed. -file.addreplace.error.duplicate_file=This file already exists in the dataset. 
+file.addreplace.warning.duplicate_file=This file has the same content as {0}, which is already in the dataset. +file.addreplace.error.duplicate_file.continue=You may delete it if it was not intentional. file.addreplace.error.existing_file_to_replace_id_is_null=The ID of the existing file to replace must be provided. file.addreplace.error.existing_file_to_replace_not_found_by_id=Replacement file not found. There was no file found for ID: {0} file.addreplace.error.existing_file_to_replace_is_null=The file to replace cannot be null. file.addreplace.error.existing_file_to_replace_not_in_dataset=The file to replace does not belong to this dataset. file.addreplace.error.existing_file_not_in_latest_published_version=You cannot replace a file that is not in the most recently published dataset. (The file is unpublished or was deleted from a previous version.) file.addreplace.content_type.header=File Type Different +file.addreplace.already_exists.header=Duplicate File Uploaded +file.addreplace.already_exists.header.multiple=Duplicate Files Uploaded file.addreplace.error.replace.new_file_has_different_content_type=The original file ({0}) and replacement file ({1}) are different file types. -file.addreplace.error.replace.new_file_same_as_replacement=You cannot replace a file with the exact same file. +file.addreplace.error.replace.new_file_same_as_replacement=Error! You may not replace a file with a file that has duplicate content. file.addreplace.error.unpublished_file_cannot_be_replaced=You cannot replace an unpublished file. Please delete it instead of replacing it. file.addreplace.error.ingest_create_file_err=There was an error when trying to add the new file. file.addreplace.error.initial_file_list_empty=An error occurred and the new file was not added. @@ -1936,6 +1948,8 @@ file.addreplace.success.replace=File successfully replaced! file.addreplace.error.auth=The API key is invalid. file.addreplace.error.invalid_datafile_tag=Not a valid Tabular Data Tag: + + # 500.xhtml error.500.page.title=500 Internal Server Error error.500.message=Internal Server Error - An unexpected error was encountered, no more information is available. @@ -2104,16 +2118,19 @@ dataverse.alias.taken=This Alias is already taken. #editDatafilesPage.java dataset.save.fail=Dataset Save Failed + -dataset.files.exist=The following files already exist in the dataset: -dataset.file.exist=The following file already exists in the dataset: -dataset.files.duplicate=The following files are duplicates of (an) already uploaded file(s): -dataset.file.duplicate=The following file is a duplicate of an already uploaded file: -dataset.file.skip=(skipping) + +dataset.files.exist=Files {0} have the same content as {1} that already exists in the dataset. +dataset.file.exist=File {0} has the same content as {1} that already exists in the dataset. +dataset.file.exist.test={0, choice, 1#File |2#Files |} {1} {0, choice, 1#has |2#have |} the same content as {2} that already {0, choice, 1#exist |2#exist |}in the dataset. +dataset.files.duplicate=Files {0} have the same content as {1} that have already been uploaded. +dataset.file.duplicate=File {0} has the same content as {1} that has already been uploaded. +dataset.file.inline.message= This file has the same content as {0}. dataset.file.upload=Succesful {0} is uploaded. dataset.file.uploadFailure=upload failure dataset.file.uploadFailure.detailmsg=the file {0} failed to upload!
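The reworked duplicate-file messages are plain java.text.MessageFormat patterns: {0} is the uploaded file name (or list of names) and {1} is the matching file already in the dataset. A quick sketch of how they render, with the pattern strings copied from the dataset.file.exist and dataset.files.exist keys above and invented file names:

    import java.text.MessageFormat;

    public class DuplicateMessageSketch {
        public static void main(String[] args) {
            // Patterns copied from the properties above; file names are made up for the demo.
            String single = "File {0} has the same content as {1} that already exists in the dataset.";
            String multiple = "Files {0} have the same content as {1} that already exists in the dataset.";
            System.out.println(MessageFormat.format(single, "upload1.csv", "data.csv"));
            System.out.println(MessageFormat.format(multiple, "upload1.csv, upload2.csv", "data.csv"));
        }
    }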
dataset.file.uploadWarning=upload warning dataset.file.uploadWorked=upload worked +dataset.file.upload.popup.explanation.tip=For more information, please refer to the Duplicate Files section of the User Guide. #EmailValidator.java email.invalid=is not a valid email address. diff --git a/src/main/webapp/dataverseuser.xhtml b/src/main/webapp/dataverseuser.xhtml index 19cf0603ceb..76bfd9bf2b1 100644 --- a/src/main/webapp/dataverseuser.xhtml +++ b/src/main/webapp/dataverseuser.xhtml @@ -140,6 +140,17 @@ + + #{item.theObject.getDataset().getDisplayName()} + + + #{item.theObject.getDataset().getOwner().getDisplayName()} + + + + + + #{item.theObject.getDataset().getDisplayName()} diff --git a/src/main/webapp/editFilesFragment.xhtml b/src/main/webapp/editFilesFragment.xhtml index a1ec27859cf..9a516325b2f 100644 --- a/src/main/webapp/editFilesFragment.xhtml +++ b/src/main/webapp/editFilesFragment.xhtml @@ -283,8 +283,8 @@
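The checksum-based duplicate detection added in FileUtil.isFileAlreadyUploaded() boils down to two maps: one from checksum to the first file seen with that checksum, and one from each later duplicate to that original. A self-contained sketch of the same pattern, using a hypothetical StubFile stand-in for DataFile; duplicates are flagged but kept in the batch, matching the new warn-rather-than-block behavior:

    import java.util.HashMap;
    import java.util.Map;

    public class ChecksumDuplicateSketch {

        // Stand-in for DataFile; not a Dataverse class.
        static class StubFile {
            final String name;
            final String checksum;
            StubFile(String name, String checksum) { this.name = name; this.checksum = checksum; }
        }

        public static void main(String[] args) {
            Map<String, StubFile> checksumMapNew = new HashMap<>();      // checksum -> first file seen
            Map<StubFile, StubFile> fileAlreadyExists = new HashMap<>(); // duplicate -> original
            StubFile[] batch = {
                new StubFile("a.csv", "abc123"),
                new StubFile("b.csv", "def456"),
                new StubFile("c.csv", "abc123"), // same content as a.csv
            };
            for (StubFile f : batch) {
                StubFile first = checksumMapNew.get(f.checksum);
                if (first != null) {
                    fileAlreadyExists.put(f, first); // flag as duplicate, but keep it in the batch
                } else {
                    checksumMapNew.put(f.checksum, f);
                }
            }
            fileAlreadyExists.forEach((dup, orig) ->
                System.out.println(dup.name + " has the same content as " + orig.name));
        }
    }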