
API cleanup #3358

Closed
michbarsinai opened this issue Sep 15, 2016 · 26 comments

@michbarsinai
Member

michbarsinai commented Sep 15, 2016

While working towards JavaOne'16, where we present our API implementation, I found various inconsistencies. This issue is used for tracking their fixes.

@michbarsinai
Member Author

d813105: Should be "Illegal command translates to 403 FORBIDDEN."

@djbrooke
Contributor

Heard from @michbarsinai that this one is ready for code review!

#3381 is the PR.

@pdurbin
Member

pdurbin commented Sep 26, 2016

When I run mvn test -Dtest=DataversesIT,DatasetsIT,SwordIT,AdminIT,BuiltinUsersIT,UsersIT,UtilIT,ConfirmEmailIT on develop (ecde099) I get "Tests run: 22, Failures: 0, Errors: 0, Skipped: 0".

However, when I run the same tests on the 3358-api-cleanup branch (c83f555), I'm getting "Tests run: 23, Failures: 3, Errors: 0, Skipped: 0".

Here is more of the failure output:

Failed tests:
  testPrivateUrl(edu.harvard.iq.dataverse.api.DatasetsIT): expected:<400> but was:<403>
  testAttemptToCreateDuplicateAlias(edu.harvard.iq.dataverse.api.DataversesIT): expected:<400> but was:<403>

If expected response codes are changing, I would expect to see this noted in the API Guide at least (and the tests above should be adjusted so they continue to pass). We could also consider updating the version of the API from "1" to "2" or whatever (however, see #3325 for challenges in bumping the API version).
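For anyone spot-checking the new behavior by hand, here is a minimal sketch (all placeholders are mine, not from this thread: a local installation on port 8080, $API_TOKEN for a user who lacks edit permission on dataset $ID):

$ curl -s -o /dev/null -w "%{http_code}\n" \
    -H "X-Dataverse-key: $API_TOKEN" \
    -X DELETE "http://localhost:8080/api/datasets/$ID"
403

If the branch behaves as described above, this should print 403 where develop printed 400.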

Please note that the list of tests above is not arbitrary. It's the list of tests that are executed at https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-apitest-develop/ when I kick off a build after a pull request has been merged into develop. See also #1936 (comment) .

At https://waffle.io/IQSS/dataverse I'm putting this issue and pull request #3381 back in Development and assigning to @michbarsinai for consideration.

@pdurbin
Member

pdurbin commented Sep 28, 2016

The status of this issue is that I'm asking @michbarsinai to make sure DatasetsIT passes and he is having to mess around with JVM options. I recommend the following:

$ ./asadmin list-jvm-options | grep siteUrl
-Ddataverse.siteUrl=http://localhost:8080

(Perhaps this should be documented in the Developer Guide.)
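If the option isn't set, a sketch for adding it (standard asadmin usage, not specific to this branch; note that asadmin requires the colons to be escaped, and JVM options only take effect after a restart):

$ ./asadmin create-jvm-options "-Ddataverse.siteUrl=http\://localhost\:8080"
$ ./asadmin stop-domain && ./asadmin start-domain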

Also, I believe @michbarsinai has agreed that the status code changes at least warrant a mention in the release notes. @kcondon indicated he needs to be given a heads up about what should be written in the release notes. I favor having the change documented in the API Guide.

@pdurbin
Member

pdurbin commented Sep 28, 2016

@michbarsinai maybe this is scope creep, but #2219, #2431, and #2461 could at least be considered as well for API cleanup.

@michbarsinai
Member Author

Current status: the DatasetsIT tests are passing; these tests still fail:

Failed tests:   testServiceDocument(edu.harvard.iq.dataverse.api.SwordIT)
  testCreateAndDeleteDatasetInRoot(edu.harvard.iq.dataverse.api.SwordIT): Expected status code <201> doesn't match actual status code <400>.(..)

I'm pretty sure that's a setup issue at this point. Maybe run these on a pre-setup machine, or better yet, script it in a Docker container or something.

@michbarsinai
Member Author

@pdurbin - can you please run the test suite on a properly set-up machine so we can green the PR?

pdurbin added a commit that referenced this issue Sep 29, 2016
@pdurbin
Member

pdurbin commented Sep 29, 2016

The stupid EJB timer prevented me from running my normal drop-the-database-and-set-up-again scripts that I've talked about at #2443, so I added a fix in 9b5f7af, which is the same as dc0af34 for #3336.

Anyway, yes, the tests pass now. mvn test -Dtest=DataversesIT,DatasetsIT,SwordIT,AdminIT,BuiltinUsersIT,UsersIT,UtilIT,ConfirmEmailIT yields "Tests run: 22, Failures: 0, Errors: 0, Skipped: 0... BUILD SUCCESS". Thanks!

djbrooke added this to the 4.6 - File Replace milestone Sep 30, 2016
@pdurbin
Member

pdurbin commented Oct 3, 2016

@michbarsinai I reviewed https://github.com/IQSS/dataverse/pull/3381/files as of 9e06128 and I was a little surprised by how extensive a rewrite this is. As long as you haven't introduced any typos anywhere, I suppose the API should continue to "just work", but @kcondon would probably appreciate a list from you of what changed so he knows what to test. I'll go ahead and pass this to QA at https://waffle.io/IQSS/dataverse so you two can figure out what the testing strategy is.

pdurbin assigned kcondon and unassigned pdurbin Oct 3, 2016
@pdurbin
Member

pdurbin commented Oct 5, 2016

@michbarsinai any thoughts on rolling a fix for #3209 into your pull request?

@michbarsinai
Member Author

Sure, feel free to roll it in (I'm still traveling back from the conference, so it might take me some time to get to it).

@pdurbin
Member

pdurbin commented Oct 6, 2016

@michbarsinai I might, but let's discuss the details in the other issue. I just left a comment there: #3209 (comment)

@kcondon
Contributor

kcondon commented Oct 19, 2016

From Michael:

Changes:
1/ #3209 - putting the test endpoint under admin. This includes updates to the guides (config.rst, where we tell installers what endpoints to block).
2/ Try to perform an action you're not authorized to perform, and see that you get a 403 FORBIDDEN rather than a 400 BAD REQUEST.
3/ The following endpoints were changed, but via automated refactoring, so no change should be visible; still, you can do some smoke tests if you want (see the smoke-test sketch after this list).

  • admin/*
  • admin/datasetfield/*
  • admin/groups/*
  • admin/index/*
  • admin/savedsearches/*
  • batch/*
  • metadatablocks/*
  • info/settings/:DatasetPublishPopupCustomText (that's the only endpoint there)
  • builtin-users/*
  • datasets/*
  • roles/*
  • search/*
  • worldmap/*
  • dataverses/*
  • harvest/server/oaisets (also the harvesting client)

4/ Endpoints that had a manual code change, but should act the same:
  • GET admin/authenticationProviderFactories
  • GET admin/authenticationProviders
  • GET datasets/{id}
  • GET datasets/{id}/assignments
  • GET datasets/{id}/versions
  • GET datasets/{id}/versions/{version-id}
  • GET datasets/{id}/versions/{version-id}/files
  • DELETE datasets/{id}
  • DELETE datasets/{id}/destroy
  • GET dataverses/{id}
  • GET dataverses/{id}/metadatablocks
  • GET dataverses/{id}/roles
  • DELETE dataverses/{id}/groups/{aliasInOwner}/roleAssignees/{roleAssigneeIdentifier}
  • GET admin/groups/ip
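For the smoke tests mentioned in 3/, a minimal sketch against two of the refactored endpoints (a local installation is assumed; the output shown is just the shape of the standard response wrapper, not captured from an actual run):

$ curl -s "http://localhost:8080/api/metadatablocks"
{"status":"OK","data":[...]}
$ curl -s "http://localhost:8080/api/dataverses/root"
{"status":"OK","data":{...}}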

@pdurbin
Member

pdurbin commented Oct 19, 2016

Here is the latest result from REST Assured testing: https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-apitest-develop/edu.harvard.iq$dataverse/56/testReport/edu.harvard.iq.dataverse.api/ . @kcondon and I looked at console output too and generally poked around. There are a lot of assumptions in the REST Assured tests. They expect the "burrito" key to be there so you can create users, for example. I'm happy to document all of them in the Dev Guide some day.

@michbarsinai
Member Author

@pdurbin we all expect a burrito!


@pdurbin
Member

pdurbin commented Oct 21, 2016

@kcondon this morning I mentioned a "gotcha": the REST Assured test suite assumes that ordinary users can create both datasets and dataverses in the root dataverse. On http://phoenix.dataverse.org I run https://github.com/IQSS/dataverse/blob/v4.5.1/scripts/search/tests/grant-authusers-add-on-root to set these permissions. It passes the following to the root dataverse:

{
  "assignee": ":authenticated-users",
  "role": "fullContributor"
}
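For reference, a sketch of the equivalent curl call (the actual grant-authusers-add-on-root script may differ in its flags; $SERVER_URL and $ADMIN_API_TOKEN are placeholders for the installation URL and a superuser token):

$ curl -s -H "X-Dataverse-key: $ADMIN_API_TOKEN" -X POST \
    -H "Content-Type: application/json" \
    -d '{"assignee": ":authenticated-users", "role": "fullContributor"}' \
    "$SERVER_URL/api/dataverses/root/assignments"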

(The REST Assured tests weren't working on @raprasad's laptop for the same reason. #2443 is related.)

Anyway, I hope this helps. Post-install setup of phoenix is all here: https://github.com/IQSS/dataverse/blob/v4.5.1/scripts/deploy/phoenix.dataverse.org/post . The only other important thing is that the root dataverse needs to be published for the REST Assured test suite to run.

@kcondon
Contributor

kcondon commented Oct 24, 2016

Verified: 1. moving the test endpoint; 2. the response code change from 400 to 403 for permission issues, using Phil's integration tests; 3. the refactored endpoints, by a combination of code review to confirm the systemic nature of the changes and spot-checking calls using existing scripts (setup-all) and Phil's integration tests. Now I will look at the last set of changes for coverage.

@kcondon
Contributor

kcondon commented Oct 24, 2016

I've asked Phil to review the test coverage for his integration tests and update a spreadsheet accordingly so we can better assess future work, and I've asked Michael about a RESTful API self-discovery feature to make it easier to understand the scope of the API. This is primarily future work, though.

@kcondon
Contributor

kcondon commented Oct 24, 2016

OK found 1 regression: #3425

raprasad added a commit that referenced this issue Oct 24, 2016
…ve status http error codes to add/replace api calls
@kcondon
Contributor

kcondon commented Oct 24, 2016

OK, aside from the above regression and an inability to get one endpoint to work at all:
DELETE dataverses/{id}/groups/{aliasInOwner}/roleAssignees/{roleAssigneeIdentifier}

Everything looks ready to merge. Following up with Michael on the above endpoint.
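For the record, a sketch of what a call to that endpoint looks like (the group alias and assignee identifier are placeholders, not the values actually used in testing):

$ curl -s -X DELETE -H "X-Dataverse-key: $API_TOKEN" \
    "$SERVER_URL/api/dataverses/root/groups/someGroupAlias/roleAssignees/@someUser"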

kcondon closed this as completed Oct 24, 2016
@pdurbin
Member

pdurbin commented Oct 26, 2016

I've asked Phil to review the test coverage for his integration tests and update a spreadsheet accordingly

@kcondon I took a quick look at the spreadsheet (1AA-BGtyQywbi8whmnBQnQGJCHU_rpPURNPXq4ttTToE so I can find it again) but too late, I suppose, since #3381 has already been merged. Unfortunately, I haven't been able to figure out a good way to measure code coverage for REST Assured tests (unit test coverage is shown at https://coveralls.io/github/IQSS/dataverse ). Please see #2746 and how I wrote "Generate code coverage reports for integration tests: pkainulainen/maven-examples#3 and http://www.petrikainulainen.net/programming/maven/creating-code-coverage-reports-for-unit-and-integration-tests-with-the-jacoco-maven-plugin/ ". It's an unsolved problem.

@kcondon
Contributor

kcondon commented Oct 26, 2016

@pdurbin Just a quick eyeball of which API endpoints are called by your integration tests is what I'm looking for. A more comprehensive, automated system down the line would be welcome.

@pdurbin
Member

pdurbin commented Oct 26, 2016

@kcondon I just spent 10 minutes adding a "Phil's integration tests as of pull request 3381 being merged (Phil's take)" column to the spreadsheet. I hope it helps!

@kcondon
Contributor

kcondon commented Oct 26, 2016

@pdurbin Much better, thanks. It would be good to log the specific API calls as part of the test output and to confirm in the code those you've marked as "unsure". From your comment I know you are looking at automated code coverage reports, but there is also room/need for top-down, checklist-style coverage, such as the exercise we just went through for this ticket. Thanks again for the effort.

@pdurbin
Member

pdurbin commented Oct 26, 2016

@kcondon yeah, a top-down checklist is a good idea. For now, I added it to the list of ideas in #2746 under "documentation".

@kcondon
Contributor

kcondon commented Oct 27, 2016

@pdurbin Thanks, I also added to #2746 the bit about logging the endpoints in test output. I know it's a work in progress but hopefully we can understand the scope of any test and its behavior through test output.
