
[APM] Set preference to any value for APM searches #110480

Merged

Conversation

MiriamAparicio
Contributor

@MiriamAparicio MiriamAparicio commented Aug 30, 2021

Summary

Closes #102725

If the cluster state and selected shards do not change, searches using the same preference value are routed to the same shards in the same order, improving the performance of search requests.

What was done

  • Add a `preference` param to the search request wrapper
  • Fix unit tests
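A minimal sketch of what such a wrapper change might look like (the type and function names here are illustrative assumptions, not the actual Kibana APM code):

```typescript
// Illustrative sketch: the real APM search request wrapper lives in the
// Kibana repo; these names and types are assumptions for demonstration.
interface SearchParams {
  index: string;
  body: Record<string, unknown>;
  preference?: string;
}

// Wrap outgoing search requests so they all carry a fixed preference value,
// keeping shard routing stable between identical searches.
function withPreference(params: SearchParams, preference = 'any'): SearchParams {
  return { ...params, preference };
}

const wrapped = withPreference({
  index: 'apm-*',
  body: { query: { match_all: {} } },
});
console.log(wrapped.preference); // 'any'
```

Centralizing this in the wrapper means every APM search picks up the setting without touching individual call sites.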

Checklist

@MiriamAparicio MiriamAparicio requested a review from a team as a code owner August 30, 2021 14:09
@botelastic botelastic bot added the Team:APM All issues that need APM UI Team support label Aug 30, 2021
@elasticmachine
Contributor

Pinging @elastic/apm-ui (Team:apm)

@MiriamAparicio MiriamAparicio added v7.16.0 apm:performance APM UI - Performance Work release_note:skip Skip the PR/issue when compiling release notes labels Aug 30, 2021
Member

@sorenlouv sorenlouv left a comment


lgtm

@MiriamAparicio MiriamAparicio merged commit c568a43 into elastic:master Aug 31, 2021
@MiriamAparicio MiriamAparicio deleted the set-preference-for-apm-searches branch August 31, 2021 13:12
@sorenlouv
Member

@MiriamAparicio I see #102725 is still open. Is that intentional? If not, in the future you can change:

This PR addresses the issues from #102725

To

Closes #102725

which will automatically close the issue.

@sorenlouv
Member

@MiriamAparicio One more thing: you can add the auto-backport label, and this change will automatically be backported to the selected branches (in this case just 7.16)

@MiriamAparicio MiriamAparicio added the auto-backport Deprecated - use backport:version if exact versions are needed label Aug 31, 2021
kibanamachine pushed a commit to kibanamachine/kibana that referenced this pull request Aug 31, 2021
@kibanamachine
Contributor

💚 Backport successful

Branch: 7.x

This backport PR will be merged automatically after passing CI.

@kibanamachine
Contributor

kibanamachine commented Aug 31, 2021

💔 Build Failed

Failed CI Steps


Test Failures

Kibana Pipeline / general / Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/ml/anomaly_detection/advanced_job·ts.machine learning anomaly detection advanced job "before all" hook in "advanced job"

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has not failed recently on tracked branches

[00:00:00]       │
[00:00:00]         └-: machine learning
[00:00:00]           └-> "before all" hook in "machine learning"
[00:00:00]           └-: 
[00:00:00]             └-> "before all" hook in ""
[00:00:00]             └-> "before all" hook in ""
[00:00:00]               │ debg creating role ft_ml_source
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [ft_ml_source]
[00:00:00]               │ debg creating role ft_ml_source_readonly
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [ft_ml_source_readonly]
[00:00:00]               │ debg creating role ft_ml_dest
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [ft_ml_dest]
[00:00:00]               │ debg creating role ft_ml_dest_readonly
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [ft_ml_dest_readonly]
[00:00:00]               │ debg creating role ft_ml_ui_extras
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [ft_ml_ui_extras]
[00:00:00]               │ debg creating role ft_default_space_ml_all
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [ft_default_space_ml_all]
[00:00:00]               │ debg creating role ft_default_space1_ml_all
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [ft_default_space1_ml_all]
[00:00:00]               │ debg creating role ft_all_spaces_ml_all
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [ft_all_spaces_ml_all]
[00:00:00]               │ debg creating role ft_default_space_ml_read
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [ft_default_space_ml_read]
[00:00:00]               │ debg creating role ft_default_space1_ml_read
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [ft_default_space1_ml_read]
[00:00:00]               │ debg creating role ft_all_spaces_ml_read
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [ft_all_spaces_ml_read]
[00:00:00]               │ debg creating role ft_default_space_ml_none
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [ft_default_space_ml_none]
[00:00:00]               │ debg creating user ft_ml_poweruser
[00:00:00]               │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [ft_ml_poweruser]
[00:00:00]               │ debg created user ft_ml_poweruser
[00:00:00]               │ debg creating user ft_ml_poweruser_spaces
[00:00:00]               │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [ft_ml_poweruser_spaces]
[00:00:00]               │ debg created user ft_ml_poweruser_spaces
[00:00:00]               │ debg creating user ft_ml_poweruser_space1
[00:00:00]               │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [ft_ml_poweruser_space1]
[00:00:00]               │ debg created user ft_ml_poweruser_space1
[00:00:00]               │ debg creating user ft_ml_poweruser_all_spaces
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [ft_ml_poweruser_all_spaces]
[00:00:01]               │ debg created user ft_ml_poweruser_all_spaces
[00:00:01]               │ debg creating user ft_ml_viewer
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [ft_ml_viewer]
[00:00:01]               │ debg created user ft_ml_viewer
[00:00:01]               │ debg creating user ft_ml_viewer_spaces
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [ft_ml_viewer_spaces]
[00:00:01]               │ debg created user ft_ml_viewer_spaces
[00:00:01]               │ debg creating user ft_ml_viewer_space1
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [ft_ml_viewer_space1]
[00:00:01]               │ debg created user ft_ml_viewer_space1
[00:00:01]               │ debg creating user ft_ml_viewer_all_spaces
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [ft_ml_viewer_all_spaces]
[00:00:01]               │ debg created user ft_ml_viewer_all_spaces
[00:00:01]               │ debg creating user ft_ml_unauthorized
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [ft_ml_unauthorized]
[00:00:01]               │ debg created user ft_ml_unauthorized
[00:00:01]               │ debg creating user ft_ml_unauthorized_spaces
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [ft_ml_unauthorized_spaces]
[00:00:01]               │ debg created user ft_ml_unauthorized_spaces
[00:08:50]             └-: anomaly detection
[00:08:50]               └-> "before all" hook in "anomaly detection"
[00:23:29]               └-: advanced job
[00:23:29]                 └-> "before all" hook in "advanced job"
[00:23:29]                 └-> "before all" hook in "advanced job"
[00:23:29]                   │ info [x-pack/test/functional/es_archives/ml/ecommerce] Loading "mappings.json"
[00:23:29]                   │ info [x-pack/test/functional/es_archives/ml/ecommerce] Loading "data.json.gz"
[00:23:29]                   │ info [x-pack/test/functional/es_archives/ml/ecommerce] Skipped restore for existing index "ft_ecommerce"
[00:23:29]                   │ debg Searching for 'index-pattern' with title 'ft_ecommerce'...
[00:23:29]                   │ debg  > Found '1c2284a0-0a6d-11ec-94ba-67469095791d'
[00:23:29]                   │ debg Index pattern with title 'ft_ecommerce' already exists. Nothing to create.
[00:23:29]                   │ debg applying update to kibana config: {"dateFormat:tz":"UTC"}
[00:23:30]                   │ debg Creating calendar with id 'wizard-test-calendar_1630421550819'...
[00:23:30]                   │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [.ml-meta] creating index, cause [auto(bulk api)], templates [], shards [1]/[1]
[00:23:30]                   │ info [o.e.c.r.a.AllocationService] [node-01] updating number_of_replicas to [0] for indices [.ml-meta]
[00:23:30]                   │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [.ml-annotations-6] creating index, cause [api], templates [], shards [1]/[1]
[00:23:30]                   │ info [o.e.c.r.a.AllocationService] [node-01] updating number_of_replicas to [0] for indices [.ml-annotations-6]
[00:23:30]                   │ debg Waiting up to 5000ms for 'wizard-test-calendar_1630421550819' to exist...
[00:23:30]                   │ debg > Calendar created.
[00:23:30]                   │ debg SecurityPage.forceLogout
[00:23:30]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=100
[00:23:30]                   │ debg --- retry.tryForTime error: .login-form is not displayed
[00:23:30]                   │ debg Redirecting to /logout to force the logout
[00:23:31]                   │ debg Waiting on the login form to appear
[00:23:31]                   │ debg Waiting for Login Page to appear.
[00:23:31]                   │ debg Waiting up to 100000ms for login page...
[00:23:31]                   │ debg browser[INFO] http://localhost:61231/logout?_t=1630422962810 281 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:23:31]                   │
[00:23:31]                   │ debg browser[INFO] http://localhost:61231/bootstrap.js 41:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:23:31]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=2500
[00:23:33]                   │ debg browser[INFO] http://localhost:61231/login?msg=LOGGED_OUT 281 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:23:33]                   │
[00:23:33]                   │ debg browser[INFO] http://localhost:61231/bootstrap.js 41:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:23:33]                   │ debg --- retry.tryForTime error: .login-form is not displayed
[00:23:34]                   │ERROR browser[SEVERE] http://localhost:61231/api/licensing/info - Failed to load resource: the server responded with a status of 401 (Unauthorized)
[00:23:34]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=2500
[00:23:34]                   │ debg TestSubjects.exists(loginForm)
[00:23:34]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="loginForm"]') with timeout=2500
[00:23:34]                   │ debg Waiting for Login Form to appear.
[00:23:34]                   │ debg Waiting up to 100000ms for login form...
[00:23:34]                   │ debg TestSubjects.exists(loginForm)
[00:23:34]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="loginForm"]') with timeout=2500
[00:23:34]                   │ debg TestSubjects.setValue(loginUsername, ft_ml_poweruser)
[00:23:34]                   │ debg TestSubjects.click(loginUsername)
[00:23:34]                   │ debg Find.clickByCssSelector('[data-test-subj="loginUsername"]') with timeout=10000
[00:23:34]                   │ debg Find.findByCssSelector('[data-test-subj="loginUsername"]') with timeout=10000
[00:23:35]                   │ debg TestSubjects.setValue(loginPassword, mlp001)
[00:23:35]                   │ debg TestSubjects.click(loginPassword)
[00:23:35]                   │ debg Find.clickByCssSelector('[data-test-subj="loginPassword"]') with timeout=10000
[00:23:35]                   │ debg Find.findByCssSelector('[data-test-subj="loginPassword"]') with timeout=10000
[00:23:35]                   │ debg TestSubjects.click(loginSubmit)
[00:23:35]                   │ debg Find.clickByCssSelector('[data-test-subj="loginSubmit"]') with timeout=10000
[00:23:35]                   │ debg Find.findByCssSelector('[data-test-subj="loginSubmit"]') with timeout=10000
[00:23:35]                   │ debg Waiting for login result, expected: chrome.
[00:23:35]                   │ debg Find.findByCssSelector('[data-test-subj="userMenuAvatar"]') with timeout=20000
[00:23:35]                   │ proc [kibana]   log   [15:16:07.292] [info][plugins][routes][security] Logging in with provider "basic" (basic)
[00:23:45]                   │ debg browser[INFO] http://localhost:61231/app/home 281 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:23:45]                   │
[00:23:45]                   │ debg browser[INFO] http://localhost:61231/bootstrap.js 41:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:23:45]                   │ERROR browser[SEVERE] http://localhost:61231/internal/security/me - Failed to load resource: net::ERR_NETWORK_CHANGED
[00:23:45]                   │ debg browser[INFO] http://localhost:61231/45804/bundles/core/core.entry.js 12:151634 "Detected an unhandled Promise rejection.
[00:23:45]                   │      TypeError: Failed to fetch"
[00:23:45]                   │ERROR browser[SEVERE] http://localhost:61231/45804/bundles/core/core.entry.js 5:2752 
[00:23:55]                   │ info Taking screenshot "/dev/shm/workspace/parallel/23/kibana/x-pack/test/functional/screenshots/failure/machine learning  anomaly detection advanced job _before all_ hook in _advanced job_.png"
[00:23:55]                   │ info Current URL is: http://localhost:61231/app/home#/
[00:23:55]                   │ info Saving page source to: /dev/shm/workspace/parallel/23/kibana/x-pack/test/functional/failure_debug/html/machine learning  anomaly detection advanced job _before all_ hook in _advanced job_.html
[00:23:55]                   └- ✖ fail: machine learning  anomaly detection advanced job "before all" hook in "advanced job"
[00:23:55]                   │      TimeoutError: Waiting for element to be located By(css selector, [data-test-subj="userMenuAvatar"])
[00:23:55]                   │ Wait timed out after 20288ms
[00:23:55]                   │       at /dev/shm/workspace/parallel/23/kibana/node_modules/selenium-webdriver/lib/webdriver.js:842:17
[00:23:55]                   │       at runMicrotasks (<anonymous>)
[00:23:55]                   │       at processTicksAndRejections (internal/process/task_queues.js:95:5)
[00:23:55]                   │ 
[00:23:55]                   │ 

Stack Trace

TimeoutError: Waiting for element to be located By(css selector, [data-test-subj="userMenuAvatar"])
Wait timed out after 20288ms
    at /dev/shm/workspace/parallel/23/kibana/node_modules/selenium-webdriver/lib/webdriver.js:842:17
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (internal/process/task_queues.js:95:5) {
  remoteStacktrace: ''
}

Kibana Pipeline / general / X-Pack Reporting API Integration Tests.x-pack/test/reporting_api_integration/reporting_and_security/ilm_migration_apis·ts.Reporting APIs ILM policy migration APIs detects when no migration is needed

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has not failed recently on tracked branches

[00:00:00]       │
[00:00:00]         └-: Reporting APIs
[00:00:00]           └-> "before all" hook in "Reporting APIs"
[00:00:00]           └-> "before all" hook in "Reporting APIs"
[00:00:00]             │ debg creating role data_analyst
[00:00:00]             │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [data_analyst]
[00:00:00]             │ debg creating role test_reporting_user
[00:00:00]             │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [test_reporting_user]
[00:00:00]             │ debg creating user data_analyst
[00:00:00]             │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [data_analyst]
[00:00:00]             │ debg created user data_analyst
[00:00:00]             │ debg creating user reporting_user
[00:00:00]             │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [reporting_user]
[00:00:00]             │ debg created user reporting_user
[00:04:42]           └-: ILM policy migration APIs
[00:04:42]             └-> "before all" hook for "detects when no migration is needed"
[00:04:42]             └-> "before all" hook for "detects when no migration is needed"
[00:04:42]               │ info [x-pack/test/functional/es_archives/reporting/logs] Loading "mappings.json"
[00:04:42]               │ info [x-pack/test/functional/es_archives/reporting/logs] Loading "data.json.gz"
[00:04:42]               │ info [o.e.c.m.MetadataDeleteIndexService] [node-01] [.kibana_task_manager_8.0.0_001/CZPXG3hnTMC3qOC-LwNPWA] deleting index
[00:04:42]               │ info [o.e.c.m.MetadataDeleteIndexService] [node-01] [.kibana_8.0.0_001/8iwPiFzmQmax1SYQRTmT6g] deleting index
[00:04:42]               │ info [x-pack/test/functional/es_archives/reporting/logs] Deleted existing index ".kibana_8.0.0_001"
[00:04:42]               │ info [x-pack/test/functional/es_archives/reporting/logs] Deleted existing index ".kibana_task_manager_8.0.0_001"
[00:04:42]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [.kibana_1] creating index, cause [api], templates [], shards [1]/[1]
[00:04:42]               │ info [x-pack/test/functional/es_archives/reporting/logs] Created index ".kibana_1"
[00:04:42]               │ debg [x-pack/test/functional/es_archives/reporting/logs] ".kibana_1" settings {"index":{"number_of_replicas":"1","number_of_shards":"1"}}
[00:04:42]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_1/UunIKJd0Rt-1r4wcUs7FyA] update_mapping [_doc]
[00:04:42]               │ info [x-pack/test/functional/es_archives/reporting/logs] Indexed 2 docs into ".kibana"
[00:04:42]               │ debg Migrating saved objects
[00:04:42]               │ proc [kibana]   log   [15:58:34.538] [info][savedobjects-service] [.kibana_task_manager] INIT -> CREATE_NEW_TARGET. took: 2ms.
[00:04:42]               │ proc [kibana]   log   [15:58:34.540] [info][savedobjects-service] [.kibana] INIT -> WAIT_FOR_YELLOW_SOURCE. took: 6ms.
[00:04:42]               │ proc [kibana]   log   [15:58:34.542] [info][savedobjects-service] [.kibana] WAIT_FOR_YELLOW_SOURCE -> CHECK_UNKNOWN_DOCUMENTS. took: 2ms.
[00:04:42]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [.kibana_task_manager_8.0.0_001] creating index, cause [api], templates [], shards [1]/[1]
[00:04:42]               │ info [o.e.c.r.a.AllocationService] [node-01] updating number_of_replicas to [0] for indices [.kibana_task_manager_8.0.0_001]
[00:04:42]               │ proc [kibana]   log   [15:58:34.548] [info][savedobjects-service] [.kibana] CHECK_UNKNOWN_DOCUMENTS -> SET_SOURCE_WRITE_BLOCK. took: 6ms.
[00:04:42]               │ info [o.e.c.m.MetadataIndexStateService] [node-01] adding block write to indices [[.kibana_1/UunIKJd0Rt-1r4wcUs7FyA]]
[00:04:42]               │ info [o.e.c.m.MetadataIndexStateService] [node-01] completed adding block write to indices [.kibana_1]
[00:04:42]               │ proc [kibana]   log   [15:58:34.613] [info][savedobjects-service] [.kibana_task_manager] CREATE_NEW_TARGET -> MARK_VERSION_INDEX_READY. took: 75ms.
[00:04:42]               │ proc [kibana]   log   [15:58:34.628] [info][savedobjects-service] [.kibana] SET_SOURCE_WRITE_BLOCK -> CALCULATE_EXCLUDE_FILTERS. took: 80ms.
[00:04:42]               │ proc [kibana]   log   [15:58:34.632] [info][savedobjects-service] [.kibana] CALCULATE_EXCLUDE_FILTERS -> CREATE_REINDEX_TEMP. took: 4ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.645] [info][savedobjects-service] [.kibana_task_manager] MARK_VERSION_INDEX_READY -> DONE. took: 32ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.646] [info][savedobjects-service] [.kibana_task_manager] Migration completed after 110ms
[00:04:43]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [.kibana_8.0.0_reindex_temp] creating index, cause [api], templates [], shards [1]/[1]
[00:04:43]               │ info [o.e.c.r.a.AllocationService] [node-01] updating number_of_replicas to [0] for indices [.kibana_8.0.0_reindex_temp]
[00:04:43]               │ proc [kibana]   log   [15:58:34.696] [info][savedobjects-service] [.kibana] CREATE_REINDEX_TEMP -> REINDEX_SOURCE_TO_TEMP_OPEN_PIT. took: 63ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.699] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_OPEN_PIT -> REINDEX_SOURCE_TO_TEMP_READ. took: 4ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.704] [info][savedobjects-service] [.kibana] Starting to process 2 documents.
[00:04:43]               │ proc [kibana]   log   [15:58:34.705] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_READ -> REINDEX_SOURCE_TO_TEMP_INDEX. took: 5ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.714] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_INDEX -> REINDEX_SOURCE_TO_TEMP_INDEX_BULK. took: 10ms.
[00:04:43]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_reindex_temp/Dhd_qSytRjmC8Xaio_9qfg] update_mapping [_doc]
[00:04:43]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_reindex_temp/Dhd_qSytRjmC8Xaio_9qfg] update_mapping [_doc]
[00:04:43]               │ proc [kibana]   log   [15:58:34.766] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_INDEX_BULK -> REINDEX_SOURCE_TO_TEMP_READ. took: 52ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.772] [info][savedobjects-service] [.kibana] Processed 2 documents out of 2.
[00:04:43]               │ proc [kibana]   log   [15:58:34.773] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_READ -> REINDEX_SOURCE_TO_TEMP_CLOSE_PIT. took: 6ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.775] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_CLOSE_PIT -> SET_TEMP_WRITE_BLOCK. took: 3ms.
[00:04:43]               │ info [o.e.c.m.MetadataIndexStateService] [node-01] adding block write to indices [[.kibana_8.0.0_reindex_temp/Dhd_qSytRjmC8Xaio_9qfg]]
[00:04:43]               │ info [o.e.c.m.MetadataIndexStateService] [node-01] completed adding block write to indices [.kibana_8.0.0_reindex_temp]
[00:04:43]               │ proc [kibana]   log   [15:58:34.811] [info][savedobjects-service] [.kibana] SET_TEMP_WRITE_BLOCK -> CLONE_TEMP_TO_TARGET. took: 36ms.
[00:04:43]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] applying create index request using existing index [.kibana_8.0.0_reindex_temp] metadata
[00:04:43]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [.kibana_8.0.0_001] creating index, cause [clone_index], templates [], shards [1]/[1]
[00:04:43]               │ info [o.e.c.r.a.AllocationService] [node-01] updating number_of_replicas to [0] for indices [.kibana_8.0.0_001]
[00:04:43]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_001/EPqV4PIrRQKEsMXiahELaQ] create_mapping
[00:04:43]               │ proc [kibana]   log   [15:58:34.899] [info][savedobjects-service] [.kibana] CLONE_TEMP_TO_TARGET -> REFRESH_TARGET. took: 88ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.902] [info][savedobjects-service] [.kibana] REFRESH_TARGET -> OUTDATED_DOCUMENTS_SEARCH_OPEN_PIT. took: 3ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.904] [info][savedobjects-service] [.kibana] OUTDATED_DOCUMENTS_SEARCH_OPEN_PIT -> OUTDATED_DOCUMENTS_SEARCH_READ. took: 2ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.908] [info][savedobjects-service] [.kibana] OUTDATED_DOCUMENTS_SEARCH_READ -> OUTDATED_DOCUMENTS_SEARCH_CLOSE_PIT. took: 4ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.910] [info][savedobjects-service] [.kibana] OUTDATED_DOCUMENTS_SEARCH_CLOSE_PIT -> UPDATE_TARGET_MAPPINGS. took: 2ms.
[00:04:43]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_001/EPqV4PIrRQKEsMXiahELaQ] update_mapping [_doc]
[00:04:43]               │ proc [kibana]   log   [15:58:34.959] [info][savedobjects-service] [.kibana] UPDATE_TARGET_MAPPINGS -> UPDATE_TARGET_MAPPINGS_WAIT_FOR_TASK. took: 49ms.
[00:04:43]               │ info [o.e.t.LoggingTaskListener] [node-01] 23906 finished with response BulkByScrollResponse[took=16.7ms,timed_out=false,sliceId=null,updated=2,created=0,deleted=0,batches=1,versionConflicts=0,noops=0,retries=0,throttledUntil=0s,bulk_failures=[],search_failures=[]]
[00:04:43]               │ proc [kibana]   log   [15:58:35.064] [info][savedobjects-service] [.kibana] UPDATE_TARGET_MAPPINGS_WAIT_FOR_TASK -> MARK_VERSION_INDEX_READY. took: 105ms.
[00:04:43]               │ info [o.e.c.m.MetadataDeleteIndexService] [node-01] [.kibana_8.0.0_reindex_temp/Dhd_qSytRjmC8Xaio_9qfg] deleting index
[00:04:43]               │ proc [kibana]   log   [15:58:35.099] [info][savedobjects-service] [.kibana] MARK_VERSION_INDEX_READY -> DONE. took: 35ms.
[00:04:43]               │ proc [kibana]   log   [15:58:35.100] [info][savedobjects-service] [.kibana] Migration completed after 566ms
[00:04:43]               │ debg [x-pack/test/functional/es_archives/reporting/logs] Migrated Kibana index after loading Kibana data
[00:04:44]               │ debg [x-pack/test/functional/es_archives/reporting/logs] Ensured that default space exists in .kibana
[00:04:44]               │ info [x-pack/test/functional/es_archives/logstash_functional] Loading "mappings.json"
[00:04:44]               │ info [x-pack/test/functional/es_archives/logstash_functional] Loading "data.json.gz"
[00:04:44]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [logstash-2015.09.22] creating index, cause [api], templates [], shards [1]/[0]
[00:04:44]               │ info [x-pack/test/functional/es_archives/logstash_functional] Created index "logstash-2015.09.22"
[00:04:44]               │ debg [x-pack/test/functional/es_archives/logstash_functional] "logstash-2015.09.22" settings {"index":{"analysis":{"analyzer":{"url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:04:44]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [logstash-2015.09.20] creating index, cause [api], templates [], shards [1]/[0]
[00:04:44]               │ info [x-pack/test/functional/es_archives/logstash_functional] Created index "logstash-2015.09.20"
[00:04:44]               │ debg [x-pack/test/functional/es_archives/logstash_functional] "logstash-2015.09.20" settings {"index":{"analysis":{"analyzer":{"url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:04:44]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [logstash-2015.09.21] creating index, cause [api], templates [], shards [1]/[0]
[00:04:44]               │ info [x-pack/test/functional/es_archives/logstash_functional] Created index "logstash-2015.09.21"
[00:04:44]               │ debg [x-pack/test/functional/es_archives/logstash_functional] "logstash-2015.09.21" settings {"index":{"analysis":{"analyzer":{"url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:04:44]               │ info [o.e.c.m.MetadataMappingService] [node-01] [logstash-2015.09.21/O1r1sTtwReWhbTueOj4dnw] update_mapping [_doc]
[00:04:45]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_task_manager_8.0.0_001/3w8jiA9sRuK8cPjyZNX6XA] update_mapping [_doc]
[00:04:47]               │ info [o.e.c.m.MetadataMappingService] [node-01] [logstash-2015.09.20/StdE16uZQ8W7xVf5mGpIHg] update_mapping [_doc]
[00:04:48]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_001/EPqV4PIrRQKEsMXiahELaQ] update_mapping [_doc]
[00:04:48]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_001/EPqV4PIrRQKEsMXiahELaQ] update_mapping [_doc]
[00:04:48]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_001/EPqV4PIrRQKEsMXiahELaQ] update_mapping [_doc]
[00:04:48]               │ proc [kibana]   log   [15:58:40.351] [warning][collector-set][plugins][usage-collection][usageCollection] ReferenceError: Cannot access 'error' before initialization
[00:04:48]               │ proc [kibana]     at /dev/shm/workspace/kibana-build-24/x-pack/plugins/fleet/server/collectors/fleet_server_collector.js:43:27
[00:04:48]               │ proc [kibana]     at runMicrotasks (<anonymous>)
[00:04:48]               │ proc [kibana]     at processTicksAndRejections (internal/process/task_queues.js:95:5)
[00:04:48]               │ proc [kibana]     at getFleetServerUsage (/dev/shm/workspace/kibana-build-24/x-pack/plugins/fleet/server/collectors/fleet_server_collector.js:38:24)
[00:04:48]               │ proc [kibana]     at UsageCollector.fetch (/dev/shm/workspace/kibana-build-24/x-pack/plugins/fleet/server/collectors/register.js:41:23)
[00:04:48]               │ proc [kibana]     at /dev/shm/workspace/kibana-build-24/src/plugins/usage_collection/server/collector/collector_set.js:104:21
[00:04:48]               │ proc [kibana]     at async Promise.all (index 32)
[00:04:48]               │ proc [kibana]     at Object.bulkFetch (/dev/shm/workspace/kibana-build-24/src/plugins/usage_collection/server/collector/collector_set.js:91:25)
[00:04:48]               │ proc [kibana]     at getKibana (/dev/shm/workspace/kibana-build-24/src/plugins/telemetry/server/telemetry_collection/get_kibana.js:68:17)
[00:04:48]               │ proc [kibana]     at async Promise.all (index 3)
[00:04:48]               │ proc [kibana]     at /dev/shm/workspace/kibana-build-24/src/plugins/telemetry/server/telemetry_collection/get_local_stats.js:79:76
[00:04:48]               │ proc [kibana]     at async Promise.all (index 0)
[00:04:48]               │ proc [kibana]     at getLocalStats (/dev/shm/workspace/kibana-build-24/src/plugins/telemetry/server/telemetry_collection/get_local_stats.js:78:10)
[00:04:48]               │ proc [kibana]     at async Promise.all (index 0)
[00:04:48]               │ proc [kibana]     at Object.getStatsWithXpack [as statsGetter] (/dev/shm/workspace/kibana-build-24/x-pack/plugins/telemetry_collection_xpack/server/telemetry_collection/get_stats_with_xpack.js:27:48)
[00:04:48]               │ proc [kibana]     at TelemetryCollectionManagerPlugin.getUsageForCollection (/dev/shm/workspace/kibana-build-24/src/plugins/telemetry_collection_manager/server/plugin.js:256:19)
[00:04:48]               │ proc [kibana]   log   [15:58:40.353] [warning][collector-set][plugins][usage-collection][usageCollection] Unable to fetch data from fleet collector
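
The `ReferenceError: Cannot access 'error' before initialization` logged above is a temporal-dead-zone error: a `let`/`const` binding is read before the line that declares it has executed, so the engine throws instead of yielding `undefined`. The following is a minimal, hypothetical reproduction of that failure mode — it is not the actual `fleet_server_collector.js` code, and the function name is illustrative only:

```javascript
// Hypothetical sketch: reading a `const` binding inside its temporal dead zone.
function getFleetServerUsage() {
  // `error` is referenced here, but its declaration below has not run yet,
  // so this line throws "ReferenceError: Cannot access 'error' before initialization".
  const message = error ? error.message : 'ok';
  const error = null; // declaration is only reached after the read above
  return message;
}

let caught = null;
try {
  getFleetServerUsage();
} catch (e) {
  caught = e;
}

console.log(caught instanceof ReferenceError); // true
```

Hoisting the declaration above its first use (or falling back to `var`-free restructuring of the try/catch) removes the dead zone, which is the usual fix for this class of collector error.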
[00:04:53]               │ info [x-pack/test/functional/es_archives/logstash_functional] Indexed 4634 docs into "logstash-2015.09.22"
[00:04:53]               │ info [x-pack/test/functional/es_archives/logstash_functional] Indexed 4757 docs into "logstash-2015.09.20"
[00:04:53]               │ info [x-pack/test/functional/es_archives/logstash_functional] Indexed 4614 docs into "logstash-2015.09.21"
[00:04:54]             └-> detects when no migration is needed
[00:04:54]               └-> "before each" hook: global before each for "detects when no migration is needed"
[00:04:54]               │ debg ReportingAPI.checkIlmMigrationStatus
[00:04:54]               └- ✖ fail: Reporting APIs ILM policy migration APIs detects when no migration is needed
[00:04:54]               │      Error: expected 200 "OK", got 404 "Not Found"
[00:04:54]               │       at Test._assertStatus (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:268:12)
[00:04:54]               │       at Test._assertFunction (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:283:11)
[00:04:54]               │       at Test.assert (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:173:18)
[00:04:54]               │       at assert (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:131:12)
[00:04:54]               │       at /dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:128:5
[00:04:54]               │       at Test.Request.callback (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:718:3)
[00:04:54]               │       at /dev/shm/workspace/parallel/24/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:906:18
[00:04:54]               │       at IncomingMessage.<anonymous> (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/node_modules/superagent/lib/node/parsers/json.js:19:7)
[00:04:54]               │       at endReadableNT (internal/streams/readable.js:1317:12)
[00:04:54]               │       at processTicksAndRejections (internal/process/task_queues.js:82:21)
[00:04:54]               │ 
[00:04:54]               │ 

Stack Trace

Error: expected 200 "OK", got 404 "Not Found"
    at Test._assertStatus (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:268:12)
    at Test._assertFunction (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:283:11)
    at Test.assert (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:173:18)
    at assert (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:131:12)
    at /dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:128:5
    at Test.Request.callback (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:718:3)
    at /dev/shm/workspace/parallel/24/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:906:18
    at IncomingMessage.<anonymous> (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/node_modules/superagent/lib/node/parsers/json.js:19:7)
    at endReadableNT (internal/streams/readable.js:1317:12)
    at processTicksAndRejections (internal/process/task_queues.js:82:21)

Kibana Pipeline / general / X-Pack Reporting API Integration Tests.x-pack/test/reporting_api_integration/reporting_and_security/ilm_migration_apis·ts.Reporting APIs ILM policy migration APIs "after each" hook for "detects when no migration is needed"

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has not failed recently on tracked branches

[00:00:00]       │
[00:00:00]         └-: Reporting APIs
[00:00:00]           └-> "before all" hook in "Reporting APIs"
[00:00:00]           └-> "before all" hook in "Reporting APIs"
[00:00:00]             │ debg creating role data_analyst
[00:00:00]             │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [data_analyst]
[00:00:00]             │ debg creating role test_reporting_user
[00:00:00]             │ info [o.e.x.s.a.r.TransportPutRoleAction] [node-01] added role [test_reporting_user]
[00:00:00]             │ debg creating user data_analyst
[00:00:00]             │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [data_analyst]
[00:00:00]             │ debg created user data_analyst
[00:00:00]             │ debg creating user reporting_user
[00:00:00]             │ info [o.e.x.s.a.u.TransportPutUserAction] [node-01] added user [reporting_user]
[00:00:00]             │ debg created user reporting_user
[00:04:42]           └-: ILM policy migration APIs
[00:04:42]             └-> "before all" hook for "detects when no migration is needed"
[00:04:42]             └-> "before all" hook for "detects when no migration is needed"
[00:04:42]               │ info [x-pack/test/functional/es_archives/reporting/logs] Loading "mappings.json"
[00:04:42]               │ info [x-pack/test/functional/es_archives/reporting/logs] Loading "data.json.gz"
[00:04:42]               │ info [o.e.c.m.MetadataDeleteIndexService] [node-01] [.kibana_task_manager_8.0.0_001/CZPXG3hnTMC3qOC-LwNPWA] deleting index
[00:04:42]               │ info [o.e.c.m.MetadataDeleteIndexService] [node-01] [.kibana_8.0.0_001/8iwPiFzmQmax1SYQRTmT6g] deleting index
[00:04:42]               │ info [x-pack/test/functional/es_archives/reporting/logs] Deleted existing index ".kibana_8.0.0_001"
[00:04:42]               │ info [x-pack/test/functional/es_archives/reporting/logs] Deleted existing index ".kibana_task_manager_8.0.0_001"
[00:04:42]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [.kibana_1] creating index, cause [api], templates [], shards [1]/[1]
[00:04:42]               │ info [x-pack/test/functional/es_archives/reporting/logs] Created index ".kibana_1"
[00:04:42]               │ debg [x-pack/test/functional/es_archives/reporting/logs] ".kibana_1" settings {"index":{"number_of_replicas":"1","number_of_shards":"1"}}
[00:04:42]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_1/UunIKJd0Rt-1r4wcUs7FyA] update_mapping [_doc]
[00:04:42]               │ info [x-pack/test/functional/es_archives/reporting/logs] Indexed 2 docs into ".kibana"
[00:04:42]               │ debg Migrating saved objects
[00:04:42]               │ proc [kibana]   log   [15:58:34.538] [info][savedobjects-service] [.kibana_task_manager] INIT -> CREATE_NEW_TARGET. took: 2ms.
[00:04:42]               │ proc [kibana]   log   [15:58:34.540] [info][savedobjects-service] [.kibana] INIT -> WAIT_FOR_YELLOW_SOURCE. took: 6ms.
[00:04:42]               │ proc [kibana]   log   [15:58:34.542] [info][savedobjects-service] [.kibana] WAIT_FOR_YELLOW_SOURCE -> CHECK_UNKNOWN_DOCUMENTS. took: 2ms.
[00:04:42]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [.kibana_task_manager_8.0.0_001] creating index, cause [api], templates [], shards [1]/[1]
[00:04:42]               │ info [o.e.c.r.a.AllocationService] [node-01] updating number_of_replicas to [0] for indices [.kibana_task_manager_8.0.0_001]
[00:04:42]               │ proc [kibana]   log   [15:58:34.548] [info][savedobjects-service] [.kibana] CHECK_UNKNOWN_DOCUMENTS -> SET_SOURCE_WRITE_BLOCK. took: 6ms.
[00:04:42]               │ info [o.e.c.m.MetadataIndexStateService] [node-01] adding block write to indices [[.kibana_1/UunIKJd0Rt-1r4wcUs7FyA]]
[00:04:42]               │ info [o.e.c.m.MetadataIndexStateService] [node-01] completed adding block write to indices [.kibana_1]
[00:04:42]               │ proc [kibana]   log   [15:58:34.613] [info][savedobjects-service] [.kibana_task_manager] CREATE_NEW_TARGET -> MARK_VERSION_INDEX_READY. took: 75ms.
[00:04:42]               │ proc [kibana]   log   [15:58:34.628] [info][savedobjects-service] [.kibana] SET_SOURCE_WRITE_BLOCK -> CALCULATE_EXCLUDE_FILTERS. took: 80ms.
[00:04:42]               │ proc [kibana]   log   [15:58:34.632] [info][savedobjects-service] [.kibana] CALCULATE_EXCLUDE_FILTERS -> CREATE_REINDEX_TEMP. took: 4ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.645] [info][savedobjects-service] [.kibana_task_manager] MARK_VERSION_INDEX_READY -> DONE. took: 32ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.646] [info][savedobjects-service] [.kibana_task_manager] Migration completed after 110ms
[00:04:43]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [.kibana_8.0.0_reindex_temp] creating index, cause [api], templates [], shards [1]/[1]
[00:04:43]               │ info [o.e.c.r.a.AllocationService] [node-01] updating number_of_replicas to [0] for indices [.kibana_8.0.0_reindex_temp]
[00:04:43]               │ proc [kibana]   log   [15:58:34.696] [info][savedobjects-service] [.kibana] CREATE_REINDEX_TEMP -> REINDEX_SOURCE_TO_TEMP_OPEN_PIT. took: 63ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.699] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_OPEN_PIT -> REINDEX_SOURCE_TO_TEMP_READ. took: 4ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.704] [info][savedobjects-service] [.kibana] Starting to process 2 documents.
[00:04:43]               │ proc [kibana]   log   [15:58:34.705] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_READ -> REINDEX_SOURCE_TO_TEMP_INDEX. took: 5ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.714] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_INDEX -> REINDEX_SOURCE_TO_TEMP_INDEX_BULK. took: 10ms.
[00:04:43]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_reindex_temp/Dhd_qSytRjmC8Xaio_9qfg] update_mapping [_doc]
[00:04:43]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_reindex_temp/Dhd_qSytRjmC8Xaio_9qfg] update_mapping [_doc]
[00:04:43]               │ proc [kibana]   log   [15:58:34.766] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_INDEX_BULK -> REINDEX_SOURCE_TO_TEMP_READ. took: 52ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.772] [info][savedobjects-service] [.kibana] Processed 2 documents out of 2.
[00:04:43]               │ proc [kibana]   log   [15:58:34.773] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_READ -> REINDEX_SOURCE_TO_TEMP_CLOSE_PIT. took: 6ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.775] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_CLOSE_PIT -> SET_TEMP_WRITE_BLOCK. took: 3ms.
[00:04:43]               │ info [o.e.c.m.MetadataIndexStateService] [node-01] adding block write to indices [[.kibana_8.0.0_reindex_temp/Dhd_qSytRjmC8Xaio_9qfg]]
[00:04:43]               │ info [o.e.c.m.MetadataIndexStateService] [node-01] completed adding block write to indices [.kibana_8.0.0_reindex_temp]
[00:04:43]               │ proc [kibana]   log   [15:58:34.811] [info][savedobjects-service] [.kibana] SET_TEMP_WRITE_BLOCK -> CLONE_TEMP_TO_TARGET. took: 36ms.
[00:04:43]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] applying create index request using existing index [.kibana_8.0.0_reindex_temp] metadata
[00:04:43]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [.kibana_8.0.0_001] creating index, cause [clone_index], templates [], shards [1]/[1]
[00:04:43]               │ info [o.e.c.r.a.AllocationService] [node-01] updating number_of_replicas to [0] for indices [.kibana_8.0.0_001]
[00:04:43]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_001/EPqV4PIrRQKEsMXiahELaQ] create_mapping
[00:04:43]               │ proc [kibana]   log   [15:58:34.899] [info][savedobjects-service] [.kibana] CLONE_TEMP_TO_TARGET -> REFRESH_TARGET. took: 88ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.902] [info][savedobjects-service] [.kibana] REFRESH_TARGET -> OUTDATED_DOCUMENTS_SEARCH_OPEN_PIT. took: 3ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.904] [info][savedobjects-service] [.kibana] OUTDATED_DOCUMENTS_SEARCH_OPEN_PIT -> OUTDATED_DOCUMENTS_SEARCH_READ. took: 2ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.908] [info][savedobjects-service] [.kibana] OUTDATED_DOCUMENTS_SEARCH_READ -> OUTDATED_DOCUMENTS_SEARCH_CLOSE_PIT. took: 4ms.
[00:04:43]               │ proc [kibana]   log   [15:58:34.910] [info][savedobjects-service] [.kibana] OUTDATED_DOCUMENTS_SEARCH_CLOSE_PIT -> UPDATE_TARGET_MAPPINGS. took: 2ms.
[00:04:43]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_001/EPqV4PIrRQKEsMXiahELaQ] update_mapping [_doc]
[00:04:43]               │ proc [kibana]   log   [15:58:34.959] [info][savedobjects-service] [.kibana] UPDATE_TARGET_MAPPINGS -> UPDATE_TARGET_MAPPINGS_WAIT_FOR_TASK. took: 49ms.
[00:04:43]               │ info [o.e.t.LoggingTaskListener] [node-01] 23906 finished with response BulkByScrollResponse[took=16.7ms,timed_out=false,sliceId=null,updated=2,created=0,deleted=0,batches=1,versionConflicts=0,noops=0,retries=0,throttledUntil=0s,bulk_failures=[],search_failures=[]]
[00:04:43]               │ proc [kibana]   log   [15:58:35.064] [info][savedobjects-service] [.kibana] UPDATE_TARGET_MAPPINGS_WAIT_FOR_TASK -> MARK_VERSION_INDEX_READY. took: 105ms.
[00:04:43]               │ info [o.e.c.m.MetadataDeleteIndexService] [node-01] [.kibana_8.0.0_reindex_temp/Dhd_qSytRjmC8Xaio_9qfg] deleting index
[00:04:43]               │ proc [kibana]   log   [15:58:35.099] [info][savedobjects-service] [.kibana] MARK_VERSION_INDEX_READY -> DONE. took: 35ms.
[00:04:43]               │ proc [kibana]   log   [15:58:35.100] [info][savedobjects-service] [.kibana] Migration completed after 566ms
[00:04:43]               │ debg [x-pack/test/functional/es_archives/reporting/logs] Migrated Kibana index after loading Kibana data
[00:04:44]               │ debg [x-pack/test/functional/es_archives/reporting/logs] Ensured that default space exists in .kibana
[00:04:44]               │ info [x-pack/test/functional/es_archives/logstash_functional] Loading "mappings.json"
[00:04:44]               │ info [x-pack/test/functional/es_archives/logstash_functional] Loading "data.json.gz"
[00:04:44]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [logstash-2015.09.22] creating index, cause [api], templates [], shards [1]/[0]
[00:04:44]               │ info [x-pack/test/functional/es_archives/logstash_functional] Created index "logstash-2015.09.22"
[00:04:44]               │ debg [x-pack/test/functional/es_archives/logstash_functional] "logstash-2015.09.22" settings {"index":{"analysis":{"analyzer":{"url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:04:44]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [logstash-2015.09.20] creating index, cause [api], templates [], shards [1]/[0]
[00:04:44]               │ info [x-pack/test/functional/es_archives/logstash_functional] Created index "logstash-2015.09.20"
[00:04:44]               │ debg [x-pack/test/functional/es_archives/logstash_functional] "logstash-2015.09.20" settings {"index":{"analysis":{"analyzer":{"url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:04:44]               │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [logstash-2015.09.21] creating index, cause [api], templates [], shards [1]/[0]
[00:04:44]               │ info [x-pack/test/functional/es_archives/logstash_functional] Created index "logstash-2015.09.21"
[00:04:44]               │ debg [x-pack/test/functional/es_archives/logstash_functional] "logstash-2015.09.21" settings {"index":{"analysis":{"analyzer":{"url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:04:44]               │ info [o.e.c.m.MetadataMappingService] [node-01] [logstash-2015.09.21/O1r1sTtwReWhbTueOj4dnw] update_mapping [_doc]
[00:04:45]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_task_manager_8.0.0_001/3w8jiA9sRuK8cPjyZNX6XA] update_mapping [_doc]
[00:04:47]               │ info [o.e.c.m.MetadataMappingService] [node-01] [logstash-2015.09.20/StdE16uZQ8W7xVf5mGpIHg] update_mapping [_doc]
[00:04:48]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_001/EPqV4PIrRQKEsMXiahELaQ] update_mapping [_doc]
[00:04:48]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_001/EPqV4PIrRQKEsMXiahELaQ] update_mapping [_doc]
[00:04:48]               │ info [o.e.c.m.MetadataMappingService] [node-01] [.kibana_8.0.0_001/EPqV4PIrRQKEsMXiahELaQ] update_mapping [_doc]
[00:04:54]             └-> "after each" hook for "detects when no migration is needed"
[00:04:54]               │ debg ReportingAPI.deleteAllReports
[00:04:54]               │ debg ReportingAPI.migrateReportingIndices
[00:04:54]               └- ✖ fail: Reporting APIs ILM policy migration APIs "after each" hook for "detects when no migration is needed"
[00:04:54]               │      Error: expected 200 "OK", got 404 "Not Found"
[00:04:54]               │       at Test._assertStatus (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:268:12)
[00:04:54]               │       at Test._assertFunction (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:283:11)
[00:04:54]               │       at Test.assert (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:173:18)
[00:04:54]               │       at assert (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:131:12)
[00:04:54]               │       at /dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:128:5
[00:04:54]               │       at Test.Request.callback (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:718:3)
[00:04:54]               │       at /dev/shm/workspace/parallel/24/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:906:18
[00:04:54]               │       at IncomingMessage.<anonymous> (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/node_modules/superagent/lib/node/parsers/json.js:19:7)
[00:04:54]               │       at endReadableNT (internal/streams/readable.js:1317:12)
[00:04:54]               │       at processTicksAndRejections (internal/process/task_queues.js:82:21)
[00:04:54]               │ 
[00:04:54]               │ 

Stack Trace

Error: expected 200 "OK", got 404 "Not Found"
    at Test._assertStatus (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:268:12)
    at Test._assertFunction (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:283:11)
    at Test.assert (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:173:18)
    at assert (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:131:12)
    at /dev/shm/workspace/parallel/24/kibana/node_modules/supertest/lib/test.js:128:5
    at Test.Request.callback (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:718:3)
    at /dev/shm/workspace/parallel/24/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:906:18
    at IncomingMessage.<anonymous> (/dev/shm/workspace/parallel/24/kibana/node_modules/supertest/node_modules/superagent/lib/node/parsers/json.js:19:7)
    at endReadableNT (internal/streams/readable.js:1317:12)
    at processTicksAndRejections (internal/process/task_queues.js:82:21)

Metrics [docs]

✅ unchanged

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

sorenlouv pushed a commit that referenced this pull request Aug 31, 2021
…110631)

Co-authored-by: Miriam <31922082+MiriamAparicio@users.noreply.github.com>
@dgieselaar dgieselaar added v7.15.2 auto-backport Deprecated - use backport:version if exact versions are needed and removed auto-backport Deprecated - use backport:version if exact versions are needed labels Oct 21, 2021
kibanamachine pushed a commit to kibanamachine/kibana that referenced this pull request Oct 21, 2021
@kibanamachine
Contributor

💔 Backport failed

Status Branch Result
7.16 Cherrypick failed because the selected commit (c568a43) is empty. Did you already backport this commit?
7.15

Successful backport PRs will be merged automatically after passing CI.

To backport manually run:
node scripts/backport --pr 110480

dgieselaar pushed a commit to dgieselaar/kibana that referenced this pull request Oct 21, 2021
kibanamachine added a commit that referenced this pull request Oct 21, 2021
Co-authored-by: Miriam <31922082+MiriamAparicio@users.noreply.github.com>
Labels
apm:performance APM UI - Performance Work auto-backport Deprecated - use backport:version if exact versions are needed release_note:skip Skip the PR/issue when compiling release notes Team:APM All issues that need APM UI Team support v7.15.2 v7.16.0
Development

Successfully merging this pull request may close these issues.

[APM] Set preference to any value for APM searches
5 participants