
[App Search] API Logs: Add ApiLogsTable and NewApiEventsPrompt components #96008

Merged
merged 7 commits into elastic:master from api-logs-3 on Apr 2, 2021

Conversation

cee-chen
Member

@cee-chen cee-chen commented Mar 31, 2021

Summary

Second-to-last PR in the series. The next one adds the "Details" view functionality.

Screencaps

paginating

new_event

Engine Overview:

QA

  • Go to the documents view of any engine and perform at least 10 different searches
  • Go to the API logs view
  • Confirm that the searches are displayed with approximately the correct timestamps and that pagination works as expected
  • Leave the previous API logs open
  • Open a new tab and perform a document search in said new tab
  • Confirm that after 5 seconds, the new search API event appears - the "New events" prompt should show up, and clicking refresh should update the table
  • Go to the Engine Overview
  • Confirm that the API logs table shows up there (with no pagination), and that new document searches also trigger a "New events" prompt

Checklist

@cee-chen cee-chen added Feature:Plugins release_note:skip Skip the PR/issue when compiling release notes v7.13.0 auto-backport Deprecated - use backport:version if exact versions are needed labels Mar 31, 2021
@cee-chen cee-chen requested a review from a team March 31, 2021 23:18
Comment on lines +59 to +61
<EuiPageContent hasBorder>
<EuiPageContentBody>
<EuiFlexGroup gutterSize="m" alignItems="center" responsive={false} wrap>
Member Author

@cee-chen cee-chen Mar 31, 2021


Everything inside here is mostly the same as before; I just added EuiPageContent and EuiPageContentBody wrappers. I recommend viewing with whitespace changes hidden for easier diffing.
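
For reference, roughly how the new nesting reads with closing tags (the inner placeholder comment is mine, not from the diff):

<EuiPageContent hasBorder>
  <EuiPageContentBody>
    <EuiFlexGroup gutterSize="m" alignItems="center" responsive={false} wrap>
      {/* existing API logs content, unchanged apart from whitespace */}
    </EuiFlexGroup>
  </EuiPageContentBody>
</EuiPageContent>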

@@ -0,0 +1,3 @@
.apiLogDetailButton {
height: $euiSizeL !important;
Member Author


With:

Without:

I was trying to maintain the more compact rows from our standalone UI (as well as matching our other tables), but if we think that's overkill I can get rid of it.

FWIW, without the !important I'd need 3 levels of nesting to override EUI's styles, so I took the lazy way out 😬

Member


If it were me, I'd say just let EUI do its thing. However, if there's something unique about this table that is forcing the rows to be taller than usual, then I'd say this is a worthwhile change.

It's up to you though. Is it worthwhile to drop a comment in the CSS about why you added that, or nah?

To be clear, I'm good either way, I'll approve it regardless.

Member Author

@cee-chen cee-chen Apr 1, 2021


The first screenshot more closely matches the row heights in the examples in EUI's docs. The main difference is that their buttons are all icon buttons rather than empty buttons 🤷 It does seem odd that EUI wouldn't have an OOTB affordance for EuiButtonEmpty matching the same line height as an EuiLink.

A comment makes a ton of sense! I'll add one here in a bit.


Comment on lines +2 to +3
padding: $euiSizeXS;
padding-left: $euiSizeS;
Member Author


FYI: without this custom padding, the panel height exceeds the row height and causes the page to bounce around awkwardly, so I added this as a polish item.

Comment on lines +14 to +21
export const getStatusColor = (status: number) => {
let color = '';
if (status >= 100 && status < 300) color = 'secondary';
if (status >= 300 && status < 400) color = 'primary';
if (status >= 400 && status < 500) color = 'warning';
if (status >= 500) color = 'danger';
return color;
};
Member Author

@cee-chen cee-chen Mar 31, 2021


I basically just copied this util over as-is from the standalone UI. Definitely open to thoughts or suggestions if there are any changes you'd like to see.
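
For context, a minimal sketch of how this util is meant to feed the upcoming EuiHealth components in a table column (the column shape and EuiHealth import are assumptions, not the code under review):

// Hypothetical column definition: getStatusColor maps HTTP status ranges
// to EUI color names ('secondary' | 'primary' | 'warning' | 'danger')
const statusColumn = {
  field: 'status',
  name: 'Status',
  render: (status: number) => (
    <EuiHealth color={getStatusColor(status)}>{status}</EuiHealth>
  ),
};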

Member


Fine with me

@cee-chen
Member Author

cee-chen commented Apr 1, 2021

@elasticmachine merge upstream

}
>
TODO: API Logs Table
{/* <ApiLogsTable hidePagination={true} /> */}
<ApiLogsTable />
Member


It's really cool that we were able to reuse this in both places.

Member

@JasonStoltz JasonStoltz left a comment


LGTM, but I think it would be worthwhile to consider switching your test to mock FormattedRelative to avoid those console warnings.
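
One way that mock could look (a sketch only; the '@kbn/i18n/react' import path and the stubbed return value are assumptions):

// Replace FormattedRelative with a stub so react-intl stops emitting
// console warnings in the jsdom test environment
jest.mock('@kbn/i18n/react', () => ({
  ...jest.requireActual('@kbn/i18n/react'),
  FormattedRelative: jest.fn(() => 'a few seconds ago'),
}));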

Comment on lines +94 to +95
wrapper.find('[data-test-subj="ApiLogsTableDetailsButton"]').first().simulate('click');
// TODO: API log details flyout
Member


So it looks like you intend to test the flyout opening in the primary renders test?

Member Author


Oops, just realized I forgot to respond to this - sorry Jason! Yeah, the tables are a bit weird testing-wise, and mount is a somewhat expensive render, so I figured I'd just throw it in the main test. I can move it out though if we prefer.
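
Roughly what that looks like in the main test (a sketch under assumed imports and table internals; the flyout assertion stays TODO until the next PR):

it('renders', () => {
  const wrapper = mount(<ApiLogsTable />); // mount() needed for the table internals

  expect(wrapper.find(EuiBasicTable)).toHaveLength(1);

  // Reuse the same (expensive) mounted tree to exercise the details button
  wrapper.find('[data-test-subj="ApiLogsTableDetailsButton"]').first().simulate('click');
  // TODO: assert the API log details flyout opens once it exists
});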

Comment on lines +14 to +21
export const getStatusColor = (status: number) => {
let color = '';
if (status >= 100 && status < 300) color = 'secondary';
if (status >= 300 && status < 400) color = 'primary';
if (status >= 400 && status < 500) color = 'warning';
if (status >= 500) color = 'danger';
return color;
};
Member


Fine with me

@cee-chen cee-chen enabled auto-merge (squash) April 1, 2021 22:57
Member

@JasonStoltz JasonStoltz left a comment


Great work Constance!

@cee-chen cee-chen merged commit 7db838e into elastic:master Apr 2, 2021
kibanamachine pushed a commit to kibanamachine/kibana that referenced this pull request Apr 2, 2021
…ents (elastic#96008)

* Set up getStatusColor util for upcoming EuiHealth components

* Add ApiLogsTable component

* Add NewApiEventsPrompt component

* Update ApiLogs view with new components

+ add EuiPageContent wrapper (missed this originally)

* Update EngineOverview with new components

* PR feedback: Comments + mock FormattedRelative

* Fix type error
@kibanamachine
Contributor

💚 Backport successful

7.x / #96127

This backport PR will be merged automatically after passing CI.

kibanamachine added a commit that referenced this pull request Apr 2, 2021
…ents (#96008) (#96127)

* Set up getStatusColor util for upcoming EuiHealth components

* Add ApiLogsTable component

* Add NewApiEventsPrompt component

* Update ApiLogs view with new components

+ add EuiPageContent wrapper (missed this originally)

* Update EngineOverview with new components

* PR feedback: Comments + mock FormattedRelative

* Fix type error

Co-authored-by: Constance <constancecchen@users.noreply.github.com>
@cee-chen cee-chen deleted the api-logs-3 branch April 2, 2021 16:19
@kibanamachine
Contributor

kibanamachine commented Jun 23, 2021

💔 Build Failed

Failed CI Steps


Test Failures

Kibana Pipeline / general / Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/maps.maps app "before all" hook in "maps app"

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has failed 3 times on tracked branches: https://github.com/elastic/kibana/issues/50387

[00:00:00]       │
[00:00:00]         └-: maps app
[00:00:00]           └-> "before all" hook in "maps app"
[00:00:00]           └-> "before all" hook in "maps app"
[00:00:00]             │ info [logstash_functional] Loading "mappings.json"
[00:00:00]             │ info [logstash_functional] Loading "data.json.gz"
[00:00:00]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [logstash-2015.09.22] creating index, cause [api], templates [], shards [1]/[0]
[00:00:00]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[logstash-2015.09.22][0]]])." previous.health="YELLOW" reason="shards started [[logstash-2015.09.22][0]]"
[00:00:00]             │ info [logstash_functional] Created index "logstash-2015.09.22"
[00:00:00]             │ debg [logstash_functional] "logstash-2015.09.22" settings {"index":{"analysis":{"analyzer":{"url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:00:00]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [logstash-2015.09.20] creating index, cause [api], templates [], shards [1]/[0]
[00:00:00]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[logstash-2015.09.20][0]]])." previous.health="YELLOW" reason="shards started [[logstash-2015.09.20][0]]"
[00:00:00]             │ info [logstash_functional] Created index "logstash-2015.09.20"
[00:00:00]             │ debg [logstash_functional] "logstash-2015.09.20" settings {"index":{"analysis":{"analyzer":{"url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:00:00]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [logstash-2015.09.21] creating index, cause [api], templates [], shards [1]/[0]
[00:00:00]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[logstash-2015.09.21][0]]])." previous.health="YELLOW" reason="shards started [[logstash-2015.09.21][0]]"
[00:00:00]             │ info [logstash_functional] Created index "logstash-2015.09.21"
[00:00:00]             │ debg [logstash_functional] "logstash-2015.09.21" settings {"index":{"analysis":{"analyzer":{"url":{"max_token_length":"1000","tokenizer":"uax_url_email","type":"standard"}}},"number_of_replicas":"0","number_of_shards":"1"}}
[00:00:00]             │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [logstash-2015.09.21/OhyWhv2ASqGKnl6gGB-9Xw] update_mapping [_doc]
[00:00:03]             │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [logstash-2015.09.20/gncxWv-aSQGXkJa6fnRtSQ] update_mapping [_doc]
[00:00:10]             │ info progress: 13827
[00:00:10]             │ info [logstash_functional] Indexed 4634 docs into "logstash-2015.09.22"
[00:00:10]             │ info [logstash_functional] Indexed 4757 docs into "logstash-2015.09.20"
[00:00:10]             │ info [logstash_functional] Indexed 4614 docs into "logstash-2015.09.21"
[00:00:10]             │ info [maps/data] Loading "mappings.json"
[00:00:10]             │ info [maps/data] Loading "data.json"
[00:00:10]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [geo_shapes] creating index, cause [api], templates [], shards [1]/[0]
[00:00:10]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[geo_shapes][0]]])." previous.health="YELLOW" reason="shards started [[geo_shapes][0]]"
[00:00:10]             │ info [maps/data] Created index "geo_shapes"
[00:00:10]             │ debg [maps/data] "geo_shapes" settings {"index":{"number_of_replicas":"0","number_of_shards":"1","max_result_window":"10001","max_inner_result_window":"101"}}
[00:00:10]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [meta_for_geo_shapes] creating index, cause [api], templates [], shards [1]/[0]
[00:00:10]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[meta_for_geo_shapes][0]]])." previous.health="YELLOW" reason="shards started [[meta_for_geo_shapes][0]]"
[00:00:10]             │ info [maps/data] Created index "meta_for_geo_shapes"
[00:00:10]             │ debg [maps/data] "meta_for_geo_shapes" settings {"index":{"number_of_replicas":"0","number_of_shards":"1"}}
[00:00:10]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [antimeridian_points] creating index, cause [api], templates [], shards [1]/[0]
[00:00:10]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[antimeridian_points][0]]])." previous.health="YELLOW" reason="shards started [[antimeridian_points][0]]"
[00:00:10]             │ info [maps/data] Created index "antimeridian_points"
[00:00:10]             │ debg [maps/data] "antimeridian_points" settings {"index":{"number_of_replicas":"0","number_of_shards":"1"}}
[00:00:10]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [antimeridian_shapes] creating index, cause [api], templates [], shards [1]/[0]
[00:00:10]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[antimeridian_shapes][0]]])." previous.health="YELLOW" reason="shards started [[antimeridian_shapes][0]]"
[00:00:10]             │ info [maps/data] Created index "antimeridian_shapes"
[00:00:10]             │ debg [maps/data] "antimeridian_shapes" settings {"index":{"number_of_replicas":"0","number_of_shards":"1"}}
[00:00:10]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [flights] creating index, cause [api], templates [], shards [1]/[0]
[00:00:10]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[flights][0]]])." previous.health="YELLOW" reason="shards started [[flights][0]]"
[00:00:10]             │ info [maps/data] Created index "flights"
[00:00:10]             │ debg [maps/data] "flights" settings {"index":{"number_of_replicas":"0","number_of_shards":"1"}}
[00:00:10]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [connections] creating index, cause [api], templates [], shards [1]/[0]
[00:00:10]             │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[connections][0]]])." previous.health="YELLOW" reason="shards started [[connections][0]]"
[00:00:10]             │ info [maps/data] Created index "connections"
[00:00:10]             │ debg [maps/data] "connections" settings {"index":{"number_of_replicas":"0","number_of_shards":"1"}}
[00:00:10]             │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [meta_for_geo_shapes/0BJLxoYyRNW1nffMe0IoxA] update_mapping [_doc]
[00:00:10]             │ info [maps/data] Indexed 4 docs into "geo_shapes"
[00:00:10]             │ info [maps/data] Indexed 6 docs into "meta_for_geo_shapes"
[00:00:10]             │ info [maps/data] Indexed 3 docs into "antimeridian_points"
[00:00:10]             │ info [maps/data] Indexed 3 docs into "antimeridian_shapes"
[00:00:10]             │ info [maps/data] Indexed 3 docs into "flights"
[00:00:10]             │ info [maps/data] Indexed 4 docs into "connections"
[00:00:10]             │ info [maps/kibana] Loading "mappings.json"
[00:00:10]             │ info [maps/kibana] Loading "data.json"
[00:00:10]             │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [.kibana_8.0.0_001/cHHlxY9MToCQoHZAmm0c0g] deleting index
[00:00:10]             │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [.kibana_task_manager_8.0.0_001/-JtRk82ERCmfUaSbNAjBdQ] deleting index
[00:00:10]             │ info [maps/kibana] Deleted existing index ".kibana_8.0.0_001"
[00:00:10]             │ info [maps/kibana] Deleted existing index ".kibana_task_manager_8.0.0_001"
[00:00:10]             │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] failed on parsing mappings on index creation [.kibana]
[00:00:10]             │      org.elasticsearch.index.mapper.MapperParsingException: Failed to parse mapping: using deprecated parameters [tree] in mapper [bounds] of type [geo_shape] is no longer allowed
[00:00:10]             │      	at org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:294) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.index.mapper.MapperService.mergeAndApplyMappings(MapperService.java:272) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:256) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.updateIndexMappingsAndBuildSortOrder(MetadataCreateIndexService.java:983) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.lambda$applyCreateIndexWithTemporaryService$2(MetadataCreateIndexService.java:408) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.indices.IndicesService.withTempIndexService(IndicesService.java:628) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexWithTemporaryService(MetadataCreateIndexService.java:406) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequestWithV1Templates(MetadataCreateIndexService.java:483) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:369) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:376) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.ClusterStateUpdateTask.execute(ClusterStateUpdateTask.java:48) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:686) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:308) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:203) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:140) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:139) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:177) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:678) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:241) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:204) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) [?:?]
[00:00:10]             │      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) [?:?]
[00:00:10]             │      	at java.lang.Thread.run(Thread.java:831) [?:?]
[00:00:10]             │      Caused by: java.lang.IllegalArgumentException: using deprecated parameters [tree] in mapper [bounds] of type [geo_shape] is no longer allowed
[00:00:10]             │      	at org.elasticsearch.xpack.spatial.index.mapper.GeoShapeWithDocValuesFieldMapper.lambda$static$0(GeoShapeWithDocValuesFieldMapper.java:189) ~[?:?]
[00:00:10]             │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:318) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:237) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parse(ObjectMapper.java:207) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:318) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:237) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:153) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:98) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:92) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	at org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:292) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:00:10]             │      	... 23 more
[00:00:10]             │ info Taking screenshot "/dev/shm/workspace/parallel/17/kibana/x-pack/test/functional/screenshots/failure/maps app _before all_ hook in _maps app_.png"
[00:00:11]             │ info Current URL is: data:/,
[00:00:11]             │ info Saving page source to: /dev/shm/workspace/parallel/17/kibana/x-pack/test/functional/failure_debug/html/maps app _before all_ hook in _maps app_.html
[00:00:11]             └- ✖ fail: maps app "before all" hook in "maps app"
[00:00:11]             │      ResponseError: mapper_parsing_exception
[00:00:11]             │       at onBody (/dev/shm/workspace/parallel/17/kibana/node_modules/@elastic/elasticsearch/lib/Transport.js:337:23)
[00:00:11]             │       at IncomingMessage.onEnd (/dev/shm/workspace/parallel/17/kibana/node_modules/@elastic/elasticsearch/lib/Transport.js:264:11)
[00:00:11]             │       at endReadableNT (internal/streams/readable.js:1327:12)
[00:00:11]             │       at processTicksAndRejections (internal/process/task_queues.js:80:21)
[00:00:11]             │ 
[00:00:11]             │ 

Stack Trace

ResponseError: mapper_parsing_exception
    at onBody (/dev/shm/workspace/parallel/17/kibana/node_modules/@elastic/elasticsearch/lib/Transport.js:337:23)
    at IncomingMessage.onEnd (/dev/shm/workspace/parallel/17/kibana/node_modules/@elastic/elasticsearch/lib/Transport.js:264:11)
    at endReadableNT (internal/streams/readable.js:1327:12)
    at processTicksAndRejections (internal/process/task_queues.js:80:21) {
  meta: {
    body: { error: [Object], status: 400 },
    statusCode: 400,
    headers: {
      'x-elastic-product': 'Elasticsearch',
      'content-type': 'application/json;charset=utf-8',
      'content-length': '527'
    },
    meta: {
      context: null,
      request: [Object],
      name: 'elasticsearch-js',
      connection: [Object],
      attempts: 0,
      aborted: false
    }
  }
}

Kibana Pipeline / general / Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/discover/feature_controls/discover_security·ts.discover feature controls discover feature controls security "before all" hook in "discover feature controls security"

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has not failed recently on tracked branches

[00:00:00]       │
[00:13:01]         └-: discover
[00:13:01]           └-> "before all" hook in "discover"
[00:13:01]           └-: feature controls
[00:13:01]             └-> "before all" hook in "feature controls"
[00:13:01]             └-: discover feature controls security
[00:13:01]               └-> "before all" hook in "discover feature controls security"
[00:13:01]               └-> "before all" hook in "discover feature controls security"
[00:13:01]                 │ info [discover/feature_controls/security] Loading "mappings.json"
[00:13:01]                 │ info [discover/feature_controls/security] Loading "data.json"
[00:13:01]                 │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [.kibana_task_manager_8.0.0_001/qKWidMdATh2-VpgNh_NPpQ] deleting index
[00:13:01]                 │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [.kibana_pre6.5.0_001/GzdxoYCfSjSIQaaHOGKs2A] deleting index
[00:13:01]                 │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [.kibana_8.0.0_001/6znuzDjaS7-ldxzx-f4FFw] deleting index
[00:13:01]                 │ info [discover/feature_controls/security] Deleted existing index ".kibana_8.0.0_001"
[00:13:01]                 │ info [discover/feature_controls/security] Deleted existing index ".kibana_task_manager_8.0.0_001"
[00:13:01]                 │ info [discover/feature_controls/security] Deleted existing index ".kibana_pre6.5.0_001"
[00:13:01]                 │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] failed on parsing mappings on index creation [.kibana]
[00:13:01]                 │      org.elasticsearch.index.mapper.MapperParsingException: Failed to parse mapping: using deprecated parameters [tree] in mapper [bounds] of type [geo_shape] is no longer allowed
[00:13:01]                 │      	at org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:294) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.MapperService.mergeAndApplyMappings(MapperService.java:272) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:256) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.updateIndexMappingsAndBuildSortOrder(MetadataCreateIndexService.java:983) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.lambda$applyCreateIndexWithTemporaryService$2(MetadataCreateIndexService.java:408) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.indices.IndicesService.withTempIndexService(IndicesService.java:628) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexWithTemporaryService(MetadataCreateIndexService.java:406) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequestWithV1Templates(MetadataCreateIndexService.java:483) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:369) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:376) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.ClusterStateUpdateTask.execute(ClusterStateUpdateTask.java:48) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:686) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:308) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:203) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:140) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:139) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:177) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:678) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:241) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:204) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) [?:?]
[00:13:01]                 │      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) [?:?]
[00:13:01]                 │      	at java.lang.Thread.run(Thread.java:831) [?:?]
[00:13:01]                 │      Caused by: java.lang.IllegalArgumentException: using deprecated parameters [tree] in mapper [bounds] of type [geo_shape] is no longer allowed
[00:13:01]                 │      	at org.elasticsearch.xpack.spatial.index.mapper.GeoShapeWithDocValuesFieldMapper.lambda$static$0(GeoShapeWithDocValuesFieldMapper.java:189) ~[?:?]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:318) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:237) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parse(ObjectMapper.java:207) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:318) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:237) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:153) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:98) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:92) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:292) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	... 23 more
[00:13:01]                 │ info Taking screenshot "/dev/shm/workspace/parallel/19/kibana/x-pack/test/functional/screenshots/failure/discover feature controls discover feature controls security _before all_ hook in _discover feature controls security_.png"
[00:13:01]                 │ proc [kibana]   log   [19:07:14.945] [warning][environment] Detected an unhandled Promise rejection.
[00:13:01]                 │ proc [kibana] Error: Saved object [space/default] not found
[00:13:01]                 │ info Current URL is: http://localhost:61191/app/management/insightsAndAlerting/watcher/watches
[00:13:01]                 │ info Saving page source to: /dev/shm/workspace/parallel/19/kibana/x-pack/test/functional/failure_debug/html/discover feature controls discover feature controls security _before all_ hook in _discover feature controls security_.html
[00:13:01]                 └- ✖ fail: discover feature controls discover feature controls security "before all" hook in "discover feature controls security"
[00:13:01]                 │      ResponseError: mapper_parsing_exception
[00:13:01]                 │       at onBody (/dev/shm/workspace/parallel/19/kibana/node_modules/@elastic/elasticsearch/lib/Transport.js:337:23)
[00:13:01]                 │       at IncomingMessage.onEnd (/dev/shm/workspace/parallel/19/kibana/node_modules/@elastic/elasticsearch/lib/Transport.js:264:11)
[00:13:01]                 │       at endReadableNT (internal/streams/readable.js:1327:12)
[00:13:01]                 │       at processTicksAndRejections (internal/process/task_queues.js:80:21)
[00:13:01]                 │ 
[00:13:01]                 │ 

Stack Trace

ResponseError: mapper_parsing_exception
    at onBody (/dev/shm/workspace/parallel/19/kibana/node_modules/@elastic/elasticsearch/lib/Transport.js:337:23)
    at IncomingMessage.onEnd (/dev/shm/workspace/parallel/19/kibana/node_modules/@elastic/elasticsearch/lib/Transport.js:264:11)
    at endReadableNT (internal/streams/readable.js:1327:12)
    at processTicksAndRejections (internal/process/task_queues.js:80:21) {
  meta: {
    body: { error: [Object], status: 400 },
    statusCode: 400,
    headers: {
      'x-elastic-product': 'Elasticsearch',
      'content-type': 'application/json;charset=utf-8',
      'content-length': '527'
    },
    meta: {
      context: null,
      request: [Object],
      name: 'elasticsearch-js',
      connection: [Object],
      attempts: 0,
      aborted: false
    }
  }
}

Kibana Pipeline / general / Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/discover/feature_controls/discover_security·ts.discover feature controls discover feature controls security "after all" hook in "discover feature controls security"

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has not failed recently on tracked branches

[00:00:00]       │
[00:13:01]         └-: discover
[00:13:01]           └-> "before all" hook in "discover"
[00:13:01]           └-: feature controls
[00:13:01]             └-> "before all" hook in "feature controls"
[00:13:01]             └-: discover feature controls security
[00:13:01]               └-> "before all" hook in "discover feature controls security"
[00:13:01]               └-> "before all" hook in "discover feature controls security"
[00:13:01]                 │ info [discover/feature_controls/security] Loading "mappings.json"
[00:13:01]                 │ info [discover/feature_controls/security] Loading "data.json"
[00:13:01]                 │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [.kibana_task_manager_8.0.0_001/qKWidMdATh2-VpgNh_NPpQ] deleting index
[00:13:01]                 │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [.kibana_pre6.5.0_001/GzdxoYCfSjSIQaaHOGKs2A] deleting index
[00:13:01]                 │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] [.kibana_8.0.0_001/6znuzDjaS7-ldxzx-f4FFw] deleting index
[00:13:01]                 │ info [discover/feature_controls/security] Deleted existing index ".kibana_8.0.0_001"
[00:13:01]                 │ info [discover/feature_controls/security] Deleted existing index ".kibana_task_manager_8.0.0_001"
[00:13:01]                 │ info [discover/feature_controls/security] Deleted existing index ".kibana_pre6.5.0_001"
[00:13:01]                 │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1624472168585201576] failed on parsing mappings on index creation [.kibana]
[00:13:01]                 │      org.elasticsearch.index.mapper.MapperParsingException: Failed to parse mapping: using deprecated parameters [tree] in mapper [bounds] of type [geo_shape] is no longer allowed
[00:13:01]                 │      	at org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:294) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.MapperService.mergeAndApplyMappings(MapperService.java:272) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:256) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.updateIndexMappingsAndBuildSortOrder(MetadataCreateIndexService.java:983) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.lambda$applyCreateIndexWithTemporaryService$2(MetadataCreateIndexService.java:408) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.indices.IndicesService.withTempIndexService(IndicesService.java:628) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexWithTemporaryService(MetadataCreateIndexService.java:406) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequestWithV1Templates(MetadataCreateIndexService.java:483) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:369) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:376) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.ClusterStateUpdateTask.execute(ClusterStateUpdateTask.java:48) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:686) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:308) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:203) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:140) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:139) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:177) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:678) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:241) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:204) [elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) [?:?]
[00:13:01]                 │      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) [?:?]
[00:13:01]                 │      	at java.lang.Thread.run(Thread.java:831) [?:?]
[00:13:01]                 │      Caused by: java.lang.IllegalArgumentException: using deprecated parameters [tree] in mapper [bounds] of type [geo_shape] is no longer allowed
[00:13:01]                 │      	at org.elasticsearch.xpack.spatial.index.mapper.GeoShapeWithDocValuesFieldMapper.lambda$static$0(GeoShapeWithDocValuesFieldMapper.java:189) ~[?:?]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:318) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:237) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parse(ObjectMapper.java:207) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:318) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:237) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:153) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:98) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:92) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	at org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:292) ~[elasticsearch-8.0.0-SNAPSHOT.jar:8.0.0-SNAPSHOT]
[00:13:01]                 │      	... 23 more
[00:13:01]                 │ info Taking screenshot "/dev/shm/workspace/parallel/19/kibana/x-pack/test/functional/screenshots/failure/discover feature controls discover feature controls security _before all_ hook in _discover feature controls security_.png"
[00:13:01]                 │ proc [kibana]   log   [19:07:14.945] [warning][environment] Detected an unhandled Promise rejection.
[00:13:01]                 │ proc [kibana] Error: Saved object [space/default] not found
[00:13:01]                 │ info Current URL is: http://localhost:61191/app/management/insightsAndAlerting/watcher/watches
[00:13:01]                 │ info Saving page source to: /dev/shm/workspace/parallel/19/kibana/x-pack/test/functional/failure_debug/html/discover feature controls discover feature controls security _before all_ hook in _discover feature controls security_.html
[00:13:01]                 └- ✖ fail: discover feature controls discover feature controls security "before all" hook in "discover feature controls security"
[00:13:01]                 │      ResponseError: mapper_parsing_exception
[00:13:01]                 │       at onBody (/dev/shm/workspace/parallel/19/kibana/node_modules/@elastic/elasticsearch/lib/Transport.js:337:23)
[00:13:01]                 │       at IncomingMessage.onEnd (/dev/shm/workspace/parallel/19/kibana/node_modules/@elastic/elasticsearch/lib/Transport.js:264:11)
[00:13:01]                 │       at endReadableNT (internal/streams/readable.js:1327:12)
[00:13:01]                 │       at processTicksAndRejections (internal/process/task_queues.js:80:21)
[00:13:01]                 │ 
[00:13:01]                 │ 
[00:13:01]                 └-> "after all" hook in "discover feature controls security"
[00:13:01]                   │ info [discover/feature_controls/security] Unloading indices from "mappings.json"
[00:13:01]                   │ warn since spaces are enabled, all objects other than the default space were deleted from .kibana rather than deleting the whole index
[00:13:01]                   │ info [discover/feature_controls/security] Deleted existing index ".kibana"
[00:13:01]                   │ info [discover/feature_controls/security] Unloading indices from "data.json"
[00:13:01]                   │ debg SecurityPage.forceLogout
[00:13:01]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=100
[00:13:02]                   │ debg --- retry.tryForTime error: .login-form is not displayed
[00:13:02]                   │ debg Redirecting to /logout to force the logout
[00:13:02]                   │ debg Waiting on the login form to appear
[00:13:02]                   │ debg Waiting for Login Page to appear.
[00:13:02]                   │ debg Waiting up to 100000ms for login page...
[00:13:02]                   │ debg browser[INFO] http://localhost:61191/logout?_t=1624475235716 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:13:02]                   │
[00:13:02]                   │ debg browser[INFO] http://localhost:61191/bootstrap.js 41:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:13:03]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=2500
[00:13:03]                   │ proc [kibana]   log   [19:07:16.376] [error][http] Error: Saved object [space/default] not found
[00:13:03]                   │ proc [kibana]     at Function.createGenericNotFoundError (/dev/shm/workspace/kibana-build-xpack-19/src/core/server/saved_objects/service/lib/errors.js:125:37)
[00:13:03]                   │ proc [kibana]     at SavedObjectsRepository.get (/dev/shm/workspace/kibana-build-xpack-19/src/core/server/saved_objects/service/lib/repository.js:929:46)
[00:13:03]                   │ proc [kibana]     at runMicrotasks (<anonymous>)
[00:13:03]                   │ proc [kibana]     at processTicksAndRejections (internal/process/task_queues.js:93:5)
[00:13:03]                   │ proc [kibana]     at SpacesClient.get (/dev/shm/workspace/kibana-build-xpack-19/x-pack/plugins/spaces/server/spaces_client/spaces_client.js:60:25)
[00:13:03]                   │ proc [kibana]     at checkAccess (/dev/shm/workspace/kibana-build-xpack-19/x-pack/plugins/enterprise_search/server/lib/check_access.js:56:21)
[00:13:03]                   │ proc [kibana]     at /dev/shm/workspace/kibana-build-xpack-19/x-pack/plugins/enterprise_search/server/plugin.js:104:11
[00:13:03]                   │ proc [kibana]     at /dev/shm/workspace/kibana-build-xpack-19/src/core/server/capabilities/resolve_capabilities.js:31:21
[00:13:03]                   │ proc [kibana]     at /dev/shm/workspace/kibana-build-xpack-19/src/core/server/capabilities/resolve_capabilities.js:30:26
[00:13:03]                   │ proc [kibana]     at /dev/shm/workspace/kibana-build-xpack-19/src/core/server/capabilities/routes/resolve_capabilities.js:40:26
[00:13:03]                   │ proc [kibana]     at Router.handle (/dev/shm/workspace/kibana-build-xpack-19/src/core/server/http/router/router.js:163:30)
[00:13:03]                   │ proc [kibana]     at handler (/dev/shm/workspace/kibana-build-xpack-19/src/core/server/http/router/router.js:124:50)
[00:13:03]                   │ proc [kibana]     at exports.Manager.execute (/dev/shm/workspace/kibana-build-xpack-19/node_modules/@hapi/hapi/lib/toolkit.js:60:28)
[00:13:03]                   │ proc [kibana]     at Object.internals.handler (/dev/shm/workspace/kibana-build-xpack-19/node_modules/@hapi/hapi/lib/handler.js:46:20)
[00:13:03]                   │ proc [kibana]     at exports.execute (/dev/shm/workspace/kibana-build-xpack-19/node_modules/@hapi/hapi/lib/handler.js:31:20)
[00:13:03]                   │ proc [kibana]     at Request._lifecycle (/dev/shm/workspace/kibana-build-xpack-19/node_modules/@hapi/hapi/lib/request.js:370:32) {
[00:13:03]                   │ proc [kibana]   data: null,
[00:13:03]                   │ proc [kibana]   isBoom: true,
[00:13:03]                   │ proc [kibana]   isServer: false,
[00:13:03]                   │ proc [kibana]   output: {
[00:13:03]                   │ proc [kibana]     statusCode: 404,
[00:13:03]                   │ proc [kibana]     payload: {
[00:13:03]                   │ proc [kibana]       statusCode: 404,
[00:13:03]                   │ proc [kibana]       error: 'Not Found',
[00:13:03]                   │ proc [kibana]       message: 'Saved object [space/default] not found'
[00:13:03]                   │ proc [kibana]     },
[00:13:03]                   │ proc [kibana]     headers: {}
[00:13:03]                   │ proc [kibana]   },
[00:13:03]                   │ proc [kibana]   [Symbol(SavedObjectsClientErrorCode)]: 'SavedObjectsClient/notFound'
[00:13:03]                   │ proc [kibana] }
[00:13:03]                   │ proc [kibana]  error  [19:07:16.298]  Error: Internal Server Error
[00:13:03]                   │ proc [kibana]     at HapiResponseAdapter.toInternalError (/dev/shm/workspace/kibana-build-xpack-19/src/core/server/http/router/response_adapter.js:61:19)
[00:13:03]                   │ proc [kibana]     at Router.handle (/dev/shm/workspace/kibana-build-xpack-19/src/core/server/http/router/router.js:177:34)
[00:13:03]                   │ proc [kibana]     at runMicrotasks (<anonymous>)
[00:13:03]                   │ proc [kibana]     at processTicksAndRejections (internal/process/task_queues.js:93:5)
[00:13:03]                   │ proc [kibana]     at handler (/dev/shm/workspace/kibana-build-xpack-19/src/core/server/http/router/router.js:124:50)
[00:13:03]                   │ proc [kibana]     at exports.Manager.execute (/dev/shm/workspace/kibana-build-xpack-19/node_modules/@hapi/hapi/lib/toolkit.js:60:28)
[00:13:03]                   │ proc [kibana]     at Object.internals.handler (/dev/shm/workspace/kibana-build-xpack-19/node_modules/@hapi/hapi/lib/handler.js:46:20)
[00:13:03]                   │ proc [kibana]     at exports.execute (/dev/shm/workspace/kibana-build-xpack-19/node_modules/@hapi/hapi/lib/handler.js:31:20)
[00:13:03]                   │ proc [kibana]     at Request._lifecycle (/dev/shm/workspace/kibana-build-xpack-19/node_modules/@hapi/hapi/lib/request.js:370:32)
[00:13:03]                   │ proc [kibana]     at Request._execute (/dev/shm/workspace/kibana-build-xpack-19/node_modules/@hapi/hapi/lib/request.js:279:9)
[00:13:05]                   │ERROR browser[SEVERE] http://localhost:61191/internal/spaces/_active_space - Failed to load resource: the server responded with a status of 404 (Not Found)
[00:13:05]                   │ debg browser[INFO] http://localhost:61191/41734/bundles/core/core.entry.js 12:151032 "Detected an unhandled Promise rejection.
[00:13:05]                   │      Error: Not Found"
[00:13:05]                   │ERROR browser[SEVERE] http://localhost:61191/41734/bundles/core/core.entry.js 5:2514 
[00:13:05]                   │ERROR browser[SEVERE] http://localhost:61191/api/core/capabilities?useDefaultCapabilities=true - Failed to load resource: the server responded with a status of 500 (Internal Server Error)
[00:13:05]                   │ERROR browser[SEVERE] http://localhost:61191/41734/bundles/core/core.entry.js 12:150104 Error: Internal Server Error
[00:13:05]                   │          at fetch_Fetch.fetchResponse (http://localhost:61191/41734/bundles/core/core.entry.js:6:26763)
[00:13:05]                   │          at async http://localhost:61191/41734/bundles/core/core.entry.js:6:24090
[00:13:05]                   │          at async http://localhost:61191/41734/bundles/core/core.entry.js:6:23996
[00:13:05]                   │ debg browser[INFO] http://localhost:61191/41734/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.@elastic.js 129:5498 "Version 9 of Highlight.js has reached EOL and is no longer supported.
[00:13:05]                   │      Please upgrade or ask whatever dependency you are using to upgrade.
[00:13:05]                   │      https://github.com/highlightjs/highlight.js/issues/2877"
[00:13:05]                   │ debg browser[INFO] http://localhost:61191/41734/bundles/core/core.entry.js 12:151032 "Detected an unhandled Promise rejection.
[00:13:05]                   │      Error: Internal Server Error"
[00:13:05]                   │ERROR browser[SEVERE] http://localhost:61191/41734/bundles/core/core.entry.js 5:2514 
[00:13:05]                   │ debg browser[INFO] http://localhost:61191/41734/bundles/core/core.entry.js 12:151032 "Detected an unhandled Promise rejection.
[00:13:05]                   │      EmptyError: no elements in sequence"
[00:13:05]                   │ERROR browser[SEVERE] http://localhost:61191/logout?_t=1624475235716 0:0 Uncaught e: no elements in sequence
[00:13:05]                   │ERROR browser[SEVERE] http://localhost:61191/41734/bundles/kbn-ui-shared-deps/kbn-ui-shared-deps.js 297:178122 Uncaught e: no elements in sequence
[00:13:05]                   │ debg --- retry.tryForTime error: .login-form is not displayed
[00:13:06]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=2500
[00:13:09–00:19:00]          │ debg (the two lines above repeat every few seconds for the next ~6 minutes; .login-form never becomes visible)
[00:13:52]                   │ proc [kibana]   log   [19:08:05.696] [error][plugins][taskManager] [WorkloadAggregator]: Error: Invalid workload: {"took":0,"timed_out":false,"_shards":{"total":0,"successful":0,"skipped":0,"failed":0},"hits":{"total":{"value":0,"relation":"eq"},"max_score":0,"hits":[]}}
[00:14:52–00:18:52]          │ proc [kibana]   (the same [WorkloadAggregator] "Invalid workload" error repeats once per minute, last logged at 19:13:05.717)
[00:19:00]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=2500
[00:19:01]                   └- ✖ fail: discover feature controls discover feature controls security "after all" hook in "discover feature controls security"
[00:19:01]                   │      Error: Timeout of 360000ms exceeded. For async tests and hooks, ensure "done()" is called; if returning a Promise, ensure it resolves. (/dev/shm/workspace/parallel/19/kibana/x-pack/test/functional/apps/discover/feature_controls/discover_security.ts)
[00:19:01]                   │       at listOnTimeout (internal/timers.js:554:17)
[00:19:01]                   │       at processTimers (internal/timers.js:497:7)
[00:19:01]                   │ 
[00:19:01]                   │ 

and 16 more failures, only showing the first 3.
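For anyone reading the failure: Mocha's message means the "after all" hook returned a promise that never settled — the test kept polling for `.login-form` (the repeated `Find.existsByDisplayedByCssSelector` lines above) until the 360000ms suite timeout fired. A minimal sketch of that failure mode, assuming the usual poll-until-visible pattern; `pollForSelector` and `loginFormIsShown` are illustrative stand-ins, not the real FTR `retry`/`find` services:

```ts
// Illustrative sketch – not the actual FTR services. An async hook that awaits a poll
// which never finds its element never resolves, so Mocha's suite-level timeout fires
// ("Timeout of 360000ms exceeded") instead of the hook finishing or failing cleanly.

async function pollForSelector(
  isDisplayed: () => Promise<boolean>, // e.g. "is .login-form currently visible?"
  timeoutMs: number,
  intervalMs = 2500 // mirrors the 2500ms per-attempt timeout in the log above
): Promise<void> {
  const start = Date.now();
  while (Date.now() - start < timeoutMs) {
    if (await isDisplayed()) return; // element appeared – the hook's promise resolves
    await new Promise((resolve) => setTimeout(resolve, intervalMs)); // wait, then retry
  }
  // Throwing here lets the hook fail fast with a useful error; a poll with no exit
  // condition (or one longer than Mocha's timeout) is what produces the failure above.
  throw new Error('.login-form never became visible');
}

// Hypothetical wiring inside the Mocha hook:
// after(async () => {
//   await pollForSelector(async () => loginFormIsShown(), 30_000);
// });
```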

Metrics [docs]

‼️ ERROR: no builds found for mergeBase sha [3b3ad0e]

History

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

@cee-chen
Member Author

lmfao what???
