Merge remote-tracking branch 'es/6.x' into ccr-6.x
* es/6.x:
  [Docs] Fix explanation for `from` and `size` example (#28320)
  Adapt bwc version after backport #28358
  Always return the after_key in composite aggregation response (#28358)
  Adds a note in the `terms` aggregation docs regarding pagination (#28360)
  Update packaging tests to work with meta plugins (#28336)
  Remove Painless Type from MethodWriter in favor of Java Class. (#28346)
  [Doc] Fixs typo in reverse-nested-aggregation.asciidoc (#28348)
  [Docs] Fixed Indices information breaking changes (#27914)
  Reindex: Shore up rethrottle test
  isHeldByCurrentThread should return primitive bool
  [Docs] Clarify `html` encoder in highlighting.asciidoc (#27766)
  Only assert single commit iff index created on 6.2
  Deprecate the `update_all_types` option. (#28284)
  Fix GeoDistance query example
  Settings: Introduce settings updater for a list of settings (#28338)
  [Docs] Fix asciidoc style in composite agg docs
  Adapt bwc version after backport #28310
  Adds the ability to specify a format on composite date_histogram source (#28310)
martijnvg committed Jan 25, 2018
2 parents 1351efd + 21cb09b commit 40b57c7
Showing 42 changed files with 881 additions and 179 deletions.
2 changes: 1 addition & 1 deletion docs/java-api/query-dsl/geo-distance-query.asciidoc
@@ -5,7 +5,7 @@ See {ref}/query-dsl-geo-distance-query.html[Geo Distance Query]

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{query-dsl-test}[geo_bounding_box]
include-tagged::{query-dsl-test}[geo_distance]
--------------------------------------------------
<1> field
<2> center point
53 changes: 49 additions & 4 deletions docs/reference/aggregations/bucket/composite-aggregation.asciidoc
@@ -224,8 +224,40 @@ Time values can also be specified via abbreviations supported by <<time-units,ti
Note that fractional time values are not supported, but you can address this by shifting to another
time unit (e.g., `1.5h` could instead be specified as `90m`).

[float]
===== Time Zone
====== Format

Internally, a date is represented as a 64-bit number: a timestamp in milliseconds-since-the-epoch.
These timestamps are returned as the bucket keys. It is possible to return a formatted date string
instead, using the format specified with the `format` parameter:

[source,js]
--------------------------------------------------
GET /_search
{
    "aggs" : {
        "my_buckets": {
            "composite" : {
                "sources" : [
                    {
                        "date": {
                            "date_histogram" : {
                                "field": "timestamp",
                                "interval": "1d",
                                "format": "yyyy-MM-dd" <1>
                            }
                        }
                    }
                ]
            }
        }
    }
}
--------------------------------------------------
// CONSOLE

<1> Supports expressive date <<date-format-pattern,format pattern>>
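
For reference, with a `format` set the composite bucket keys come back as formatted strings rather than
epoch-millisecond numbers, so a response to the request above would presumably look along these lines
(a sketch added for this summary, not part of the diff; the date value and document count are illustrative):

[source,js]
--------------------------------------------------
{
    ...
    "aggregations": {
        "my_buckets": {
            "after_key": {
                "date": "2017-05-09"
            },
            "buckets": [
                {
                    "key": {
                        "date": "2017-05-09"
                    },
                    "doc_count": 2
                }
            ]
        }
    }
}
--------------------------------------------------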

====== Time Zone

Date-times are stored in Elasticsearch in UTC. By default, all bucketing and
rounding is also done in UTC. The `time_zone` parameter can be used to indicate
@@ -362,6 +394,10 @@ GET /_search
...
"aggregations": {
"my_buckets": {
"after_key": { <1>
"date": 1494288000000,
"product": "mad max"
},
"buckets": [
{
"key": {
@@ -371,7 +407,7 @@
"doc_count": 1
},
{
"key": { <1>
"key": {
"date": 1494288000000,
"product": "mad max"
},
@@ -386,9 +422,14 @@

<1> The last composite bucket returned by the query.

NOTE: The `after_key` is equal to the last bucket returned in the response before
any filtering that could be done by <<search-aggregations-pipeline, Pipeline aggregations>>.
If all buckets are filtered/removed by a pipeline aggregation, the `after_key` will contain
the last bucket before filtering.

The `after` parameter can be used to retrieve the composite buckets that are **after**
the last composite buckets returned in a previous round.
For the example below the last bucket is `"key": [1494288000000, "mad max"]` so the next
For the example below, the last bucket can be found in `after_key`, and the next
round of results can be retrieved with:

[source,js]
@@ -453,6 +494,10 @@ GET /_search
...
"aggregations": {
"my_buckets": {
"after_key": {
"date": 1494201600000,
"product": "rocky"
},
"buckets": [
{
"key": {
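
The collapsed `[source,js]` block that follows the `after` paragraph above presumably holds that
follow-up request; as a rough sketch (not the literal collapsed content), it feeds the returned
`after_key` back in as `after`:

[source,js]
--------------------------------------------------
GET /_search
{
    "aggs": {
        "my_buckets": {
            "composite": {
                "sources": [
                    { "date": { "date_histogram": { "field": "timestamp", "interval": "1d" } } },
                    { "product": { "terms": { "field": "product" } } }
                ],
                "after": { "date": 1494288000000, "product": "mad max" }
            }
        }
    }
}
--------------------------------------------------
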
@@ -93,7 +93,7 @@ GET /issues/_search
// TEST[s/_search/_search\?filter_path=aggregations/]

As you can see above, the `reverse_nested` aggregation is put in to a `nested` aggregation as this is the only place
in the dsl where the `reversed_nested` aggregation can be used. Its sole purpose is to join back to a parent doc higher
in the dsl where the `reverse_nested` aggregation can be used. Its sole purpose is to join back to a parent doc higher
up in the nested structure.

<1> A `reverse_nested` aggregation that joins back to the root / main document level, because no `path` has been defined.
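
For orientation, the structure that sentence describes (a `reverse_nested` aggregation sitting inside a
`nested` aggregation and jumping back out to the parent documents) looks roughly like the sketch below;
the field names are illustrative and are not taken from the diff:

[source,js]
--------------------------------------------------
GET /issues/_search
{
    "aggs": {
        "comments": {
            "nested": { "path": "comments" },
            "aggs": {
                "top_usernames": {
                    "terms": { "field": "comments.username" },
                    "aggs": {
                        "back_to_issue": {
                            "reverse_nested": {},
                            "aggs": {
                                "top_tags": { "terms": { "field": "tags" } }
                            }
                        }
                    }
                }
            }
        }
    }
}
--------------------------------------------------
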
5 changes: 5 additions & 0 deletions docs/reference/aggregations/bucket/terms-aggregation.asciidoc
@@ -114,6 +114,11 @@ This means that if the number of unique terms is greater than `size`, the return
(it could be that the term counts are slightly off and it could even be that a term that should have been in the top
size buckets was not returned).

NOTE: If you want to retrieve **all** terms or all combinations of terms in a nested `terms` aggregation
you should use the <<search-aggregations-bucket-composite-aggregation,Composite>> aggregation, which
allows you to paginate over all possible terms, rather than setting a size greater than the cardinality of
the field in the `terms` aggregation. The `terms` aggregation is meant to return the `top` terms and does
not allow pagination.
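
As a rough illustration of that recommendation (a sketch added for this summary, with made-up field
names, not part of the diff), a nested `terms` aggregation over two fields could be replaced by a
composite aggregation whose pages are walked exhaustively:

[source,js]
--------------------------------------------------
GET /_search
{
    "aggs": {
        "all_term_combinations": {
            "composite": {
                "size": 100,
                "sources": [
                    { "product": { "terms": { "field": "product" } } },
                    { "shop": { "terms": { "field": "shop" } } }
                ]
            }
        }
    }
}
--------------------------------------------------

Each response then carries an `after_key` that can be passed back as `after` to fetch the next page of
term combinations.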

[[search-aggregations-bucket-terms-aggregation-approximate-counts]]
==== Document counts are approximate

2 changes: 1 addition & 1 deletion docs/reference/getting-started.asciidoc
@@ -858,7 +858,7 @@ GET /bank/_search

Note that if `size` is not specified, it defaults to 10.

This example does a `match_all` and returns documents 11 through 20:
This example does a `match_all` and returns documents 10 through 19:

[source,js]
--------------------------------------------------
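
For context, the request this corrected sentence refers to is presumably along these lines (a sketch,
not the literal content of the collapsed hunk above):

[source,js]
--------------------------------------------------
GET /bank/_search
{
    "query": { "match_all": {} },
    "from": 10,
    "size": 10
}
--------------------------------------------------

Because `from` is zero-based, `from: 10` starts at the 11th document, which is why the returned range is
described as documents 10 through 19 rather than 11 through 20.
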
8 changes: 5 additions & 3 deletions docs/reference/migration/migrate_6_0/rest.asciidoc
@@ -66,9 +66,11 @@ Previously it was possible to execute `GET /_aliases,_mappings` or `GET
/myindex/_settings,_alias` by separating multiple types of requests with commas
in order to retrieve multiple types of information about one or more indices.
This comma-separation for retrieving multiple pieces of information has been
removed.. `GET /_all` can be used to retrieve all aliases, settings, and
mappings for all indices. In order to retrieve only the mappings for an index,
`GET /myindex/_mappings` (or `_aliases`, or `_settings`).
removed. `GET /_all` can be used to retrieve all aliases, settings, and
mappings for all indices.

In order to retrieve only the mapping for an index use:
`GET /myindex/_mapping` (or `_alias` for a list of aliases, or `_settings` for the settings).
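
In other words, where a single comma-separated request such as `GET /myindex/_settings,_alias` used to
work, the same information is now fetched with separate requests, roughly (a sketch added for this
summary):

[source,js]
--------------------------------------------------
GET /myindex/_mapping

GET /myindex/_alias

GET /myindex/_settings
--------------------------------------------------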

==== Requests to existing endpoints with incorrect HTTP verb now return 405 responses

5 changes: 3 additions & 2 deletions docs/reference/search/request/highlighting.asciidoc
@@ -145,8 +145,9 @@ You can specify the locale to use with `boundary_scanner_locale`.
boundary_scanner_locale:: Controls which locale is used to search for sentence
and word boundaries.

encoder:: Indicates if the highlighted text should be HTML encoded:
`default` (no encoding) or `html` (escapes HTML highlighting tags).
encoder:: Indicates if the snippet should be HTML encoded:
`default` (no encoding) or `html` (HTML-escape the snippet text and then
insert the highlighting tags).

fields:: Specifies the fields to retrieve highlights for. You can use wildcards
to specify fields. For example, you could specify `comment_*` to
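
A minimal request using the `html` encoder described above might look like the following sketch (the
`comment` field is illustrative and not part of this diff):

[source,js]
--------------------------------------------------
GET /_search
{
    "query": {
        "match": { "comment": "elasticsearch" }
    },
    "highlight": {
        "encoder": "html",
        "fields": {
            "comment": {}
        }
    }
}
--------------------------------------------------

With the `html` encoder, markup already present in the stored `comment` text is escaped before the
highlighting tags are inserted, so the returned snippet stays safe to render as HTML.
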
@@ -81,7 +81,7 @@ public final class MethodWriter extends GeneratorAdapter {
private final BitSet statements;
private final CompilerSettings settings;

private final Deque<List<org.objectweb.asm.Type>> stringConcatArgs =
private final Deque<List<Type>> stringConcatArgs =
(INDY_STRING_CONCAT_BOOTSTRAP_HANDLE == null) ? null : new ArrayDeque<>();

public MethodWriter(int access, Method method, ClassVisitor cw, BitSet statements, CompilerSettings settings) {
@@ -200,7 +200,7 @@ private void writeCast(Class<?> from, Class<?> to) {
* Proxy the box method to use valueOf instead to ensure that the modern boxing methods are used.
*/
@Override
public void box(org.objectweb.asm.Type type) {
public void box(Type type) {
valueOf(type);
}

@@ -252,10 +252,10 @@ public int writeNewStrings() {
}
}

public void writeAppendStrings(final Definition.Type type) {
public void writeAppendStrings(Class<?> clazz) {
if (INDY_STRING_CONCAT_BOOTSTRAP_HANDLE != null) {
// Java 9+: record type information
stringConcatArgs.peek().add(type.type);
stringConcatArgs.peek().add(getType(clazz));
// prevent too many concat args.
// If there are too many, do the actual concat:
if (stringConcatArgs.peek().size() >= MAX_INDY_STRING_CONCAT_ARGS) {
@@ -266,24 +266,24 @@ public void writeAppendStrings(final Definition.Type type) {
}
} else {
// Java 8: push a StringBuilder append
if (type.clazz == boolean.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_BOOLEAN);
else if (type.clazz == char.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_CHAR);
else if (type.clazz == byte.class ||
type.clazz == short.class ||
type.clazz == int.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_INT);
else if (type.clazz == long.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_LONG);
else if (type.clazz == float.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_FLOAT);
else if (type.clazz == double.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_DOUBLE);
else if (type.clazz == String.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_STRING);
else invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_OBJECT);
if (clazz == boolean.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_BOOLEAN);
else if (clazz == char.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_CHAR);
else if (clazz == byte.class ||
clazz == short.class ||
clazz == int.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_INT);
else if (clazz == long.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_LONG);
else if (clazz == float.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_FLOAT);
else if (clazz == double.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_DOUBLE);
else if (clazz == String.class) invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_STRING);
else invokeVirtual(STRINGBUILDER_TYPE, STRINGBUILDER_APPEND_OBJECT);
}
}

public void writeToStrings() {
if (INDY_STRING_CONCAT_BOOTSTRAP_HANDLE != null) {
// Java 9+: use type information and push invokeDynamic
final String desc = org.objectweb.asm.Type.getMethodDescriptor(STRING_TYPE,
stringConcatArgs.pop().stream().toArray(org.objectweb.asm.Type[]::new));
final String desc = Type.getMethodDescriptor(STRING_TYPE,
stringConcatArgs.pop().stream().toArray(Type[]::new));
invokeDynamic("concat", desc, INDY_STRING_CONCAT_BOOTSTRAP_HANDLE);
} else {
// Java 8: call toString() on StringBuilder
@@ -292,9 +292,9 @@ public void writeToStrings() {
}

/** Writes a dynamic binary instruction: returnType, lhs, and rhs can be different */
public void writeDynamicBinaryInstruction(Location location, Definition.Type returnType, Definition.Type lhs, Definition.Type rhs,
public void writeDynamicBinaryInstruction(Location location, Class<?> returnType, Class<?> lhs, Class<?> rhs,
Operation operation, int flags) {
org.objectweb.asm.Type methodType = org.objectweb.asm.Type.getMethodType(returnType.type, lhs.type, rhs.type);
Type methodType = Type.getMethodType(getType(returnType), getType(lhs), getType(rhs));

switch (operation) {
case MUL:
Expand All @@ -310,7 +310,7 @@ public void writeDynamicBinaryInstruction(Location location, Definition.Type ret
// if either side is primitive, then the + operator should always throw NPE on null,
// so we don't need a special NPE guard.
// otherwise, we need to allow nulls for possible string concatenation.
boolean hasPrimitiveArg = lhs.clazz.isPrimitive() || rhs.clazz.isPrimitive();
boolean hasPrimitiveArg = lhs.isPrimitive() || rhs.isPrimitive();
if (!hasPrimitiveArg) {
flags |= DefBootstrap.OPERATOR_ALLOWS_NULL;
}
@@ -343,26 +343,26 @@ public void writeDynamicBinaryInstruction(Location location, Definition.Type ret
}

/** Writes a static binary instruction */
public void writeBinaryInstruction(Location location, Definition.Type type, Operation operation) {
if ((type.clazz == float.class || type.clazz == double.class) &&
public void writeBinaryInstruction(Location location, Class<?> clazz, Operation operation) {
if ( (clazz == float.class || clazz == double.class) &&
(operation == Operation.LSH || operation == Operation.USH ||
operation == Operation.RSH || operation == Operation.BWAND ||
operation == Operation.XOR || operation == Operation.BWOR)) {
throw location.createError(new IllegalStateException("Illegal tree structure."));
}

switch (operation) {
case MUL: math(GeneratorAdapter.MUL, type.type); break;
case DIV: math(GeneratorAdapter.DIV, type.type); break;
case REM: math(GeneratorAdapter.REM, type.type); break;
case ADD: math(GeneratorAdapter.ADD, type.type); break;
case SUB: math(GeneratorAdapter.SUB, type.type); break;
case LSH: math(GeneratorAdapter.SHL, type.type); break;
case USH: math(GeneratorAdapter.USHR, type.type); break;
case RSH: math(GeneratorAdapter.SHR, type.type); break;
case BWAND: math(GeneratorAdapter.AND, type.type); break;
case XOR: math(GeneratorAdapter.XOR, type.type); break;
case BWOR: math(GeneratorAdapter.OR, type.type); break;
case MUL: math(GeneratorAdapter.MUL, getType(clazz)); break;
case DIV: math(GeneratorAdapter.DIV, getType(clazz)); break;
case REM: math(GeneratorAdapter.REM, getType(clazz)); break;
case ADD: math(GeneratorAdapter.ADD, getType(clazz)); break;
case SUB: math(GeneratorAdapter.SUB, getType(clazz)); break;
case LSH: math(GeneratorAdapter.SHL, getType(clazz)); break;
case USH: math(GeneratorAdapter.USHR, getType(clazz)); break;
case RSH: math(GeneratorAdapter.SHR, getType(clazz)); break;
case BWAND: math(GeneratorAdapter.AND, getType(clazz)); break;
case XOR: math(GeneratorAdapter.XOR, getType(clazz)); break;
case BWOR: math(GeneratorAdapter.OR, getType(clazz)); break;
default:
throw location.createError(new IllegalStateException("Illegal tree structure."));
}
@@ -416,7 +416,7 @@ public void visitEnd() {
* @param flavor type of call
* @param params flavor-specific parameters
*/
public void invokeDefCall(String name, org.objectweb.asm.Type methodType, int flavor, Object... params) {
public void invokeDefCall(String name, Type methodType, int flavor, Object... params) {
Object[] args = new Object[params.length + 2];
args[0] = settings.getInitialCallSiteDepth();
args[1] = flavor;
@@ -25,6 +25,7 @@
import org.elasticsearch.painless.Definition;
import org.elasticsearch.painless.Definition.Cast;
import org.elasticsearch.painless.Definition.Type;
import org.elasticsearch.painless.Definition.def;
import org.elasticsearch.painless.Globals;
import org.elasticsearch.painless.Locals;
import org.elasticsearch.painless.Location;
@@ -274,12 +275,12 @@ void write(MethodWriter writer, Globals globals) {
writer.writeDup(lhs.accessElementCount(), catElementStackSize); // dup the top element and insert it
// before concat helper on stack
lhs.load(writer, globals); // read the current lhs's value
writer.writeAppendStrings(lhs.actual); // append the lhs's value using the StringBuilder
writer.writeAppendStrings(Definition.TypeToClass(lhs.actual)); // append the lhs's value using the StringBuilder

rhs.write(writer, globals); // write the bytecode for the rhs

if (!(rhs instanceof EBinary) || !((EBinary)rhs).cat) { // check to see if the rhs has already done a concatenation
writer.writeAppendStrings(rhs.actual); // append the rhs's value since it's hasn't already
if (!(rhs instanceof EBinary) || !((EBinary)rhs).cat) { // check to see if the rhs has already done a concatenation
writer.writeAppendStrings(Definition.TypeToClass(rhs.actual)); // append the rhs's value since it hasn't been appended already
}

writer.writeToStrings(); // put the value for string concat onto the stack
@@ -313,9 +314,9 @@ void write(MethodWriter writer, Globals globals) {
// write the operation instruction for compound assignment
if (promote.dynamic) {
writer.writeDynamicBinaryInstruction(
location, promote, DefType, DefType, operation, DefBootstrap.OPERATOR_COMPOUND_ASSIGNMENT);
location, Definition.TypeToClass(promote), def.class, def.class, operation, DefBootstrap.OPERATOR_COMPOUND_ASSIGNMENT);
} else {
writer.writeBinaryInstruction(location, promote, operation);
writer.writeBinaryInstruction(location, Definition.TypeToClass(promote), operation);
}

writer.writeCast(back); // if necessary cast the promotion type value back to the lhs's type
@@ -649,13 +649,13 @@ void write(MethodWriter writer, Globals globals) {
left.write(writer, globals);

if (!(left instanceof EBinary) || !((EBinary)left).cat) {
writer.writeAppendStrings(left.actual);
writer.writeAppendStrings(Definition.TypeToClass(left.actual));
}

right.write(writer, globals);

if (!(right instanceof EBinary) || !((EBinary)right).cat) {
writer.writeAppendStrings(right.actual);
writer.writeAppendStrings(Definition.TypeToClass(right.actual));
}

if (!cat) {
@@ -684,9 +684,10 @@ void write(MethodWriter writer, Globals globals) {
if (originallyExplicit) {
flags |= DefBootstrap.OPERATOR_EXPLICIT_CAST;
}
writer.writeDynamicBinaryInstruction(location, actual, left.actual, right.actual, operation, flags);
writer.writeDynamicBinaryInstruction(location, Definition.TypeToClass(actual),
Definition.TypeToClass(left.actual), Definition.TypeToClass(right.actual), operation, flags);
} else {
writer.writeBinaryInstruction(location, actual, operation);
writer.writeBinaryInstruction(location, Definition.TypeToClass(actual), operation);
}
}
}