Elasticsearch

Elasticsearch is a distributed search and analytics engine, scalable data store and vector database optimized for speed and relevance on production-scale workloads. Elasticsearch is the foundation of Elastic’s open Stack platform. Search in near real-time over massive datasets, perform vector searches, integrate with generative AI applications, and much more.

Use cases enabled by Elasticsearch include:

  • Retrieval Augmented Generation (RAG)

  • Vector search

  • Full-text search

  • Logs

  • Metrics

  • Application performance monitoring (APM)

  • Security logs

… and more!

To learn more about Elasticsearch’s features and capabilities, see our
product page.

For information on machine learning innovations and the latest Lucene contributions from Elastic, see Search Labs.

Get started

The simplest way to set up Elasticsearch is to create a managed deployment with
Elasticsearch Service on Elastic
Cloud.

If you prefer to install and manage Elasticsearch yourself, you can download
the latest version from
elastic.co/downloads/elasticsearch.

Run Elasticsearch locally

Warning

DO NOT USE THESE INSTRUCTIONS FOR PRODUCTION DEPLOYMENTS.

This setup is intended for local development and testing only.

Quickly set up Elasticsearch and Kibana in Docker for local development or testing, using the start-local script.

For more detailed information about the start-local setup, refer to the README on GitHub.

Prerequisites

  • If you don’t have Docker installed, download and install Docker Desktop for your operating system.

  • If you’re using Microsoft Windows, then install Windows Subsystem for Linux (WSL).

Trial license

This setup comes with a one-month trial license that includes all Elastic features.

After the trial period, the license reverts to Free and open – Basic.
Refer to Elastic subscriptions for more information.

Run start-local

To set up Elasticsearch and Kibana locally, run the start-local script:

curl -fsSL https://elastic.co/start-local | sh

This script creates an elastic-start-local folder containing configuration files and starts both Elasticsearch and Kibana using Docker.

After running the script, you can access Elastic services at the following endpoints:

  • Elasticsearch: http://localhost:9200

  • Kibana: http://localhost:5601

The script generates a random password for the elastic user, which is displayed at the end of the installation and stored in the .env file.

Caution

This setup is for local testing only. HTTPS is disabled, and Basic authentication is used for Elasticsearch. For security, Elasticsearch and Kibana are accessible only through localhost.

API access

An API key for Elasticsearch is generated and stored in the .env file as ES_LOCAL_API_KEY.
Use this key to connect to Elasticsearch with a programming language client or the REST API.

From the elastic-start-local folder, check the connection to Elasticsearch using curl:

source .env
curl $ES_LOCAL_URL -H "Authorization: ApiKey ${ES_LOCAL_API_KEY}"

To use the password for the elastic user, set and export the ES_LOCAL_PASSWORD environment variable. For example:

source .env
export ES_LOCAL_PASSWORD
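
If you want to read these credentials from Python rather than sourcing the file in a shell, the .env file written by start-local is plain KEY=value lines, so a minimal stdlib parser is enough. This is a sketch: the file path and values below are stand-ins for the real elastic-start-local/.env.

```python
import os
import tempfile

def load_env(path):
    """Parse simple KEY=value lines from a .env file into a dict,
    skipping blank lines and comments."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip().strip('"')
    return env

# Throwaway file standing in for elastic-start-local/.env
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("ES_LOCAL_URL=http://localhost:9200\nES_LOCAL_API_KEY=abc123\n")
    path = f.name

env = load_env(path)
print(env["ES_LOCAL_API_KEY"])  # abc123
os.unlink(path)
```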

Send requests to Elasticsearch

You send data and other requests to Elasticsearch through REST APIs.
You can interact with Elasticsearch using any client that sends HTTP requests,
such as the Elasticsearch
language clients and curl.

Using curl

Here’s an example curl command to create a new Elasticsearch index, using basic auth:

curl -u elastic:$ES_LOCAL_PASSWORD \
  -X PUT \
  http://localhost:9200/my-new-index \
  -H 'Content-Type: application/json'

Using a language client

To connect to your local dev Elasticsearch cluster with a language client, you can use basic authentication with the elastic username and the password stored in the ES_LOCAL_PASSWORD environment variable.

You’ll use the following connection details:

  • Elasticsearch endpoint: http://localhost:9200

  • Username: elastic

  • Password: $ES_LOCAL_PASSWORD (Value you set in the environment variable)

For example, to connect with the Python elasticsearch client:

import os
from elasticsearch import Elasticsearch

username = 'elastic'
password = os.getenv('ES_LOCAL_PASSWORD')  # Value you set in the environment variable

client = Elasticsearch(
    "http://localhost:9200",
    basic_auth=(username, password)
)

print(client.info())

Using the Dev Tools Console

Kibana’s developer console provides an easy way to experiment and test requests.
To access the console, open Kibana, then go to Management > Dev Tools.

Add data

You index data into Elasticsearch by sending JSON objects (documents) through the REST APIs.
Whether you have structured or unstructured text, numerical data, or geospatial data,
Elasticsearch efficiently stores and indexes it in a way that supports fast searches.

For timestamped data such as logs and metrics, you typically add documents to a
data stream made up of multiple auto-generated backing indices.

To add a single document to an index, submit an HTTP POST request that targets the index.

POST /customer/_doc/1
{
  "firstname": "Jennifer",
  "lastname": "Walters"
}

This request automatically creates the customer index if it doesn’t exist,
adds a new document that has an ID of 1, and
stores and indexes the firstname and lastname fields.

The new document is available immediately from any node in the cluster.
You can retrieve it with a GET request that specifies its document ID:

GET /customer/_doc/1
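
The same requests can be issued from Python without any client library. The sketch below builds (but does not send) the document-creation request with basic auth using only the standard library; the password is a stand-in for the real value of ES_LOCAL_PASSWORD, and sending requires a running cluster at http://localhost:9200.

```python
import base64
import json
import urllib.request

password = "changeme"  # stand-in for $ES_LOCAL_PASSWORD
token = base64.b64encode(f"elastic:{password}".encode()).decode()

doc = {"firstname": "Jennifer", "lastname": "Walters"}
req = urllib.request.Request(
    "http://localhost:9200/customer/_doc/1",
    data=json.dumps(doc).encode(),
    method="POST",
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Basic {token}",
    },
)

print(req.full_url)
# Sending the request would be: urllib.request.urlopen(req)
# (requires the local cluster to be running)
```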

To add multiple documents in one request, use the _bulk API.
Bulk data must be newline-delimited JSON (NDJSON).
Each line must end in a newline character (\n), including the last line.

PUT customer/_bulk
{ "create": { } }
{ "firstname": "Monica", "lastname": "Rambeau" }
{ "create": { } }
{ "firstname": "Carol", "lastname": "Danvers" }
{ "create": { } }
{ "firstname": "Wanda", "lastname": "Maximoff" }
{ "create": { } }
{ "firstname": "Jennifer", "lastname": "Takeda" }

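If you generate the bulk payload programmatically, the NDJSON shape above (an action line, then a source line, with a trailing newline) can be built with the standard library. The client libraries ship bulk helpers, but a hand-rolled sketch makes the format explicit:

```python
import json

customers = [
    {"firstname": "Monica", "lastname": "Rambeau"},
    {"firstname": "Carol", "lastname": "Danvers"},
    {"firstname": "Wanda", "lastname": "Maximoff"},
    {"firstname": "Jennifer", "lastname": "Takeda"},
]

# Interleave one action line and one source line per document, and make
# sure the body ends with a trailing newline -- _bulk rejects it otherwise.
lines = []
for doc in customers:
    lines.append(json.dumps({"create": {}}))
    lines.append(json.dumps(doc))
body = "\n".join(lines) + "\n"

print(body)
```
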
Search

Indexed documents are available for search in near real-time.
The following search matches all customers with a first name of Jennifer
in the customer index.

GET customer/_search
{
  "query": {
    "match": { "firstname": "Jennifer" }
  }
}
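
As a rough intuition for what a match query does on a text field, both the query text and the field value are analyzed (lowercased and tokenized, among other things) and compared term by term. The in-memory analogue below is only illustrative; real analysis is configurable and far richer than str.split.

```python
import json

# The search body from the example above, as a Python dict.
query = {"query": {"match": {"firstname": "Jennifer"}}}
body = json.dumps(query)

docs = [
    {"firstname": "Jennifer", "lastname": "Walters"},
    {"firstname": "Monica", "lastname": "Rambeau"},
    {"firstname": "Jennifer", "lastname": "Takeda"},
]

def match(doc, field, text):
    """Crude stand-in for match semantics: lowercase, tokenize on
    whitespace, and look for any overlapping term."""
    terms = text.lower().split()
    return any(t in doc[field].lower().split() for t in terms)

hits = [d for d in docs if match(d, "firstname", "Jennifer")]
print([d["lastname"] for d in hits])  # ['Walters', 'Takeda']
```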

Explore

You can use Discover in Kibana to interactively search and filter your data.
From there, you can start creating visualizations and building and sharing dashboards.

To get started, create a data view that connects to one or more Elasticsearch indices,
data streams, or index aliases.

  1. Go to Management > Stack Management > Kibana > Data Views.

  2. Select Create data view.

  3. Enter a name for the data view and a pattern that matches one or more indices,
    such as customer.

  4. Select Save data view to Kibana.

To start exploring, go to Analytics > Discover.

Upgrade

To upgrade from an earlier version of Elasticsearch, see the
Elasticsearch upgrade
documentation.

Build from source

Elasticsearch uses Gradle for its build system.

To build a distribution for your local OS and print its output location upon
completion, run:

./gradlew localDistro

To build a distribution for another platform, run the related command:

./gradlew :distribution:archives:linux-tar:assemble
./gradlew :distribution:archives:darwin-tar:assemble
./gradlew :distribution:archives:windows-zip:assemble

Distributions are output to distribution/archives.

To run the test suite, see TESTING.

Documentation

For the complete Elasticsearch documentation visit
elastic.co.

For information about our documentation processes, see the
docs README.

Examples and guides

The elasticsearch-labs repo contains executable Python notebooks, sample apps, and resources to test out Elasticsearch for vector search, hybrid search and generative AI use cases.

Contribute

For contribution guidelines, see CONTRIBUTING.

Questions? Problems? Suggestions?

  • To report a bug or request a feature, create a
    GitHub Issue. Please
    ensure someone else hasn’t created an issue for the same topic.

  • Need help using Elasticsearch? Reach out on the
    Elastic Forum or Slack. A
    fellow community member or Elastic engineer will be happy to help you out.

Download the source code

Clone the repository from the command line:

git clone https://github.com/elastic/elasticsearch.git
