🚀 Perplexica - An AI-powered search engine 🔎

[preview image]

Table of Contents

  • Overview
  • Preview
  • Features
  • Installation
    • Non-Docker Installation
    • Ollama connection errors
  • One-Click Deployment
  • Upcoming Features
  • Support Us
  • Contribution
  • Help and Support

Overview

Perplexica is an open-source, AI-powered search engine that goes deep into the internet to find answers. Inspired by Perplexity AI, it is an open-source alternative that not only searches the web but also understands your questions. It uses advanced machine learning techniques such as similarity search and embeddings to refine results and provides clear answers with sources cited.
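
To give a rough idea of how embedding-based reranking works in general (this is an illustrative sketch, not Perplexica's actual code; embedText stands in for whichever embedding provider is configured), search results can be scored by the cosine similarity between the query embedding and each result's snippet embedding:

    // Illustrative sketch of similarity-based reranking.
    // embedText is a hypothetical stand-in for the configured embedding provider.
    type SearchResult = { title: string; url: string; snippet: string };

    declare function embedText(text: string): Promise<number[]>;

    function cosineSimilarity(a: number[], b: number[]): number {
      let dot = 0;
      let normA = 0;
      let normB = 0;
      for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
      }
      return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    // Rerank raw web results by how similar their snippets are to the query.
    async function rerank(query: string, results: SearchResult[]): Promise<SearchResult[]> {
      const queryEmbedding = await embedText(query);
      const scored = await Promise.all(
        results.map(async (result) => ({
          result,
          score: cosineSimilarity(queryEmbedding, await embedText(result.snippet)),
        })),
      );
      return scored.sort((a, b) => b.score - a.score).map((s) => s.result);
    }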

Using SearxNG to stay current and fully open source, Perplexica ensures you always get the most up-to-date information without compromising your privacy.

Want to know more about its architecture and how it works? You can read it here.

Preview

[video preview]

Features

  • Local LLMs: You can make use of local LLMs such as Llama3 and Mixtral using Ollama.
  • Two Main Modes:
    • Copilot Mode: (In development) Boosts search by generating different queries to find more relevant internet sources. Instead of just using the context returned by SearxNG (as in normal search), it visits the top matches and tries to find sources relevant to the user's query directly from the pages.
    • Normal Mode: Processes your query and performs a web search.
  • Focus Modes: Special modes to better answer specific types of questions. Perplexica currently has 6 focus modes:
    • All Mode: Searches the entire web to find the best results.
    • Writing Assistant Mode: Helpful for writing tasks that do not require searching the web.
    • Academic Search Mode: Finds articles and papers, ideal for academic research.
    • YouTube Search Mode: Finds YouTube videos based on the search query.
    • Wolfram Alpha Search Mode: Answers queries that need calculations or data analysis using Wolfram Alpha.
    • Reddit Search Mode: Searches Reddit for discussions and opinions related to the query.
  • Current Information: Some search tools may give you outdated information because they crawl the web, convert the data into embeddings, and store it in an index. Unlike them, Perplexica uses SearxNG, a metasearch engine, to fetch results and then reranks them to surface the most relevant sources (see the sketch after this list), ensuring you always get the latest information without the overhead of daily data updates.
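
For illustration, fetching results from a metasearch backend can look roughly like the following sketch (it assumes a SearxNG instance with its JSON output format enabled; the function and field names are illustrative, not Perplexica's actual code):

    // Illustrative only: query a SearxNG instance's JSON API.
    // Assumes the instance has the "json" output format enabled in its settings.
    interface SearxngResult {
      title: string;
      url: string;
      content?: string;
    }

    async function searchSearxng(searxngUrl: string, query: string): Promise<SearxngResult[]> {
      const url = new URL('/search', searxngUrl);
      url.searchParams.set('q', query);
      url.searchParams.set('format', 'json');

      const res = await fetch(url);
      if (!res.ok) {
        throw new Error(`SearxNG request failed with status ${res.status}`);
      }

      const data = (await res.json()) as { results: SearxngResult[] };
      return data.results;
    }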

It has many more features, like image and video search. Some of the planned features are mentioned in Upcoming Features.

Installation

There are mainly two ways of installing Perplexica: with Docker and without Docker. Using Docker is highly recommended.

  1. Ensure Docker is installed and running on your system.

  2. Clone the Perplexica repository:

    git clone https://github.com/ItzCrazyKns/Perplexica.git
    
  3. After cloning, navigate to the directory containing the project files.

  4. Rename the sample.config.toml file to config.toml. For Docker setups, you only need to fill in the following fields (an illustrative sketch of a filled-in file appears after these steps):

    • OPENAI: Your OpenAI API key. You only need to fill this if you wish to use OpenAI's models.

    • OLLAMA: Your Ollama API URL. You should enter it as http://host.docker.internal:PORT_NUMBER. If you installed Ollama on port 11434, use http://host.docker.internal:11434. For other ports, adjust accordingly. You need to fill this if you wish to use Ollama's models instead of OpenAI's.

    • GROQ: Your Groq API key. You only need to fill this in if you wish to use Groq's hosted models.

      Note: You can change these after starting Perplexica from the settings dialog.

    • SIMILARITY_MEASURE: The similarity measure to use (This is filled by default; you can leave it as is if you are unsure about it.)

  5. Ensure you are in the directory containing the docker-compose.yaml file and execute:

    docker compose up -d
    
  6. Wait a few minutes for the setup to complete. You can access Perplexica at http://localhost:3000 in your web browser.
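
For reference, a filled-in config.toml might look roughly like this (the section names and grouping below are assumptions for illustration; follow the exact layout of sample.config.toml in the repository):

    # Illustrative sketch only; mirror the exact layout of sample.config.toml.
    [GENERAL]
    SIMILARITY_MEASURE = "cosine"   # pre-filled default; leave as is if unsure

    [API_KEYS]
    OPENAI = ""                     # only needed for OpenAI's models
    GROQ = ""                       # only needed for Groq's hosted models

    [API_ENDPOINTS]
    OLLAMA = "http://host.docker.internal:11434"   # only needed for Ollama's models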

Note: After the containers are built, you can start Perplexica directly from Docker without having to open a terminal.

Non-Docker Installation

  1. Clone the repository and rename the sample.config.toml file to config.toml in the root directory. Ensure you complete all required fields in this file.
  2. Rename the .env.example file to .env in the ui folder and fill in all necessary fields.
  3. After populating the configuration and environment files, run npm i in both the ui folder and the root directory.
  4. Once the dependencies are installed, execute npm run build in both the ui folder and the root directory.
  5. Finally, start both the frontend and the backend by running npm run start in both the ui folder and the root directory (the full sequence of commands is sketched below).
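
Put together, the non-Docker setup roughly amounts to the following commands (a sketch; adjust paths and fields to your environment):

    # From the repository root (backend)
    cp sample.config.toml config.toml   # then fill in the required fields
    npm i
    npm run build
    npm run start

    # In a second terminal, from the ui folder (frontend)
    cd ui
    cp .env.example .env                # then fill in the required fields
    npm i
    npm run build
    npm run start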

Note: Using Docker is recommended as it simplifies the setup process, especially for managing environment variables and dependencies.

Ollama connection errors

If you're facing an Ollama connection error, it is usually because the backend cannot reach Ollama's API. You can fix it by updating your Ollama API URL in the settings menu to the following:

On Windows: http://host.docker.internal:11434
On Mac: http://host.docker.internal:11434
On Linux: http://private_ip_of_computer_hosting_ollama:11434

Adjust the port accordingly if your Ollama instance runs on a port other than 11434.
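
To verify that Ollama is reachable, you can query its root endpoint, which responds with a short status message (replace the URL with the one you entered in the settings menu):

    # Should print "Ollama is running" if the API is reachable
    curl http://host.docker.internal:11434

    # On most Linux distributions, list the private IP addresses of the
    # machine hosting Ollama
    hostname -I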

One-Click Deployment

Deploy to RepoCloud

Upcoming Features

  • Finalizing Copilot Mode
  • Adding a settings page
  • Adding support for local LLMs
  • Adding Discover and History Saving features
  • Introducing various Focus Modes

Support Us

If you find Perplexica useful, consider giving us a star on GitHub. This helps more people discover Perplexica and supports the development of new features. Your support is appreciated.

Contribution

Perplexica is built on the idea that AI and large language models should be easy for everyone to use. If you find bugs or have ideas, please share them via GitHub Issues. For more information on contributing, read the CONTRIBUTING.md file to learn more about Perplexica and how you can contribute to it.

Help and Support

If you have any questions or feedback, please feel free to reach out to us. You can create an issue on GitHub or join our Discord server. There, you can connect with other users, share your experiences and reviews, and receive more personalized help. Click here to join the Discord server. To discuss matters outside of regular support, feel free to contact me on Discord at itzcrazykns.

Thank you for exploring Perplexica, the AI-powered search engine designed to enhance your search experience. We are constantly working to improve Perplexica and expand its capabilities. We value your feedback and contributions which help us make Perplexica even better. Don't forget to check back for updates and new features!