🚀 Perplexica - An AI-powered search engine 🔎

[Preview image]

Table of Contents

  • Overview
  • Preview
  • Features
  • Installation
  • Using as a Search Engine
  • One-Click Deployment
  • Upcoming Features
  • Support Us
  • Contribution
  • Help and Support

Overview

Perplexica is an open-source, AI-powered search engine that goes deep into the internet to find answers. Inspired by Perplexity AI, it's an open-source alternative that not only searches the web but also understands your questions. It uses advanced machine learning techniques such as similarity search and embeddings to refine results and provides clear answers with cited sources.

Using SearxNG to stay current and fully open source, Perplexica ensures you always get the most up-to-date information without compromising your privacy.

Want to know more about its architecture and how it works? You can read it here.

Preview

[Video preview]

Features

  • Local LLMs: You can make use of local LLMs such as Llama3 and Mixtral using Ollama.
  • Two Main Modes:
    • Copilot Mode: (In development) Boosts search by generating different queries to find more relevant internet sources. Instead of relying only on the context returned by SearxNG, as normal search does, it visits the top matches and tries to find sources relevant to the user's query directly from those pages.
    • Normal Mode: Processes your query and performs a web search.
  • Focus Modes: Special modes to better answer specific types of questions. Perplexica currently has 6 focus modes:
    • All Mode: Searches the entire web to find the best results.
    • Writing Assistant Mode: Helpful for writing tasks that do not require searching the web.
    • Academic Search Mode: Finds articles and papers, ideal for academic research.
    • YouTube Search Mode: Finds YouTube videos based on the search query.
    • Wolfram Alpha Search Mode: Answers queries that need calculations or data analysis using Wolfram Alpha.
    • Reddit Search Mode: Searches Reddit for discussions and opinions related to the query.
  • Current Information: Some search tools might give you outdated information because they rely on data from crawling bots, which they convert into embeddings and store in an index. Unlike them, Perplexica uses SearxNG, a metasearch engine, to fetch results, then reranks them and picks the most relevant sources, ensuring you always get the latest information without the overhead of daily data updates.

It has many more features like image and video search. Some of the planned features are mentioned in upcoming features.

Installation

There are mainly two ways of installing Perplexica: with Docker, and without Docker. Using Docker is highly recommended.

  1. Ensure Docker is installed and running on your system.

  2. Clone the Perplexica repository:

    git clone https://github.com/ItzCrazyKns/Perplexica.git
    
  3. After cloning, navigate to the directory containing the project files.

  4. Rename the sample.config.toml file to config.toml. For Docker setups, you need only fill in the following fields (a sketch of the resulting file appears after these steps):

    • OPENAI: Your OpenAI API key. You only need to fill this if you wish to use OpenAI's models.

    • OLLAMA: Your Ollama API URL. You should enter it as http://host.docker.internal:PORT_NUMBER. If you installed Ollama on port 11434, use http://host.docker.internal:11434. For other ports, adjust accordingly. You need to fill this if you wish to use Ollama's models instead of OpenAI's.

    • GROQ: Your Groq API key. You only need to fill this if you wish to use Groq's hosted models.

    • ANTHROPIC: Your Anthropic API key. You only need to fill this if you wish to use Anthropic models.

      Note: You can change these after starting Perplexica from the settings dialog.

    • SIMILARITY_MEASURE: The similarity measure to use (This is filled by default; you can leave it as is if you are unsure about it.)

  5. Ensure you are in the directory containing the docker-compose.yaml file and execute:

    docker compose up -d
    
  6. Wait a few minutes for the setup to complete. You can access Perplexica at http://localhost:3000 in your web browser.

Note: After the containers are built, you can start Perplexica directly from Docker without having to open a terminal.
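
As mentioned in step 4, the relevant part of config.toml might look like the sketch below. The field names come from this README; the exact key and section layout is defined by sample.config.toml in the repository, so treat this purely as an illustration and leave any keys you don't use empty.

    # Illustrative config.toml values for a Docker setup; follow sample.config.toml for the exact layout
    SIMILARITY_MEASURE = "cosine"                  # pre-filled by default; leave as is if unsure
    OPENAI = ""                                    # OpenAI API key, only if you use OpenAI's models
    GROQ = ""                                      # Groq API key, only if you use Groq's hosted models
    ANTHROPIC = ""                                 # Anthropic API key, only if you use Anthropic's models
    OLLAMA = "http://host.docker.internal:11434"   # Ollama API URL; adjust the port if yours differs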

Non-Docker Installation

  1. Install SearXNG and allow JSON format in the SearXNG settings.
  2. Clone the repository and rename the sample.config.toml file to config.toml in the root directory. Ensure you complete all required fields in this file.
  3. Rename the .env.example file to .env in the ui folder and fill in all necessary fields.
  4. After populating the configuration and environment files, run npm i in both the ui folder and the root directory.
  5. Once the dependencies are installed, execute npm run build in both the ui folder and the root directory.
  6. Finally, start both the frontend and the backend by running npm run start in both the ui folder and the root directory (a consolidated command sketch follows these steps).
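
Taken together, steps 4 through 6 amount to roughly the following commands, assuming you start from the repository root and have Node.js and npm installed:

    # Backend (repository root)
    npm i
    npm run build
    npm run start

    # Frontend (ui folder), in a separate terminal
    cd ui
    npm i
    npm run build
    npm run start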

Note: Using Docker is recommended as it simplifies the setup process, especially for managing environment variables and dependencies.

See the installation documentation for more information, such as exposing it to your network.

Ollama Connection Errors

If you're encountering an Ollama connection error, it is likely due to the backend being unable to connect to Ollama's API. To fix this issue, you can:

  1. Check your Ollama API URL: Ensure that the API URL is correctly set in the settings menu.

  2. Update API URL Based on OS:

    • Windows: Use http://host.docker.internal:11434
    • Mac: Use http://host.docker.internal:11434
    • Linux: Use http://<private_ip_of_host>:11434

    Adjust the port number if you're using a different one.

  3. Linux Users - Expose Ollama to Network:

    • Inside /etc/systemd/system/ollama.service, add Environment="OLLAMA_HOST=0.0.0.0", then restart Ollama with systemctl restart ollama (see the sketch after this list). For more information, see the Ollama docs.

    • Ensure that the port (default is 11434) is not blocked by your firewall.
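
For reference, the change described above might look like the following on a standard systemd-based Ollama install (unit file paths and layout can differ between distributions, so adapt as needed):

    # In /etc/systemd/system/ollama.service, under the [Service] section:
    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"

    # Then reload systemd and restart Ollama:
    sudo systemctl daemon-reload
    sudo systemctl restart ollama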

Using as a Search Engine

If you wish to use Perplexica as an alternative to traditional search engines like Google or Bing, or if you want to add a shortcut for quick access from your browser's search bar, follow these steps:

  1. Open your browser's settings.
  2. Navigate to the 'Search Engines' section.
  3. Add a new site search with the following URL: http://localhost:3000/?q=%s. Replace localhost with your IP address or domain name, and 3000 with the port number if Perplexica is not hosted locally (an example follows these steps).
  4. Click the add button. Now, you can use Perplexica directly from your browser's search bar.
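
For example, typing open source llm into the search bar would open a URL along these lines (the browser URL-encodes the query into the %s placeholder):

    http://localhost:3000/?q=open%20source%20llm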

One-Click Deployment

Deploy to RepoCloud

Upcoming Features

  • Add settings page
  • Adding support for local LLMs
  • History Saving features
  • Introducing various Focus Modes
  • Finalizing Copilot Mode
  • Adding Discover

Support Us

If you find Perplexica useful, consider giving us a star on GitHub. This helps more people discover Perplexica and supports the development of new features. Your support is greatly appreciated.

Donations

We also accept donations to help sustain our project. If you would like to contribute, you can use the following options to donate. Thank you for your support!

  • Cards: https://www.patreon.com/itzcrazykns
  • Ethereum address: 0xB025a84b2F269570Eb8D4b05DEdaA41D8525B6DD

Contribution

Perplexica is built on the idea that AI and large language models should be easy for everyone to use. If you find bugs or have ideas, please share them via GitHub Issues. For more information on contributing to Perplexica, read the CONTRIBUTING.md file to learn more about the project and how you can contribute to it.

Help and Support

If you have any questions or feedback, please feel free to reach out to us. You can create an issue on GitHub or join our Discord server. There, you can connect with other users, share your experiences and reviews, and receive more personalized help. Click here to join the Discord server. To discuss matters outside of regular support, feel free to contact me on Discord at itzcrazykns.

Thank you for exploring Perplexica, the AI-powered search engine designed to enhance your search experience. We are constantly working to improve Perplexica and expand its capabilities. We value your feedback and contributions which help us make Perplexica even better. Don't forget to check back for updates and new features!