mirror of https://github.com/foss42/apidash.git (synced 2025-12-08 22:20:44 +08:00)

Merge branch 'foss42:main' into main

README.md
@@ -1,6 +1,19 @@

# API Dash ⚡️

[](https://bit.ly/heyfoss)
[](https://discord.com/invite/bBeSdtJ6Ue)

### 🚨🚨 API Dash is participating in GSoC 2025! Check out the details below:

<img src="https://github.com/foss42/apidash/assets/615622/493ce57f-06c3-4789-b7ae-9fa63bca8183" alt="GSoC" width="500">

| | Link |
|--|--|
| Learn about GSoC | [Link](https://summerofcode.withgoogle.com) |
| API Dash GSoC Page | [Link](https://summerofcode.withgoogle.com/programs/2025/organizations/api-dash) |
| Project Ideas List | [Link](https://github.com/foss42/apidash/discussions/565) |
| Application Guide | [Link](https://github.com/foss42/apidash/discussions/564) |
| Discord Channel | [Link](https://discord.com/invite/bBeSdtJ6Ue) |

### Please support this initiative by giving this project a Star ⭐️
@@ -277,4 +290,4 @@ You can contribute to API Dash in any or all of the following ways:

## Need Any Help?

In case you need any help with API Dash or are encountering any issue while running the tool, please feel free to drop by our [Discord server](https://bit.ly/heyfoss) and we can have a chat in the **#foss-apidash** channel.
In case you need any help with API Dash or are encountering any issue while running the tool, please feel free to drop by our [Discord server](https://discord.com/invite/bBeSdtJ6Ue) and we can have a chat in the **#foss-apidash** channel.
@@ -0,0 +1,83 @@

# AI-Powered API Testing and Tool Integration

## Personal Information

- **Full Name:** Debasmi Basu
- **Email:** [basudebasmi2006@gmail.com](mailto:basudebasmi2006@gmail.com)
- **Phone:** +91 7439640610
- **Discord Handle:** debasmibasu
- **Home Page:** [Portfolio](https://debasmi.github.io/portfolio/portfolio.html)
- **GitHub Profile:** [Debasmi](https://github.com/debasmi)
- **Socials:**
  - [LinkedIn](https://www.linkedin.com/in/debasmi-basu-513726288/)
- **Time Zone:** Indian Standard Time
- **Resume:** [Google Drive Link](https://drive.google.com/file/d/1o5JxOwneK-jv2GxnKTrzk__n7UbSKTPt/view?usp=sharing)

## University Info

- **University Name:** Cluster Innovation Centre, University of Delhi
- **Program:** B.Tech. in Information Technology and Mathematical Innovations
- **Year:** 2023 - Present
- **Expected Graduation Date:** 2027

## Motivation & Past Experience

### Project of Pride: Image Encryption using Quantum Computing Algorithms

This project represents my most significant achievement in the field of quantum computing and cybersecurity. I developed a **quantum image encryption algorithm** using **Qiskit**, leveraging quantum superposition and entanglement to enhance security. By implementing the **NEQR model**, I ensured **100% accuracy in encryption**, preventing any data loss. Additionally, I designed **advanced quantum circuit techniques** to reduce potential decryption vulnerabilities, pushing the boundaries of modern encryption methods.

This project is my pride because it merges **cutting-edge quantum computing** with **practical data security applications**, demonstrating the **real-world potential of quantum algorithms in cryptography**. It reflects my deep technical expertise in **Qiskit, Python, and quantum circuits**, as well as my passion for exploring **future-proof encryption solutions**.

### Challenges that Motivate Me

I am driven by challenges that push the boundaries of **emerging technologies, security, and web development**. The intersection of **AI, cybersecurity, web applications, and quantum computing** excites me because of its potential to redefine **secure digital interactions**. My passion lies in building **robust, AI-powered automation systems** that enhance **security, efficiency, and accessibility** in real-world applications. Additionally, I enjoy working on **scalable web solutions**, ensuring that modern applications remain secure and user-friendly.

### Availability for GSoC

- **Will work full-time on GSoC.**
- I will also dedicate time to exploring **LLM-based security frameworks**, improving **web API integration**, and enhancing my expertise in **AI-driven automation**.

### Regular Sync-Ups

- **Yes.** I am committed to maintaining **regular sync-ups** with mentors to ensure steady project progress and discuss improvements in API security and automation.

### Interest in API Dash

- The potential to integrate **AI-powered automation** for API testing aligns perfectly with my expertise in **web development, backend integration, and security automation**.
- I see a great opportunity in **enhancing API security validation** using AI-driven techniques, ensuring robust **schema validation and intelligent error detection**.

### Areas for Improvement

- API Dash can expand **real-time collaborative testing features**, allowing teams to test and debug APIs more efficiently.
- Enhancing **security automation** by integrating **AI-powered API monitoring** would significantly improve API Dash’s effectiveness.

---

## Project Proposal

### **Title**

AI-Powered API Testing and Tool Integration

### **Abstract**

API testing often requires **manual test case creation and validation**, making it inefficient. Additionally, **converting APIs into structured definitions for AI integration** is a complex task. This project aims to **automate test generation, response validation, and structured API conversion** using **LLMs and AI agents.** The system will provide **automated debugging insights** and integrate seamlessly with **pydantic-ai** and **langgraph.** A **benchmarking dataset** will also be created to evaluate various LLMs for API testing tasks.

### **Weekly Timeline**

| Week | Focus | Key Deliverables & Achievements |
|--|--|--|
| **Week 1-2** | Research & Architecture | Study existing API testing tools, research AI automation methods, explore web-based API testing interfaces, and define the project architecture. Expected Outcome: Clear technical roadmap for implementation. |
| **Week 3-4** | API Specification Parsing | Develop a parser to extract API endpoints, request methods, authentication requirements, and response formats from OpenAPI specs, Postman collections, and raw API logs. Expected Outcome: Functional API parser capable of structured data extraction and visualization. |
| **Week 5-6** | AI-Based Test Case Generation | Implement an AI model that analyzes API specifications and generates valid test cases, including edge cases and error scenarios. Expected Outcome: Automated test case generation covering standard, edge, and security cases, integrated into a web-based UI. |
| **Week 7-8** | Response Validation & Debugging | Develop an AI-powered validation mechanism that checks API responses against expected schemas and detects inconsistencies. Implement logging and debugging tools within a web dashboard to provide insights into API failures. Expected Outcome: AI-driven validation tool with intelligent debugging support. |
| **Week 9-10** | Structured API Conversion | Design a system that converts APIs into structured tool definitions compatible with pydantic-ai and langgraph, ensuring seamless AI agent integration. Expected Outcome: Automated conversion of API specs into structured tool definitions, with visual representation in a web-based interface. |
| **Week 11-12** | Benchmarking & Evaluation | Create a dataset and evaluation framework to benchmark different LLMs for API testing performance. Conduct performance testing on generated test cases and validation mechanisms. Expected Outcome: A benchmarking dataset and comparative analysis of LLMs in API testing tasks, integrated into a web-based reporting system. |
| **Final Week** | Testing & Documentation | Perform comprehensive end-to-end testing, finalize documentation, create usage guides, and submit the final project report. Expected Outcome: Fully tested, documented, and ready-to-use AI-powered API testing framework, with a web-based dashboard for interaction and reporting. |

---

## Conclusion

This project will significantly **enhance API testing automation** by leveraging **AI-driven test generation, web-based API analysis, and structured tool conversion**. The benchmarking dataset will provide **a standard evaluation framework** for API testing LLMs, ensuring **optimal model selection for API validation**. The resulting **AI-powered API testing framework** will improve **efficiency, accuracy, security, and scalability**, making API Dash a more powerful tool for developers.
doc/proposals/2025/gsoc/idea_NingWei_AI UI Designer for APIs.md (new file)

@@ -0,0 +1,129 @@
# GSoC 2025 Proposal: AI UI Designer for APIs

## About

**Full Name**: Ning Wei
**Contact Info**: Allenwei0503@gmail.com
**Discord Handle**: @allen_wn
**GitHub Profile**: [https://github.com/AllenWn](https://github.com/AllenWn)
**LinkedIn**: [https://www.linkedin.com/in/ning-wei-allen0503](https://www.linkedin.com/in/ning-wei-allen0503)
**Time Zone**: UTC+8
**Resume**: https://drive.google.com/file/d/1Zvf1IhKju3rFfnDsBW1WmV40lz0ZMNrD/view?usp=sharing

## University Info

**University**: University of Illinois at Urbana-Champaign
**Program**: B.S. in Computer Engineering
**Year**: 2nd year undergraduate
**Expected Graduation**: May 2027

---

## Motivation & Past Experience

1. **Have you worked on or contributed to a FOSS project before?**

   Not yet officially, but I’ve been actively exploring open source projects like API Dash and contributing via discussion and design planning. I am currently studying the API Dash repository and developer guide to prepare for my first PR.

2. **What is your one project/achievement that you are most proud of? Why?**

   I'm proud of building an AI-assisted email management app using Flutter and Go, which automatically categorized and responded to emails using the ChatGPT API. It gave me end-to-end experience in integrating APIs, generating dynamic UIs, and designing developer-friendly tools.

3. **What kind of problems or challenges motivate you the most to solve them?**

   I enjoy solving problems that eliminate repetitive work for developers and improve workflow productivity, especially through automation and AI integration.

4. **Will you be working on GSoC full-time?**

   Yes. I will be working on this project full-time during the summer.

5. **Do you mind regularly syncing up with the project mentors?**

   Not at all. I look forward to regular syncs and feedback to align with the project vision.

6. **What interests you the most about API Dash?**

   API Dash is focused on improving the developer experience around APIs, which is something I care deeply about. I love the vision of combining UI tools with AI assistance in a privacy-first, extensible way.

7. **Can you mention some areas where the project can be improved?**

   - More intelligent code generation from API response types
   - Drag-and-drop UI workflow
   - Visual previews and theming customization
   - Integration with modern LLMs for field-level naming and layout suggestions

---

## Project Proposal Information

### Proposal Title

AI UI Designer for APIs

### Relevant Issues

[#617]

### Abstract

This project aims to develop an AI-powered assistant within API Dash that automatically generates dynamic user interfaces (UI) based on API responses (JSON/XML). The goal is to allow developers to instantly visualize, customize, and export usable Flutter UI code from raw API data. The generated UI should adapt to the structure of the API response and be interactive, with features like sorting, filtering, and layout tweaking. This tool will streamline frontend prototyping and improve developer productivity.

---

### Detailed Description

The AI UI Designer will be a new feature integrated into the API Dash interface, triggered by a button after an API response is received. It will analyze the data and suggest corresponding UI layouts using Dart/Flutter widgets such as `DataTable`, `Card`, or `Form`.

#### Step 1: Parse API Response Structure

- Focus initially on JSON (XML can be added later)
- Build a recursive parser to convert the API response into a schema-like tree
- Extract field types, array/object structure, and nesting depth
- Identify patterns (e.g., timestamps, prices, lists)

#### Step 2: Design AI Agent Logic

- Use a rule-based system to map the schema to UI components:
  - List of objects → Table
  - Simple object → Card/Form
  - Number over time → Line Chart (optional)
- Integrate an LLM backend (e.g., Ollama, GPT API) to enhance:
  - Field labeling
  - Layout suggestions
  - Component naming
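To make the rule-based mapping concrete, here is a rough sketch of the core decision. The project itself targets Dart/Flutter; the snippet below is only a language-neutral illustration written in Rust with the `serde_json` crate, and the enum, function name, and exact rules are illustrative assumptions rather than part of the proposal.

```rust
// Illustration only: the actual implementation would be in Dart/Flutter.
// Maps the shape of a parsed JSON response to a suggested UI component,
// following the rules listed in Step 2.
use serde_json::Value;

#[derive(Debug)]
enum UiSuggestion {
    Table,      // list of objects
    CardOrForm, // single object
    LineChart,  // list of numbers (e.g., a value over time)
    PlainText,  // fallback for scalars
}

fn suggest_component(value: &Value) -> UiSuggestion {
    match value {
        Value::Array(items) if !items.is_empty() && items.iter().all(Value::is_object) => {
            UiSuggestion::Table
        }
        Value::Array(items) if !items.is_empty() && items.iter().all(Value::is_number) => {
            UiSuggestion::LineChart
        }
        Value::Object(_) => UiSuggestion::CardOrForm,
        _ => UiSuggestion::PlainText,
    }
}

fn main() {
    let response: Value = serde_json::json!([
        { "id": 1, "name": "Alice" },
        { "id": 2, "name": "Bob" }
    ]);
    // A list of objects maps to a table, per the first rule above.
    println!("{:?}", suggest_component(&response));
}
```

The LLM backend would then only refine labels, naming, and layout on top of such deterministic suggestions.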
#### Step 3: Generate UI in Flutter

- Dynamically generate:
  - `DataTable`, `Card`, `TextField`, `Dropdown`, etc.
  - Optional chart widgets (e.g., `fl_chart`)
- Support:
  - Layout rearrangement (form-based or drag-drop)
  - Field visibility toggles
  - Previewing the final UI

#### Step 4: Export UI Code

- Export the generated layout as Dart code
- Allow download or copy-to-clipboard
- Support JSON config export (optional, for a renderer-based architecture)

#### Step 5: Integrate into API Dash

- Add an AI UI Designer button in the API response view
- Launch a UI editing pane inside the app
- Ensure local-only, privacy-friendly execution
- Write tests, docs, and polish the UX

---

## Weekly Timeline (Tentative)

| Week | Milestone |
|------|-----------|
| Community Bonding | Join Discord, interact with mentors, finalize approach, get feedback |
| Week 1–2 | Build and test the JSON parser → generate a basic schema |
| Week 3–4 | Implement the rule-based UI mapper; generate simple widgets |
| Week 5–6 | Integrate the initial Flutter component generator; allow basic UI previews |
| Week 7 | Midterm Evaluation |
| Week 8–9 | Add customization options (visibility, layout) |
| Week 10 | Integrate the AI backend (e.g., Ollama/GPT) for suggestions |
| Week 11–12 | Add export functions (code, JSON config) |
| Week 13 | Final polish, tests, docs |
| Week 14 | Final Evaluation, feedback, and delivery |

---

Thanks again for your time and guidance. I’ve already started studying the API Dash codebase and developer guide, and I’d love your feedback on this plan: does it align with your vision?
If selected, I’m excited to implement this project. If this idea is already taken, I’m open to switching to another API Dash project that fits my background.
doc/proposals/2025/gsoc/idea_balasubramaniam_api_explorer.md (new file)

@@ -0,0 +1,101 @@
# **Initial Idea Submission: API Explorer**

**Full Name:** BALASUBRAMANIAM L
**University Name:** Saveetha Engineering College
**Program (Degree & Major/Minor):** Bachelor of Technology in Machine Learning
**Year:** First year
**Expected Graduation Date:** 2028

**Project Title:** API Explorer
**Relevant Issues:** [https://github.com/foss42/apidash/issues/619](https://github.com/foss42/apidash/issues/619)

## **Project Overview**

Our goal is to enhance API Dash by adding an API Explorer feature. This feature allows users to discover, browse, search, and import pre-configured API endpoints for testing and exploration. All API templates will be maintained in YAML, JSON, HTML, and Markdown formats within a dedicated folder in the existing Apidash GitHub repository.

In the initial phase, contributors can manually add new API definition files (YAML, JSON, HTML, and MD) to the repo, run a local Dart script to process them into a structured JSON format, and then commit and push the updated files. A Dart cron job will periodically check for new or modified API files and process them automatically. In the future, we plan to automate this process fully with GitHub Actions.

---

### **Key Concepts**

- **File Addition:**
  Contributors add new API files (YAML, JSON, HTML, or MD) to a designated folder (`/apis/`) in the Apidash repository.

- **Local Processing:**
  A local Dart script (e.g., `process_apis.dart`) runs to:
  - Read the files.
  - Parse and extract essential API details (title, description, endpoints, etc.).
  - Auto-generate sample payloads when examples are missing.
  - Convert and save the processed data as JSON files in `/api_templates/`.

- **Automated Fetching & Processing with a Dart Cron Job:**
  - A Dart cron-like package will schedule the script to fetch and process **new and updated** API files **weekly or on demand**.
  - This reduces the need for constant manual execution and ensures templates stay up to date.

- **Version Control:**
  Contributors open a PR on GitHub with both the raw YAML files and the generated JSON files.

- **Offline Caching with Hive:**
  - The Flutter app (API Explorer) will fetch JSON templates and store them using **Hive**.
  - This ensures **fast loading and offline access**.

- **Fetching Updates via GitHub Releases (ZIP files):**
  - Instead of fetching updates via the GitHub API (which has rate limits), we can leverage **GitHub Releases**.
  - A new release will be created weekly or when at least 10 updates are made.
  - The Flutter app will download and extract the latest ZIP release instead of making multiple API calls.

---

### **Step-by-Step Workflow**

1. **Adding API Files:**
   - A contributor creates or updates an API file (e.g., `weather.yaml`) in the `/apis/` folder.

2. **Running the Local Processing Script (Manually):**
   - A Dart script (`process_apis.dart`) is executed locally:
     `dart run process_apis.dart`
   - The script:
     - Reads YAML files from `/apis/`.
     - Identifies the file format (YAML, JSON, HTML, MD).
     - Parses the content accordingly.
     - Extracts essential API details (title, description, endpoints, etc.).
     - Generates structured JSON templates in `/api_templates/` (a sketch of this step is shown after the workflow).

3. **Review, Commit, and PR Submission:**
   - Contributors review the generated JSON files.
   - They commit both the raw API definition files and the generated JSON files.
   - They submit a **Pull Request (PR)** for review.

4. **Offline Storage with Hive (Flutter Frontend):**
   - The Flutter app fetches JSON templates and stores them in Hive.
   - This ensures users can access API templates even when offline.

5. **Fetching Updates via GitHub Releases:**
   - A new **GitHub Release** (ZIP) will be created weekly or when at least 10 updates are made.
   - The Flutter app will **download and extract** the latest ZIP instead of making multiple API calls.
   - This approach avoids GitHub API rate limits and ensures a smooth user experience.
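To make step 2 of the workflow concrete, here is a rough sketch of the processing logic. The proposal plans a Dart script (`process_apis.dart`); the snippet below is only a language-neutral illustration written in Rust with the `serde_yaml` and `serde_json` crates, and the crate choices, field names, and file paths are illustrative assumptions rather than part of the proposal.

```rust
// Sketch only: the proposal's actual script would be written in Dart.
// Reads one API definition from /apis/ and emits a structured JSON template
// into /api_templates/, mirroring the workflow described above.
// Requires the serde_yaml and serde_json crates (cargo add serde_yaml serde_json).
use std::fs;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Placeholder input file; the real script would iterate over /apis/.
    let raw = fs::read_to_string("apis/weather.yaml")?;
    let doc: serde_yaml::Value = serde_yaml::from_str(&raw)?;

    // Extract the essential details mentioned in the workflow
    // (title, description, endpoints); missing fields fall back to defaults.
    let template = serde_json::json!({
        "title": doc.get("title").and_then(|v| v.as_str()).unwrap_or("Untitled API"),
        "description": doc.get("description").and_then(|v| v.as_str()).unwrap_or(""),
        "endpoints": doc.get("endpoints").cloned().unwrap_or(serde_yaml::Value::Null),
    });

    // Write the structured JSON template to /api_templates/.
    fs::create_dir_all("api_templates")?;
    fs::write(
        "api_templates/weather.json",
        serde_json::to_string_pretty(&template)?,
    )?;
    Ok(())
}
```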

|
||||
|
||||
---
|
||||
|
||||
## **Future Automation with GitHub Actions**
|
||||
|
||||
In the future, we can fully automate this process:
|
||||
|
||||
- A GitHub Action will trigger on updates to `/apis/`.
|
||||
- It will run the Dart processing script automatically.
|
||||
- The action will commit the updated JSON templates back to the repository.
|
||||
- A GitHub Release will be generated periodically to bundle processed files for easier access.
|
||||
- This ensures **continuous and consistent updates** without manual intervention.
|
||||
|
||||
---
|
||||
|
||||
## **Conclusion**
|
||||
|
||||
This Approach provides a simple and controlled method for processing API definitions. The use of a **Dart cron job** reduces manual effort by fetching and processing updates on a scheduled basis, while **Hive storage** ensures fast offline access in the Flutter app. Using **GitHub Releases (ZIP)** allows fetching updates efficiently without hitting rate limits. Once validated, we can transition to **GitHub Actions** for complete automation. This approach aligns well with our project goals and scalability needs.
|
||||
|
||||
**I look forward to your feedback and suggestions on this approach. Thank you!**
|
||||
doc/proposals/2025/gsoc/images/API_EXPLORER_WORKFLOW.png (new binary file, 169 KiB)

@@ -945,11 +945,129 @@ TODO
## Rust (reqwest)

TODO

### 1. Download and Install Rust

#### **Windows**

1. Download and install `rustup` from the [Rustup official site](https://rustup.rs).
2. Run the installer (`rustup-init.exe`) and follow the instructions.
3. Restart your terminal (Command Prompt or PowerShell).
4. Verify the installation:
   ```sh
   rustc --version
   cargo --version
   ```

#### **macOS/Linux**

1. Run the following in your terminal:
   ```sh
   curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
   ```
   then follow the on-screen instructions.
2. Restart the terminal and verify:
   ```sh
   rustc --version
   cargo --version
   ```

> Note: If you prefer not to use rustup for some reason, please see the [Other Rust Installation Methods](https://forge.rust-lang.org/infra/other-installation-methods.html) page for more options.

### 2. Set Up a New Rust Project

1. Open a terminal and create a new Rust project:
   ```sh
   cargo new reqwest-demo
   ```
2. Navigate into the project directory:
   ```sh
   cd reqwest-demo
   ```
   or open this project directory in your preferred code editor.

### 3. Add Necessary Dependencies

Run the following commands to add the dependencies:

```sh
cargo add reqwest --features blocking,json
cargo add tokio --features full
```

- `blocking`: Enables synchronous (blocking) requests.
- `json`: Enables JSON handling for request and response bodies.
- `tokio`: A separate crate providing the async runtime required for asynchronous requests.

Run the following command to fetch and build the dependencies:

```sh
cargo build
```

### 4. Execute the Code

1. Copy the generated code from API Dash.
2. Paste the code into your project's `src/main.rs` file.

Run the generated code:

```sh
cargo run
```
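For reference, a minimal request of the kind you would paste into `src/main.rs` is sketched below, so you can confirm the project builds and runs before using your own generated code. The URL and header here are placeholders rather than API Dash output; the code API Dash generates for your request will differ.

```rust
// src/main.rs — minimal blocking GET request with reqwest.
// Placeholder URL/header: replace this file's contents with the code
// generated by API Dash for your own request.
use reqwest::blocking::Client;
use std::error::Error;

fn main() -> Result<(), Box<dyn Error>> {
    let client = Client::new();
    let response = client
        .get("https://api.example.com/data")
        .header("Accept", "application/json")
        .send()?;

    // Print the HTTP status code and the raw response body.
    println!("Status: {}", response.status());
    println!("Body:\n{}", response.text()?);
    Ok(())
}
```

An asynchronous variant would use `reqwest::Client` inside an `async fn main` annotated with `#[tokio::main]`, which is why the `tokio` dependency is added above.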
## Rust (ureq)

TODO

### 1. Download and Install Rust

#### **Windows**

1. Download and install `rustup` from the [Rustup official site](https://rustup.rs).
2. Run the installer (`rustup-init.exe`) and follow the instructions.
3. Restart your terminal (Command Prompt or PowerShell).
4. Verify the installation:
   ```sh
   rustc --version
   cargo --version
   ```

#### **macOS/Linux**

1. Run the following in your terminal:
   ```sh
   curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
   ```
   then follow the on-screen instructions.
2. Restart the terminal and verify:
   ```sh
   rustc --version
   cargo --version
   ```

> Note: If you prefer not to use rustup for some reason, please see the [Other Rust Installation Methods](https://forge.rust-lang.org/infra/other-installation-methods.html) page for more options.

### 2. Set Up a New Rust Project

1. Open a terminal and create a new Rust project:
   ```sh
   cargo new ureq-demo
   ```
2. Navigate into the project directory:
   ```sh
   cd ureq-demo
   ```
   or open this project directory in your preferred code editor.

### 3. Add the `ureq` Dependency

Run the following command to add the dependency:

```sh
cargo add ureq
```

Run the following command to fetch and build the dependency:

```sh
cargo build
```

### 4. Execute the Code

1. Copy the generated code from API Dash.
2. Paste the code into your project's `src/main.rs` file.

Run the generated code:

```sh
cargo run
```
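As with reqwest, a minimal request is sketched below for verifying the setup. The URL is a placeholder, and the sketch assumes the ureq 2.x API (the request-builder methods differ in newer major versions), so prefer the code API Dash generates for your request.

```rust
// src/main.rs — minimal synchronous GET request with ureq (2.x API).
// Placeholder URL: replace this file's contents with the code generated
// by API Dash for your own request.
use std::error::Error;

fn main() -> Result<(), Box<dyn Error>> {
    let response = ureq::get("https://api.example.com/data")
        .set("Accept", "application/json")
        .call()?;

    // Print the HTTP status code and the raw response body.
    println!("Status: {}", response.status());
    println!("Body:\n{}", response.into_string()?);
    Ok(())
}
```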
## Rust (Actix Client)