Merge branch 'main' into ChatBotFeature

SRINATH REDDY
2025-02-23 10:55:25 +05:30
committed by GitHub
5 changed files with 148 additions and 64 deletions

View File

@@ -11,12 +11,29 @@ You can contribute to the project in any or all of the following ways:
- Add documentation
- Add a new feature, resolve an existing issue, or add a new test to the project (go to [Code Contribution Guidelines](#code-contribution-guidelines)).
## Resources for New Contributors
- API Dash Code Walkthrough - [Video](https://www.youtube.com/live/rIlwCTKNz-A?si=iMxTxzkpY_ySo4Ow&t=339)
- Getting Started with Flutter - [Video](https://www.youtube.com/watch?v=8K2gV1P6ZHI)
- API Dash Developer Guide - [Read](https://github.com/foss42/apidash/blob/main/doc/dev_guide/README.md)
## Code Contribution Guidelines
### Why do we not assign issues to anyone?
- By not assigning issues upfront, anyone can feel welcome to contribute without feeling like the issue is already "taken."
- This also prevents discouraging new contributors who might feel locked out if issues are pre-assigned.
- Contributors are encouraged to pick issues that align with their skills and interests, and to take initiative rather than waiting for permission or being "assigned" work.
- Sometimes contributors express interest but never follow through. If issues are assigned prematurely, others might avoid working on them, delaying progress.
- Leaving issues unassigned ensures that work can proceed without bottlenecks if someone goes inactive.
- Open issues encourage community discussion and brainstorming. Prematurely assigning an issue can stifle input from others who might have better ideas or solutions.
- Open-source work is often voluntary, and contributors' availability can change. Keeping issues unassigned allows anyone to step in if the original contributor becomes unavailable. This also supports multiple contributors collaborating on larger or more complex issues.
### I have not contributed to any open source project before. Will I get any guidance?
In case you are new to the open source ecosystem, we would be more than happy to guide you through the entire process. Just join our [Discord server](https://bit.ly/heyfoss) and drop a message in the **#foss** channel.
## Code Contribution Guidelines
### Some things to keep in mind before opening a PR
> PRs with precise changes (like adding a new test, resolving a bug/issue, adding a new feature) are always preferred over a single PR with a ton of file changes as they are easier to review and merge.

View File

@@ -25,9 +25,24 @@ You can read more [here](https://docs.flutter.dev/platform-integration/macos/bui
In case you are having a local build failure on macOS due to "audio_session", check out issue https://github.com/foss42/apidash/issues/510 for the solution.
## Android (Work in Progress)
## Android
Add the `multiDexEnabled true` line to the `defaultConfig` section in the `android/app/build.gradle` file.
If you are targeting Android API level < 21, or the project and the libraries it references exceed 65,536 methods, you may encounter the following build error, which indicates that your app has hit a limit of the Android build architecture:
```
trouble writing output:
Too many field references: 131000; max is 65536.
You may try using --multi-dex option.
```
OR
```
Conversion to Dalvik format failed:
Unable to execute dex: method ID not in [0, 0xffff]: 65536
```
To solve this problem, add the `multiDexEnabled true` line to the `defaultConfig` section in the `android/app/build.gradle` file:
```
android {
@@ -39,7 +54,38 @@ android {
}
```
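For reference, here is a minimal sketch of how the relevant part of `android/app/build.gradle` might look after this change; the SDK version values below are only illustrative placeholders, and the actual edit is just the `multiDexEnabled true` line:
```
android {
    defaultConfig {
        // Keep whatever values your project already uses here.
        minSdkVersion 19
        targetSdkVersion 34
        // The line added to enable multidex.
        multiDexEnabled true
    }
}
```
If your `minSdkVersion` is 20 or lower, the multidex guide linked below also asks you to add the `androidx.multidex:multidex` library as a dependency.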
For more information on multidex support, you can refer to the Android developer guide on [Configuring Multidex](https://developer.android.com/studio/build/multidex).
If you are experiencing build failures while debugging due to Gradle/JDK/AGP version resolution issues, try upgrading the Gradle version via the CLI command:
```
gradle wrapper --gradle-version <latest compatible version>
```
In case the above command fails, edit the Gradle distribution reference in the `gradle/wrapper/gradle-wrapper.properties` file directly. The following example sets the Gradle version to 8.8:
```
...
distributionUrl = https\://services.gradle.org/distributions/gradle-8.8-bin.zip
...
```
Upgrade AGP by specifying the plugin version in the top-level `build.gradle` file. The following example sets the plugin version to 8.8.0:
```
plugins {
...
id 'com.android.application' version '8.8.0' apply false
id 'com.android.library' version '8.8.0' apply false
...
}
```
For more information on:
- Gradle and Java version compatibility, you can refer to [Compatibility Matrix](https://docs.gradle.org/current/userguide/compatibility.html).
- Gradle and Android Gradle Plugin compatibility, you can refer to [Update Gradle](https://developer.android.com/build/releases/gradle-plugin).
Note: Always ensure that your Gradle and AGP versions are compatible with your JDK version (not the other way around). Using at least JDK 17 is highly recommended.
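One quick way to check which Gradle and JVM versions your project is actually using (assuming the standard Flutter project layout, where the wrapper script lives in the `android` directory) is:
```
cd android
./gradlew --version
```
The output lists both the Gradle version and the JVM it runs on, which you can cross-check against the compatibility matrices linked above.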
## Web

View File

@@ -54,8 +54,7 @@ class ResponseDetails extends ConsumerWidget {
.watch(selectedRequestModelProvider.select((value) => value?.message));
final responseModel = ref.watch(selectedRequestModelProvider
.select((value) => value?.httpResponseModel));
return Column(
children: [
ResponsePaneHeader(

View File

@@ -2,15 +2,15 @@ import 'dart:convert';
import 'package:ollama_dart/ollama_dart.dart';
class OllamaService {
final OllamaClient _client;
OllamaService() : _client = OllamaClient(baseUrl: 'http://127.0.0.1:11434/api');
// Generate response
Future<String> generateResponse(String prompt) async {
final response = await _client.generateCompletion(
request: GenerateCompletionRequest(
model: 'llama3.2:3b',
model: 'llama3.2:1b',
prompt: prompt
),
);
@@ -100,6 +100,42 @@ Analysis: [structured analysis]'''
return generateResponse(prompt);
}
Future<String> generateTestCases({required dynamic requestModel, required dynamic responseModel}) async {
final method = requestModel.httpRequestModel?.method
.toString()
.split('.')
.last
.toUpperCase()
?? "GET";
final endpoint = requestModel.httpRequestModel?.url ?? "Unknown endpoint";
final headers = requestModel.httpRequestModel?.enabledHeadersMap ?? {};
final parameters = requestModel.httpRequestModel?.enabledParamsMap ?? {};
final body = requestModel.httpRequestModel?.body;
final responseBody = responseModel?.body;
final exampleParams = await generateExampleParams(
requestModel: requestModel,
responseModel: responseModel,
);
final prompt = '''
**API Request:**
- **Endpoint:** `$endpoint`
- **Method:** `$method`
- **Headers:** ${headers.isNotEmpty ? jsonEncode(headers) : "None"}
- **Parameters:** ${parameters.isNotEmpty ? jsonEncode(parameters) : "None"}
- **Body:** ${body ?? "None"}
Here is an example test case for the given request: $exampleParams
**Instructions:**
- Generate example parameter values for the request.
- Generate the URL exactly as provided in the API request.
- Generate the same type of test case URL for testing purposes.
''';
return generateResponse(prompt);
}
Future<Map<String, dynamic>> generateExampleParams({required dynamic requestModel, required dynamic responseModel,}) async {
final ollamaService = OllamaService();
@@ -130,15 +166,12 @@ Analyze the following API request and generate structured example parameters.
- **Parameters:** ${parameters.isNotEmpty ? jsonEncode(parameters) : "None"}
- **Body:** ${body ?? "None"}
**Response:**
- **Status Code:** ${responseModel?.statusCode ?? "Unknown"}
- **Response Body:** ${apiResponse != null ? jsonEncode(apiResponse) : rawResponse}
### **Required Output Format**
1. **Standard Example Values**: Assign the most appropriate example values for each parameter.
2. **Edge Cases**: Provide at least 2 edge cases per parameter.
3. **Invalid Cases**: Generate invalid inputs for error handling.
4. **Output must be in valid JSON format.**
**Instructions:**
- Generate example parameter values for the request.
- Generate the URL exactly as provided in the API request.
- Generate the same type of test case URL for testing purposes.
''';
// Force LLM to return structured JSON output

View File

@@ -5,28 +5,29 @@ import 'package:flutter_markdown/flutter_markdown.dart';
class ChatbotWidget extends ConsumerStatefulWidget {
const ChatbotWidget({Key? key}) : super(key: key);
@override
_ChatbotWidgetState createState() => _ChatbotWidgetState();
}
class _ChatbotWidgetState extends ConsumerState<ChatbotWidget> {
final TextEditingController _controller = TextEditingController();
final List<Map<String, dynamic>> _messages = [];
bool _isLoading = false;
void _sendMessage(String message) async {
if (message.trim().isEmpty) return;
final ollamaService = ref.read(ollamaServiceProvider);
final requestModel = ref.read(selectedRequestModelProvider);
final responseModel = requestModel?.httpResponseModel;
setState(() {
_messages.add({'role': 'user', 'message': message});
_controller.clear();
_isLoading = true;
});
try {
String response;
@@ -35,38 +36,26 @@ class _ChatbotWidgetState extends ConsumerState<ChatbotWidget> {
requestModel: requestModel,
responseModel: responseModel,
);
} else if (message == "Debug API") {
response = await ollamaService.debugApi(
requestModel: requestModel,
responseModel: responseModel,
);
} else if (message == "Generate Test Case") {
response = await ollamaService.generateTestCases(
requestModel: requestModel,
responseModel: responseModel
);
} else {
response = await ollamaService.generateResponse(message);
}
setState(() {
_messages.add({'role': 'bot', 'message': response});
});
} catch (error) {
setState(() {
_messages.add({'role': 'bot', 'message': "Error: ${error.toString()}"});
});
} finally {
setState(() => _isLoading = false);
}
}
@override
Widget build(BuildContext context) {
final requestModel = ref.watch(selectedRequestModelProvider);
final statusCode = requestModel?.httpResponseModel?.statusCode;
final showDebugButton = statusCode != null && statusCode >= 400;
return Container(
height: 400,
padding: const EdgeInsets.all(16),
@@ -86,6 +75,7 @@ class _ChatbotWidgetState extends ConsumerState<ChatbotWidget> {
icon: const Icon(Icons.info_outline),
label: const Text("Explain API"),
),
if (showDebugButton) ...[
const SizedBox(width: 8),
ElevatedButton.icon(
@@ -100,7 +90,6 @@ class _ChatbotWidgetState extends ConsumerState<ChatbotWidget> {
icon: const Icon(Icons.developer_mode),
label: const Text("Test Case"),
),
const Spacer(),
],
),
@@ -148,10 +137,10 @@ class _ChatbotWidgetState extends ConsumerState<ChatbotWidget> {
}
class ChatBubble extends StatelessWidget {
final String message;
final bool isUser;
const ChatBubble({super.key, required this.message, this.isUser = false});
@override
Widget build(BuildContext context) {
@@ -173,4 +162,4 @@ class ChatBubble extends StatelessWidget {
),
);
}
}