Replies: 6 comments 7 replies
-
Looks good. I'm still trying to think about how I should implement the Ollama API. I'd like the user to be able to control model settings like seed, temperature, etc. from the app without having to change them on the computer / server.
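For context, Ollama's `/api/generate` endpoint accepts an optional `options` object in the request body, so per-request overrides could be sent straight from the app. A sketch of such a payload (the values here are illustrative):

```json
{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "options": {
    "seed": 42,
    "temperature": 0.8,
    "num_predict": 128
  }
}
```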
-
It makes sense to prioritize improving the app's stability.
-
I'm writing this just to inform you in case you're not already aware: setting `Environment="OLLAMA_HOST=0.0.0.0:8000"` makes Ollama listen on all network interfaces (0.0.0.0), enabling sharing within your local network, so you can test it on mobile too.
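For reference, on a Linux install managed by systemd, that line would go in a drop-in override for the service (a sketch, assuming the standard install's `ollama.service` unit name):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0:8000"
```

After saving, reload and restart with `systemctl daemon-reload` and `systemctl restart ollama`.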
-
Yeah, that's good. Currently I have it working in a very basic manner; no model params are passed yet, and the example prompts can't be used yet.
-
I came across the suggestion in the comments to make the API models configurable and shared an example in an attempt to assist. You can find it here: main...sfiannaca:Maid:dev-api. However, I hadn't noticed that you had already added the switch inside the drawer, which is also a good approach.
-
I'm waiting for ollama/ollama#991 to be pushed through to do a proper release with Ollama support. |
-
Just for fun, I played with HTTP calls and crafted this raw and rough Dart file that accesses the Ollama API to retrieve the available models and stream a generate response.
```dart
import 'package:http/http.dart' as http;
import 'dart:async';
import 'dart:convert';

class ApiConfig {
  String type;
  String model;
  bool enabled;
  String urlGen;
  String urlModels;

  ApiConfig({
    required this.type,
    required this.model,
    required this.enabled,
    required this.urlGen,
    required this.urlModels,
  });
}

ApiConfig apiConfig = ApiConfig(
  type: "OLLAMA",
  model: "llama2",
  enabled: false,
  urlGen: "http://192.168.1.102:8000/api/generate", // CHANGE IP AND PORT
  urlModels: "http://192.168.1.102:8000/api/tags", // CHANGE IP AND PORT
);

void main() async {
  // Ollama streams /api/generate responses as newline-delimited JSON,
  // so decode and yield one map per line as it arrives.
  Stream<Map<String, dynamic>> gen(String prompt) async* {
    final request = http.Request('POST', Uri.parse(apiConfig.urlGen));
    request.headers['Content-Type'] = 'application/json';
    request.body = jsonEncode({"model": apiConfig.model, "prompt": prompt});
    final response = await http.Client().send(request);
    await for (final line in response.stream
        .transform(utf8.decoder)
        .transform(const LineSplitter())) {
      if (line.trim().isNotEmpty) {
        yield jsonDecode(line) as Map<String, dynamic>;
      }
    }
  }

  // /api/tags lists the models available on the server.
  Future<dynamic> getModels() async {
    final response = await http.get(Uri.parse(apiConfig.urlModels));
    return jsonDecode(response.body);
  }

  final models = await getModels();
  print("Models: $models");

  await for (final data in gen("Why is the sky blue?")) {
    print('Received data: $data');
  }
  print("API request completed.");
}
```
Also, there is another third-party library that can be used: github.com/davidmigloz/langchain_dart