NTUOSS Flutter Workshop

By Zayn Jarvis from NTU Open Source Society

Artwork by Chuan Bin


Workshop Details
When Friday, 26 Oct 2018. 6:30 PM - 8:30 PM
Where LT1, NTU North Spine Plaza
Who NTU Open Source Society
Questions We will be hosting a Pigeonhole Live for collecting questions regarding the workshop

Feedback & Error Reports: We will send out the link for collecting feedback as usual. For further discussion or cooperation, please contact zaynjarvis@gmail.com.

Disclaimer: This workshop is for educational purposes only. The Redux framework is forked from brianegan/flutter_architecture_samples, and information regarding Flutter concepts is retrieved from the Flutter documentation. No prototype or outcome of any type is intended for commercial use.


Setup


Agenda


Live Streaming: NTUOSS-FlutterWorkshop

Flutter

Use git checkout <branch-name> to get the source code of the different apps.


HELLO WORLD


With Flutter installed, check that everything is set up correctly with

$ flutter doctor
Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel beta, v0.9.4, on Mac OS X 10.14 18A391, locale en-SG)
[✓] Android toolchain - develop for Android devices (Android SDK 28.0.1)
[✓] iOS toolchain - develop for iOS devices (Xcode 10.0)
[✓] Android Studio (version 3.1)
[✓] IntelliJ IDEA Community Edition (version 2018.2)
[✓] VS Code (version 1.28.2)
[✓] Connected devices (1 available)

• No issues found!

If you get a "no connected devices" error, connect your phone to your computer. (iOS developers: make sure your device has trusted your computer before installing the app on it. Read here.)

You can connect a simulator (Xcode) or an emulator (Android Studio) for development as well.

Then we can get started by running

$ flutter create myapp

This command will create a Flutter project folder named myapp for you.

Then we can change into the project folder and run the project:

$ cd myapp
$ flutter run

Now you should have a counter app running on your device.
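
The generated lib/main.dart contains that counter app. Below is a trimmed-down sketch of what it looks like; the widget names here (CounterPage) are ours for brevity, and the real template includes more comments and theming.

import 'package:flutter/material.dart';

// A stripped-down version of the counter app that flutter create generates.
void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Hello World',
      home: CounterPage(),
    );
  }
}

class CounterPage extends StatefulWidget {
  @override
  _CounterPageState createState() => _CounterPageState();
}

class _CounterPageState extends State<CounterPage> {
  int _counter = 0; // The only piece of state in the app.

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: Text('Hello World')),
      body: Center(child: Text('Button pressed $_counter times')),
      floatingActionButton: FloatingActionButton(
        onPressed: () => setState(() => _counter++), // Triggers a rebuild.
        child: Icon(Icons.add),
      ),
    );
  }
}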


CURRENCY

$ # > To view the source code.
$ git checkout feature/currencyapp
$ git log --oneline
e5c3b21 integrate with redux framework
d097c30 local database integration
35dc753 slidable component with delete and rebase function
34bf7f7 add price controller
67e5f8a http request for country exchange rate
40c264c currency app layout
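
To get a feel for the "http request for country exchange rate" commit, here is a minimal sketch of fetching a rate with the http package. The endpoint URL and JSON shape are made up for illustration; the app's actual API and parsing code differ.

import 'dart:convert';

import 'package:http/http.dart' as http;

// Hypothetical endpoint and response shape, for illustration only.
Future<double> fetchRate(String base, String target) async {
  final url = Uri.parse('https://api.example.com/latest?base=$base');
  final response = await http.get(url);
  if (response.statusCode != 200) {
    throw Exception('Failed to load exchange rate: ${response.statusCode}');
  }
  final data = json.decode(response.body) as Map<String, dynamic>;
  // Expecting something like {"rates": {"SGD": 1.38, ...}}.
  return (data['rates'][target] as num).toDouble();
}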

TODO

$ # > To view the source code.
$ cd .. # Go back to the parent directory
$ git clone https://github.com/brianegan/flutter_architecture_samples.git
$ cd flutter_architecture_samples/example/firestore_redux
$ code . # if you have VS Code installed

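Before exploring the sample, it helps to see the core Redux idea in Dart: a single store holds the app state, and reducers compute a new state from dispatched actions. Here is a minimal sketch using the redux package; the sample's real state, actions, and reducers are richer.

import 'package:redux/redux.dart';

// Illustrative action; the samples define richer action classes.
class AddTodoAction {
  final String title;
  AddTodoAction(this.title);
}

// A reducer is a pure function: (state, action) -> new state.
List<String> todosReducer(List<String> todos, dynamic action) {
  if (action is AddTodoAction) {
    return List.of(todos)..add(action.title);
  }
  return todos;
}

void main() {
  final store = Store<List<String>>(todosReducer, initialState: []);
  store.onChange.listen(print); // Fires after every dispatched action.
  store.dispatch(AddTodoAction('Learn Flutter'));
  store.dispatch(AddTodoAction('Try Redux'));
}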


VOICE RECOGNITION

We can make use of MethodChannel on both platforms.

Check out the repository here

In Flutter (Dart), we create the channel we will use, in this case with the generic name speech_recognizer. Then we define the functions that call into the native APIs.

import 'dart:async';

import 'package:flutter/services.dart';

const MethodChannel _speech_channel =
    const MethodChannel("speech_recognizer");

class SpeechRecognizer {
  static void setMethodCallHandler(handler) {
    _speech_channel.setMethodCallHandler(handler);
  }

  static Future activate() {
    return _speech_channel.invokeMethod("activate");
  }

  static Future start(String lang) {
    return _speech_channel.invokeMethod("start", lang);
  }

  static Future cancel() {
    return _speech_channel.invokeMethod("cancel");
  }

  static Future stop() {
    return _speech_channel.invokeMethod("stop");
  }
}
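
In the same file, this wrapper might be used roughly as follows. The callback method names coming back from the platform side (speech.onSpeech, speech.onRecognitionComplete) are assumptions for illustration; check the repository for the actual ones.

// Hypothetical usage of the SpeechRecognizer wrapper above.
Future<void> listenOnce() async {
  SpeechRecognizer.setMethodCallHandler((MethodCall call) async {
    // Method names sent back by the platform side are assumed here.
    if (call.method == 'speech.onSpeech') {
      print('partial transcription: ${call.arguments}');
    } else if (call.method == 'speech.onRecognitionComplete') {
      print('final transcription: ${call.arguments}');
    }
  });
  await SpeechRecognizer.activate();     // Set up / request permissions.
  await SpeechRecognizer.start('en_US'); // Start listening in English.
}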

In Android (Java), we connect to the speech_recognizer channel to receive messages, then handle the method calls received from Flutter.

public class MainActivity extends FlutterActivity implements RecognitionListener {

    private static final String SPEECH_CHANNEL = "speech_recognizer";
    private static final String LOG_TAG = "SPEAKTEST";
    private SpeechRecognizer speech;
    private MethodChannel speechChannel;
    String transcription = "";
    private boolean cancelled = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        GeneratedPluginRegistrant.registerWith(this);

        speech = SpeechRecognizer.createSpeechRecognizer(getApplicationContext());
        speech.setRecognitionListener(this);

        final Intent recognizerIntent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        recognizerIntent.putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true);
        recognizerIntent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 3);

        speechChannel = new MethodChannel(getFlutterView(), SPEECH_CHANNEL);
        speechChannel.setMethodCallHandler(
                new MethodChannel.MethodCallHandler() {
                    @Override
                    public void onMethodCall(MethodCall call, MethodChannel.Result result) {
                        switch(call.method){
                            case "activate":
                                result.success(true);
                                break;
                            case "start":
                                cancelled = false;
                                speech.startListening(recognizerIntent);
                                result.success(true);
                                break;
                            case "cancel":
                                speech.stopListening();
                                cancelled = true;
                                result.success(true);
                                break;
                            case "stop":
                                speech.stopListening();
                                cancelled = false;
                                result.success(true);
                                break;
                            default:
                                result.notImplemented();
                        }
                    }
                }
        );
    }
}
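
In iOS (Swift), we do the same: register a handler on the speech_recognizer channel and dispatch the incoming method calls to the native speech APIs.
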
import UIKit
import Flutter
import Speech

@UIApplicationMain
class AppDelegate: FlutterAppDelegate, SFSpeechRecognizerDelegate {

  private let speechRecognizerEn = SFSpeechRecognizer(locale: Locale(identifier: "en_US"))!

  private var speechChannel: FlutterMethodChannel?

  private var recognitionRequest: SFSpeechAudioBufferRecognitionRequest?

  private var recognitionTask: SFSpeechRecognitionTask?

  private let audioEngine = AVAudioEngine()

  override func application(
     _ application: UIApplication,
     didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {

    let controller: FlutterViewController = window?.rootViewController as! FlutterViewController

    speechChannel = FlutterMethodChannel.init(name: "speech_recognizer",
       binaryMessenger: controller)
    speechChannel!.setMethodCallHandler({
      (call: FlutterMethodCall, result: @escaping FlutterResult) -> Void in
      if ("start" == call.method) {
        self.startRecognition(lang: call.arguments as! String, result: result)
      } else if ("stop" == call.method) {
        self.stopRecognition(result: result)
      } else if ("cancel" == call.method) {
        self.cancelRecognition(result: result)
      } else if ("activate" == call.method) {
        self.activateRecognition(result: result)
      } else {
        result(FlutterMethodNotImplemented)
      }
    })
    return true
  }
}