Important

Created by Preternatural AI, an exhaustive client-side AI infrastructure for Swift.
This project and the frameworks used are presently in an alpha stage of development.

Swift Code Generator: Generate Sample Code on Demand

The Swift Code Generator is a demo app that lets users input natural language descriptions of desired Swift functionality and receive corresponding Swift code snippets. The app demonstrates a clever prompting technique: leveraging the Claude 3.5 Sonnet model, it starts the assistant's response with a code block marker and uses a stop sequence to ensure that only the relevant code is returned. This method effectively filters out explanatory text, focusing solely on the generated Swift code - a technique that can also be applied to other use cases.

MIT License

Table of Contents

Usage

Supported Platforms

macOS | iOS | iPadOS

To install and run the SwiftCodeGenerator app:

  1. Download and open the project in Xcode
  2. Enter your Anthropic API Key in LLMManager
// LLMManager.swift
static let client = Anthropic.Client(apiKey: "YOUR API KEY")

You can get an Anthropic API key on the Anthropic developer website. Note that you have to set up billing and add a small amount of credit for the API calls to work (this will cost you less than one dollar).

  3. Run the project on a Mac, iPad, or iPhone
  4. Describe the Swift code you would like generated, for example "make a green button"
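For illustration, the generated response to that request might look roughly like the SwiftUI snippet below (a hypothetical output; the exact code Claude returns will vary):

// Illustrative example of generated output (not an actual Claude response)
import SwiftUI

struct GreenButton: View {
    var body: some View {
        Button(action: {
            // Handle the tap here
            print("Button tapped")
        }) {
            Text("Tap Me")
                .font(.headline)
                .foregroundColor(.white)
                .padding()
                .background(Color.green)
                .cornerRadius(10)
        }
    }
}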

Key Concepts

The Swift Code Generator app demonstrates how to work with Anthropic's LLM completions API by specifying both the start and the end of the desired output.

Preternatural Frameworks

The following Preternatural Frameworks were used in this project:

  • AI: The definitive, open-source Swift framework for interfacing with generative AI.

Technical Specifications

Large Language Models (LLMs) in their current form are notorious for being too verbose. For example, when Claude is asked a direct question such as "Who was the first US president?", it gives this full response instead of the simple "George Washington" answer we would expect:

(Screenshot: Claude's verbose answer to "Who was the first US president?")

This presents a problem for us as developers. We might want the LLM to give a direct response such as "George Washington", but instead we have to work around these long, verbose answers. The Swift Code Generator demonstrates one strategy for doing this when generating Swift code in response to a user request.

As shown in the Usage example above, the goal is to generate ONLY Swift code, with none of the verbosity that Claude usually wraps around it, as seen in this response to the same query:

(Screenshot: Claude's verbose response to the same code-generation request)

So how do we approach this?

The first step is to write the system and user prompts with the basic instructions. You can check the LLMManager file for the full implementation:

// LLMManager.swift

let systemPrompt: PromptLiteral = """
You are a Swift code generation AI. Your sole purpose is to produce Swift code in response to user requests. Adhere to these guidelines:
1. Generate only Swift code.
2. Ensure the code is complete, correct, and follows Swift best practices.
3. Include necessary import statements.
4. Use the latest Swift syntax and idioms.
5. Optimize for clarity and efficiency.
"""

let userPrompt: PromptLiteral = """
Generate Swift code for the following task:

\(userInput)
"""

Although not explicitly seen in this specific Claude answer, the LLM is trained on many Markdown files, so we can assume that it knows the Markdown syntax for Swift code:

```swift
SWIFT CODE
```

Some LLMs, including Claude, allow you to provide the start of the assistant's reply. In this case, we can prompt Claude to start its response to our user prompt with the Markdown code fence:

let assistantStart: PromptLiteral = """
```swift
"""

let messages: [AbstractLLM.ChatMessage] = [
    .system(systemPrompt),
    .user(userPrompt),
    .assistant(assistantStart)
]

Now, instead of answering the user query with something like “Certainly, I can help you create a purple button in Swift. Here's a simple implementation using SwiftUI:”, it will directly start by completing the markdown, which forces Claude to write Swift code right away.

However, Claude can still keep writing explanations of the code after it finishes the Swift code. To prevent this, we can use a stop sequence, "```", to tell Claude when to STOP generating anything further:

let parameters = AbstractLLM.ChatCompletionParameters(
        tokenLimit: nil,
        temperature: nil,
        stops: ["```"], // the stop sequence is the end of the markdown "```"
        functions: nil)

Now Claude is forced to start its assistant reply with "```swift" and to stop generating as soon as it reaches "```" - this ensures that only Swift code is generated and no verbosity around it! Note that the stop sequence itself is NOT included in Claude's reply - it is automatically stripped out. So the final response will only be:

```swift
SWIFT CODE

We therefore still have to remove the leading "```swift" part of the response:

let code: String = try await client.complete(
    messages,
    parameters: parameters,
    model: model,
    as: .string)

let trimmedCode = code.trimmingCharacters(in: .whitespacesAndNewlines)
let processedCode = trimmedCode.replacingOccurrences(of: "```swift\n", with: "")

We can now process the Swift code string as an AttributedString and display it in our app. The same technique can be used for other use cases where the start and end of the desired LLM response are known.
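As a minimal sketch of that last step (the app's actual rendering lives in the project and may use richer syntax highlighting), the processed string can be wrapped in an AttributedString with a monospaced font and displayed in a SwiftUI view; the CodeView name below is made up for illustration:

// CodeView.swift (illustrative sketch, not the app's actual implementation)
import SwiftUI

struct CodeView: View {
    let code: String

    var body: some View {
        ScrollView {
            Text(attributedCode)
                .frame(maxWidth: .infinity, alignment: .leading)
                .padding()
        }
    }

    // Wrap the generated Swift code in an AttributedString with a monospaced font.
    private var attributedCode: AttributedString {
        var attributed = AttributedString(code)
        attributed.font = .system(.body, design: .monospaced)
        return attributed
    }
}

The same prefill-and-stop pattern can be adapted to other formats as well, for example starting the assistant's reply with "```json" and keeping "```" as the stop sequence to receive JSON-only output.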


License

This package is licensed under the MIT License.
