Conversation

@luoling8192 (Member) commented Dec 29, 2025

Depends on #848.

@gemini-code-assist (Contributor)

Summary of Changes

Hello @luoling8192, I'm Gemini Code Assist[1]! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly enhances the application's architecture by introducing a new, dedicated API server. This server centralizes authentication and provides robust CRUD functionalities for managing AI characters and their associated data models. The existing web application has been updated to seamlessly integrate with these new backend services, offering users a streamlined experience for authentication and character configuration. This foundational work paves the way for a more scalable and feature-rich platform.

Highlights

  • New API Server Introduction: A dedicated API server has been added under apps/server, complete with its own Dockerfile, Docker Compose configuration for local development, and Railway deployment settings. This server is built using Hono, a lightweight web framework.
  • Comprehensive Database Integration: Drizzle ORM is now integrated with PostgreSQL, defining schemas and migrations for core application entities. This includes authentication-related tables (users, sessions, accounts) and new data models for AI characters, their capabilities, internationalization, avatar models, chats, messages, and media.
  • Robust Authentication System: The server incorporates better-auth for user authentication, supporting social logins (Google, GitHub) and session management. Custom Hono middleware ensures secure session handling and authentication guarding across API routes; a hedged sketch of such a guard follows this list.
  • Character Management API: A full RESTful API has been implemented for CRUD (Create, Read, Update, Delete) operations on AI characters. This API includes detailed input validation using Valibot for character properties, capabilities (like LLM and TTS configurations), and internationalization settings.
  • Web Application UI Updates: The apps/stage-web application has been significantly updated to consume the new API server. This includes a new login page with social authentication options, a user avatar dropdown for managing authentication actions, and a dedicated 'Characters' settings page for users to create, view, edit, and delete their AI characters.
  • Monorepo Dependency Refactoring: The pnpm-workspace.yaml and various package.json files have been updated to reflect the new project structure and dependencies, ensuring proper management of packages like better-auth, hono, drizzle-orm, valibot, and injeca across the monorepo.
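
For reviewers skimming the authentication highlight above, here is a minimal sketch of what an auth-guard middleware in Hono typically looks like when backed by better-auth. The `auth` instance, the `session` context key, and the error shape are illustrative assumptions, not the exact code in this pull request.

```ts
// Hypothetical auth guard for Hono routes, assuming a better-auth instance named `auth`.
import { createMiddleware } from 'hono/factory'

import { auth } from './auth' // assumed module that exports the better-auth instance

export const requireAuth = createMiddleware<{ Variables: { session: unknown } }>(async (c, next) => {
  // better-auth resolves the session from the incoming request headers (cookies).
  const session = await auth.api.getSession({ headers: c.req.raw.headers })

  if (!session)
    return c.json({ error: 'Unauthorized' }, 401)

  // Expose the session to downstream handlers, then continue the chain.
  c.set('session', session)
  await next()
})
```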
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

Feature | Command | Description
Code Review | /gemini review | Performs a code review for the current pull request in its current state.
Pull Request Summary | /gemini summary | Provides a summary of the current pull request in its current state.
Comment | @gemini-code-assist | Responds in comments when explicitly tagged, both in pull request comments and review comments.
Help | /gemini help | Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder at the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

@luoling8192 changed the title from "feat(server): with api server, service-lize (#807)" to "feat(stage-ui): character settings page" on Dec 29, 2025
@gemini-code-assist bot left a comment

Code Review

This pull request introduces a new API server for the application, complete with database schemas, services, authentication, and API routes. It also adds corresponding frontend components for user authentication and character management. The changes are extensive and well-structured, using dependency injection, service-oriented architecture, and clear separation of concerns. However, there are several critical and high-severity issues that need to be addressed, particularly in the Docker configuration, dependency versions, database connection handling, and security settings. There are also opportunities to improve data integrity in the database schema and reduce code duplication between the frontend and backend. One security-related comment has been updated to reference a rule regarding environment variable validation on startup.

timeout: 5s
retries: 10
volumes:
- ${serviceName}_data:/var/lib/postgresql/data

critical

The volume for the db service is defined as ${serviceName}_data, but the named volume at the root of the file is db_data. The serviceName variable is not defined in this file's context, so this will result in a volume named _data being created, and the db_data volume will be unused. This will cause the database to lose all its data if the container is removed and recreated.

      - db_data:/var/lib/postgresql/data

Comment on lines +19 to +29
"drizzle-orm": "^0.44.7",
"drizzle-valibot": "catalog:",
"hono": "^4.10.7",
"injeca": "catalog:",
"postgres": "^3.4.7",
"tsx": "^4.21.0",
"valibot": "catalog:"
},
"devDependencies": {
"@better-auth/cli": "^1.4.5",
"drizzle-kit": "^0.31.7"

critical

The versions specified for drizzle-orm (^0.44.7) and drizzle-kit (^0.31.7) do not appear to exist in the public npm registry. This will likely cause the pnpm install command to fail for anyone without access to a private registry where these versions might be hosted. Please verify that these versions are correct. If they are typos, they should be corrected to valid, publicly available versions.

Comment on lines +20 to +27
auth = {
$Infer: {
Session: {
user: {},
session: {},
},
},
}

critical

The auth variable is assigned a value here but it is never declared within the scope of the test suite. This will cause a ReferenceError when running the tests. It should be declared with let auth: any; at the beginning of the describe block.
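
As a minimal illustration of that fix, a declaration at the top of the suite removes the ReferenceError. The vitest imports and test names below are assumptions; only the up-front `let auth: any` declaration is the point.

```ts
// Hypothetical test skeleton; only the declaration of `auth` is the suggested fix.
import { beforeAll, describe, expect, it } from 'vitest'

describe('auth session inference', () => {
  let auth: any // declare first, so the assignment below does not throw a ReferenceError

  beforeAll(() => {
    auth = {
      $Infer: {
        Session: {
          user: {},
          session: {},
        },
      },
    }
  })

  it('exposes the inferred session shape', () => {
    expect(auth.$Infer.Session.user).toBeDefined()
  })
})
```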

Comment on lines +31 to +40
build: ({ dependsOn }) => {
const dbInstance = createDrizzle(dependsOn.env.DATABASE_URL, schema)
dbInstance.execute('SELECT 1')
.then(() => logger.log('Connected to database'))
.catch((err) => {
logger.withError(err).error('Failed to connect to database')
exit(1)
})
return dbInstance
},

high

The database connection check is performed asynchronously with .then() and .catch(), but the build function for the services:db provider is synchronous and returns immediately. This creates a race condition where the application might start handling requests before the database connection is confirmed to be healthy. The build function should be async and await the connection check to ensure the database is ready before the service is considered built.

    build: async ({ dependsOn }) => {
      const dbInstance = createDrizzle(dependsOn.env.DATABASE_URL, schema)
      try {
        await dbInstance.execute('SELECT 1')
        logger.log('Connected to database')
      } catch (err) {
        logger.withError(err).error('Failed to connect to database')
        exit(1)
      }
      return dbInstance
    },

Comment on lines +24 to +55
updatedAt: timestamp('updated_at')
.$onUpdate(() => /* @__PURE__ */ new Date())
.notNull(),
ipAddress: text('ip_address'),
userAgent: text('user_agent'),
userId: text('user_id')
.notNull()
.references(() => user.id, { onDelete: 'cascade' }),
},
table => [index('session_userId_idx').on(table.userId)],
)

export const account = pgTable(
'account',
{
id: text('id').primaryKey(),
accountId: text('account_id').notNull(),
providerId: text('provider_id').notNull(),
userId: text('user_id')
.notNull()
.references(() => user.id, { onDelete: 'cascade' }),
accessToken: text('access_token'),
refreshToken: text('refresh_token'),
idToken: text('id_token'),
accessTokenExpiresAt: timestamp('access_token_expires_at'),
refreshTokenExpiresAt: timestamp('refresh_token_expires_at'),
scope: text('scope'),
password: text('password'),
createdAt: timestamp('created_at').defaultNow().notNull(),
updatedAt: timestamp('updated_at')
.$onUpdate(() => /* @__PURE__ */ new Date())
.notNull(),

high

The updatedAt columns in the session and account tables are defined as NOT NULL and have an $onUpdate hook, but they are missing a .defaultNow() call. This means that on INSERT, if a value for updatedAt is not explicitly provided, the database will try to insert NULL, which violates the NOT NULL constraint. You should add .defaultNow() to these columns for consistency with other tables and to prevent insertion errors.
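
A minimal sketch of the corrected column definition, using the drizzle-orm/pg-core builders this PR already depends on; the table shown is illustrative, not the actual schema file:

```ts
import { pgTable, text, timestamp } from 'drizzle-orm/pg-core'

// Hypothetical table showing the suggested combination: defaultNow() covers INSERTs,
// $onUpdate() refreshes the value on UPDATEs, so notNull() can never be violated.
export const session = pgTable('session', {
  id: text('id').primaryKey(),
  updatedAt: timestamp('updated_at')
    .defaultNow()
    .$onUpdate(() => new Date())
    .notNull(),
})
```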

},

baseURL: process.env.API_SERVER_URL || 'http://localhost:3000',
trustedOrigins: ['*'],

high

Setting trustedOrigins: ['*'] disables CSRF protection provided by better-auth by allowing any origin. This is a significant security risk. While the CORS middleware in app.ts is configured correctly with getTrustedOrigin, this better-auth setting should also be made secure. You should use a specific list of trusted origins, preferably from an environment variable, and ensure a fallback is provided to prevent issues from undefined values.

Suggested change
trustedOrigins: ['*'],
trustedOrigins: [process.env.WEB_URL || 'http://localhost:5173'],
References
  1. Validate environment variables on startup to ensure all required variables are present, preventing runtime errors from undefined values.
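
A hedged sketch of what startup validation for the origin list could look like with the valibot dependency this PR already adds; the WEB_URL variable name and file location are assumptions:

```ts
import * as v from 'valibot'

// Fail fast at startup: a missing or malformed WEB_URL should stop the server
// rather than silently widen the trusted-origin list.
const EnvSchema = v.object({
  WEB_URL: v.pipe(v.string(), v.url()),
})

const env = v.parse(EnvSchema, { WEB_URL: process.env.WEB_URL })

export const trustedOrigins = [env.WEB_URL]
```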

Comment on lines +30 to +98
updatedAt: timestamp('updated_at').defaultNow().notNull(),
deletedAt: timestamp('deleted_at'),
},
)

export type Character = InferSelectModel<typeof character>
export type NewCharacter = InferInsertModel<typeof character>

export const avatarModel = pgTable(
'avatar_model',
{
id: text('id').primaryKey().$defaultFn(() => nanoid()),
characterId: text('character_id').notNull().references(() => character.id, { onDelete: 'cascade' }),
name: text('name').notNull(),
type: text('type').notNull().$type<keyof AvatarModelConfig>(),

description: text('description').notNull(),

config: jsonb('config').notNull().$type<AvatarModelConfig[keyof AvatarModelConfig]>(),
createdAt: timestamp('created_at').defaultNow().notNull(),
updatedAt: timestamp('updated_at').defaultNow().notNull(),
deletedAt: timestamp('deleted_at'),
},
)

export type AvatarModel = InferSelectModel<typeof avatarModel>
export type NewAvatarModel = InferInsertModel<typeof avatarModel>

export const characterCapabilities = pgTable(
'character_capabilities',
{
id: text('id').primaryKey().$defaultFn(() => nanoid()),
characterId: text('character_id').notNull().references(() => character.id, { onDelete: 'cascade' }),

type: text('type').notNull().$type<keyof CharacterCapabilityConfig>(),

config: jsonb('config').notNull().$type<CharacterCapabilityConfig[keyof CharacterCapabilityConfig]>(),
},
)

export type CharacterCapability = InferSelectModel<typeof characterCapabilities>
export type NewCharacterCapability = InferInsertModel<typeof characterCapabilities>

export const characterI18n = pgTable(
'character_i18n',
{
id: text('id').primaryKey().$defaultFn(() => nanoid()),
characterId: text('character_id').notNull().references(() => character.id, { onDelete: 'cascade' }),

language: text('language').notNull(),

name: text('name').notNull(),
description: text('description').notNull(),
tags: text('tags').array().notNull(),

// TODO: Implement the system prompt
// systemPrompt: text('system_prompt').notNull(),
// TODO: Implement the personality
// personality: text('personality').notNull(),

// TODO: Implement the initial memories
// initialMemories: text('initial_memories').array().notNull(),

// TODO: greetings?
// TODO: notes?
// TODO: metadata?

createdAt: timestamp('created_at').defaultNow().notNull(),
updatedAt: timestamp('updated_at').defaultNow().notNull(),

medium

The updatedAt columns in the character, avatarModel, and characterI18n tables are missing the $onUpdate hook. While the service layer currently handles updating this field manually, it's best practice to define this behavior at the schema level for consistency and to prevent accidental omissions in the future. Please add .$onUpdate(() => new Date()) to these columns.
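
One way to keep this consistent across tables is a shared column helper spread into each pgTable call. This is a sketch under the assumption that every table wants the same three columns, not the project's current layout:

```ts
import { timestamp } from 'drizzle-orm/pg-core'

// Hypothetical shared columns: spread `...timestamps` into each pgTable definition
// so defaultNow() and $onUpdate() cannot be forgotten on a new table.
export const timestamps = {
  createdAt: timestamp('created_at').defaultNow().notNull(),
  updatedAt: timestamp('updated_at')
    .defaultNow()
    .$onUpdate(() => new Date())
    .notNull(),
  deletedAt: timestamp('deleted_at'),
}
```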

Comment on lines +75 to +76
replyToMessageId: text('reply_message_id'),
forwardFromMessageId: text('forward_from_message_id'),

medium

The replyToMessageId and forwardFromMessageId columns in the messages table are defined as text but seem to reference another message's ID. For data integrity, these should be defined as foreign keys referencing messages.id. This would prevent dangling references where a message refers to a reply/forward that no longer exists.

Suggested change
replyToMessageId: text('reply_message_id'),
forwardFromMessageId: text('forward_from_message_id'),
replyToMessageId: text('reply_message_id').references(() => messages.id),
forwardFromMessageId: text('forward_from_message_id').references(() => messages.id),
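
One detail worth noting for the suggested change: because these columns reference their own table, Drizzle needs an explicit column return type to break the circular type inference. A hedged sketch, with the surrounding columns trimmed:

```ts
import { type AnyPgColumn, pgTable, text } from 'drizzle-orm/pg-core'

export const messages = pgTable('messages', {
  id: text('id').primaryKey(),
  // Self-referencing foreign keys need the explicit AnyPgColumn return type,
  // otherwise TypeScript reports a circular reference on `messages`.
  replyToMessageId: text('reply_message_id').references((): AnyPgColumn => messages.id),
  forwardFromMessageId: text('forward_from_message_id').references((): AnyPgColumn => messages.id),
})
```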

Comment on lines +1 to +168
import type { InferOutput } from 'valibot'

import { array, date, literal, number, object, optional, pipe, string, transform, union } from 'valibot'

// --- Enums & Configs ---

export const AvatarModelConfigSchema = object({
vrm: optional(object({
urls: array(string()),
})),
live2d: optional(object({
urls: array(string()),
})),
})

export const CharacterCapabilityConfigSchema = object({
apiKey: string(),
apiBaseUrl: string(),
llm: optional(object({
temperature: number(),
model: string(),
})),
tts: optional(object({
ssml: string(),
voiceId: string(),
speed: number(),
pitch: number(),
})),
vlm: optional(object({
image: string(),
})),
asr: optional(object({
audio: string(),
})),
})

const CharacterCapabilityTypeSchema = union([
literal('llm'),
literal('tts'),
literal('vlm'),
literal('asr'),
])

const AvatarModelTypeSchema = union([
literal('vrm'),
literal('live2d'),
])

const PromptTypeSchema = union([
literal('system'),
literal('personality'),
literal('greetings'),
])

const DateSchema = pipe(
union([string(), date()]),
transform(v => new Date(v)),
)

// --- Base Entities (mimicking database tables) ---

export const CharacterBaseSchema = object({
id: string(),
version: string(),
coverUrl: string(),
creatorId: string(),
ownerId: string(),
characterId: string(),
createdAt: DateSchema,
updatedAt: DateSchema,
})

export const CharacterCapabilitySchema = object({
id: string(),
characterId: string(),
type: CharacterCapabilityTypeSchema,
config: CharacterCapabilityConfigSchema,
})

export const AvatarModelSchema = object({
id: string(),
characterId: string(),
name: string(),
type: AvatarModelTypeSchema,
description: string(),
config: AvatarModelConfigSchema,
createdAt: DateSchema,
updatedAt: DateSchema,
})

export const CharacterI18nSchema = object({
id: string(),
characterId: string(),
language: string(),
name: string(),
description: string(),
tags: array(string()),
createdAt: DateSchema,
updatedAt: DateSchema,
})

export const CharacterPromptSchema = object({
id: string(),
characterId: string(),
language: string(),
type: PromptTypeSchema,
content: string(),
})

// --- Aggregated Character (with relations) ---

export const CharacterWithRelationsSchema = object({
...CharacterBaseSchema.entries,
capabilities: array(CharacterCapabilitySchema),
avatarModels: array(AvatarModelSchema),
i18n: array(CharacterI18nSchema),
prompts: array(CharacterPromptSchema),
})

// --- API Request Schemas ---

export const CreateCharacterSchema = object({
character: object({
version: string(),
coverUrl: string(),
characterId: string(),
// creatorId & ownerId are handled by server
}),
capabilities: optional(array(object({
type: CharacterCapabilityTypeSchema,
config: CharacterCapabilityConfigSchema,
}))),
avatarModels: optional(array(object({
name: string(),
type: AvatarModelTypeSchema,
description: string(),
config: AvatarModelConfigSchema,
}))),
i18n: optional(array(object({
language: string(),
name: string(),
description: string(),
tags: array(string()),
}))),
prompts: optional(array(object({
language: string(),
type: PromptTypeSchema,
content: string(),
}))),
})

export const UpdateCharacterSchema = object({
version: optional(string()),
coverUrl: optional(string()),
characterId: optional(string()),
})

// --- Type Exports ---

export type Character = InferOutput<typeof CharacterWithRelationsSchema>
export type CharacterBase = InferOutput<typeof CharacterBaseSchema>
export type CharacterCapability = InferOutput<typeof CharacterCapabilitySchema>
export type AvatarModel = InferOutput<typeof AvatarModelSchema>
export type CharacterI18n = InferOutput<typeof CharacterI18nSchema>
export type CharacterPrompt = InferOutput<typeof CharacterPromptSchema>

export type CreateCharacterPayload = InferOutput<typeof CreateCharacterSchema>
export type UpdateCharacterPayload = InferOutput<typeof UpdateCharacterSchema>

medium

This file duplicates a significant amount of schema and type definitions that also exist on the backend in apps/server/src/api/characters.schema.ts. This code duplication can lead to inconsistencies and maintenance overhead. Consider creating a shared package (e.g., packages/shared-types) within the monorepo to house these common types so they can be imported by both the frontend and backend.
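
A minimal sketch of the shared-package direction this comment suggests; the package name, file paths, and the single schema shown are assumptions for illustration, not existing files in this PR:

```ts
// packages/shared-types/src/characters.ts (hypothetical location)
// The valibot schemas currently duplicated between apps/server/src/api/characters.schema.ts
// and the stage-web copy would live here once, for example:
import { object, optional, string } from 'valibot'

export const UpdateCharacterSchema = object({
  version: optional(string()),
  coverUrl: optional(string()),
  characterId: optional(string()),
})

// Both apps then import the same definition (hypothetical package name):
// import { UpdateCharacterSchema } from '@proj-airi/shared-types'
```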

@github-actions (Contributor)

⏳ Approval required for deploying to Cloudflare Workers (Preview) for stage-web.

Name | Link
🔭 Waiting for approval | For maintainers, approve here

Hey, @nekomeowww, @sumimakito, @luoling8192, @LemonNekoGH, kindly take some time to review and approve this deployment when you are available. Thank you! 🙏

@luoling8192 merged commit 3aff4c3 into main on Dec 29, 2025
7 of 9 checks passed
@luoling8192 deleted the dev/store branch on December 29, 2025 at 16:46