feat: support upload client log #8738
Walkthrough

This PR adds end-to-end log collection and upload capability. It introduces new IPC channels, desktop and native APIs for gathering log digests and uploading bundles, a service-layer token request flow, event bus events for progress tracking, and a dialog UI with retry logic and real-time status updates.
Sequence Diagram(s)

```mermaid
sequenceDiagram
  actor User
  participant UI as Log Upload Dialog
  participant Logic as Export Logic
  participant Service as ServiceLogger
  participant API as Desktop API
  participant Server as Upload Server
  User->>UI: Trigger upload
  UI->>Logic: collectLogDigest()
  Logic->>API: Gather logs, zip, hash
  API-->>Logic: digest metadata
  UI->>Service: requestUploadToken(digest)
  Service-->>UI: uploadToken + expiresAt
  UI->>Logic: uploadLogBundle(token, digest)
  Logic->>API: Stream file to server
  API->>Server: POST with progress
  Server-->>API: Response
  API-->>Logic: Result
  Logic-->>UI: Success/Error
  rect rgb(200, 220, 255)
    Note over UI: Emit progress events via<br/>ClientLogUploadProgress
  end
  alt Upload succeeds
    UI->>User: Show success + instance ID
  else Upload fails
    rect rgb(255, 200, 200)
      UI->>UI: Retry with backoff
      Note over UI: Fallback to export<br/>on final failure
    end
    UI->>User: Show error + option to export
  end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes

The PR spans multiple layers (IPC, desktop API, service, event bus, UI) with platform-specific implementations. While individual changes follow consistent patterns, reviewers must verify logic density in the dialog retry flow, error propagation across layers, type alignment across the event bus and logger modules, and platform-specific upload handling (Expo vs. fallback fetch). The coordination between service, API, and UI layers adds integration complexity.

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
✨ Finishing touches
🧪 Generate unit tests (beta)
686f9e6 to 2cd5a1d
Actionable comments posted: 15
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
Cache: Disabled due to data retention organization setting
Knowledge base: Disabled due to data retention organization setting
Disabled knowledge base sources:
- Jira integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (14)

- apps/desktop/app/config.ts (1 hunks)
- apps/desktop/app/preload.ts (1 hunks)
- packages/kit-bg/src/desktopApis/DesktopApiDev.ts (2 hunks)
- packages/kit-bg/src/services/ServiceLogger.ts (2 hunks)
- packages/kit/src/views/Developer/pages/Gallery/Components/stories/Logger.tsx (5 hunks)
- packages/kit/src/views/Setting/pages/Tab/config.tsx (3 hunks)
- packages/kit/src/views/Setting/pages/Tab/exportLogs/index.desktop.ts (1 hunks)
- packages/kit/src/views/Setting/pages/Tab/exportLogs/index.native.ts (2 hunks)
- packages/kit/src/views/Setting/pages/Tab/exportLogs/index.ts (1 hunks)
- packages/kit/src/views/Setting/pages/Tab/exportLogs/showExportLogsDialog.tsx (1 hunks)
- packages/kit/src/views/Setting/pages/Tab/exportLogs/utils.ts (1 hunks)
- packages/shared/src/eventBus/appEventBus.ts (2 hunks)
- packages/shared/src/eventBus/appEventBusNames.ts (1 hunks)
- packages/shared/src/logger/types.ts (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: lint (20.x)
- GitHub Check: unittest (20.x)
- GitHub Check: Analyze (javascript-typescript)
🔇 Additional comments (15)
packages/shared/src/eventBus/appEventBusNames.ts (1)

Lines 121-121: LGTM! Event name is clear and consistent. The new event name follows the existing naming pattern and is placed appropriately at the end of the enum.
packages/kit-bg/src/services/ServiceLogger.ts (2)

Lines 5-8: LGTM! Imports support the new token request method. The new imports are necessary for error handling, endpoint configuration, and type safety.

Lines 38-48: LGTM! Token request implementation looks solid. The method validates input, fetches a wallet client, and returns the token response. Error handling for empty bundles is appropriate.
packages/shared/src/logger/types.ts (1)

Lines 68-101: LGTM! Type definitions are clear and well-structured. The discriminated union for `ILogBundle` provides type safety, and the enum covers all upload stages. The types align well with the upload workflow.

packages/kit-bg/src/desktopApis/DesktopApiDev.ts (1)
Lines 85-217: LGTM! Upload implementation with progress tracking is comprehensive. The method handles upload with real-time progress updates, detailed logging, and proper error handling. The type cast at line 174 is acceptable given node-fetch's stream handling requirements.
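The progress arithmetic this comment describes (percent of a known-size body, clamped) can be sketched as a tiny pure helper. This is a hypothetical illustration of the math, not the actual `DesktopApiDev` code, which streams via node-fetch:

```typescript
// Hypothetical sketch of upload-progress math for a known-size body.
// Assumes the caller tracks bytes written to the request stream.
function progressPercent(sentBytes: number, totalBytes: number): number {
  if (totalBytes <= 0) return 0; // guard against empty or unknown sizes
  return Math.min(100, Math.round((sentBytes / totalBytes) * 100));
}
```

Clamping matters because chunked writes can momentarily report more bytes than the declared content length.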
apps/desktop/app/preload.ts (1)

Lines 132-132: LGTM! IPC channel registration is correct. Adding the channel to `validChannels` enables the preload script to listen for upload progress events.

apps/desktop/app/config.ts (1)
Lines 44-44: LGTM! IPC key follows naming conventions. The key name and value are consistent with other IPC message keys in the configuration.
packages/kit/src/views/Setting/pages/Tab/exportLogs/utils.ts (1)

Lines 1-2: LGTM! Filename generation is simple and effective. The function creates a timestamped filename that's filesystem-friendly. The resulting format (e.g., "OneKeyLogs-20251022T171925123Z") is clear and sortable.
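A helper with that behavior might look like the following sketch (the function name and default prefix are assumptions for illustration, not the actual util):

```typescript
// Hypothetical sketch: timestamped, filesystem-friendly log file base name.
// Produces names shaped like "OneKeyLogs-20251022T171925123Z".
function buildLogFileBaseName(prefix = 'OneKeyLogs'): string {
  // Strip '-', ':' and '.' from the ISO timestamp so it is safe in filenames
  // and lexicographic order matches chronological order.
  const stamp = new Date().toISOString().replace(/[-:.]/g, '');
  return `${prefix}-${stamp}`;
}
```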
packages/shared/src/eventBus/appEventBus.ts (2)

Lines 18-18: LGTM! Import supports the new event payload. The type import is necessary for the ClientLogUploadProgress payload definition.

Lines 390-395: LGTM! Event payload structure is well-designed. The payload includes required stage tracking and optional fields for progress percentage, retry count, and error messages. This provides flexibility across different upload stages.
packages/kit/src/views/Setting/pages/Tab/config.tsx (1)

Lines 716-723: Nice simplification. Cleaner entry point. No issues from me.
packages/kit/src/views/Setting/pages/Tab/exportLogs/index.native.ts (1)

Lines 189-209: Confirm the server accepts the multipart fallback. The desktop path posts raw binary; the native fallback uses multipart/form-data with "file". Verify the backend supports both.

Also applies to: 224-231
packages/kit/src/views/Setting/pages/Tab/exportLogs/index.desktop.ts (1)

Lines 77-81: No action needed. The bridge is safe as implemented: node-fetch lowercases all headers before sending them, so regardless of the casing used in the code at lines 77-81, the HTTP request will standardize them.
Current setup:
- Line 79 sets lowercase `authorization`
- Line 96 checks for lowercase `content-length` (no duplicate risk)
- HTTP/2 header names must be lowercase, so the normalization aligns with modern standards
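The lowercasing behavior relied on here can be sketched as a simple normalization pass — a hypothetical illustration of what node-fetch is assumed to do internally, not the project's code:

```typescript
// Hypothetical sketch of header-key normalization to lowercase,
// mirroring the behavior the review attributes to node-fetch.
function normalizeHeaderKeys(
  headers: Record<string, string>,
): Record<string, string> {
  const normalized: Record<string, string> = {};
  for (const [key, value] of Object.entries(headers)) {
    normalized[key.toLowerCase()] = value; // last write wins on case collisions
  }
  return normalized;
}
```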
The bridge passes headers as a plain object to `fetch()` (line 171), allowing the HTTP client to handle normalization transparently. No header casing issues exist here.

packages/kit/src/views/Setting/pages/Tab/exportLogs/index.ts (2)
Lines 34-34: Clarify the 1-second wait. Why wait 1 second here? If it's for device info logging to complete, add a comment; if unnecessary, remove it.
Lines 74-135: LGTM! Solid implementation with:
- Comprehensive input validation
- Progress tracking via event bus
- Proper error handling with event emission
- Authorization header for secure upload

The rejection of empty logs (line 84) is appropriate.
```ts
async collectLoggerDigest(params: { fileBaseName: string }): Promise<{
  filePath: string;
  fileName: string;
  mimeType: string;
  sizeBytes: number;
  sha256: string;
}> {
  if (!params.fileBaseName) {
    throw new OneKeyLocalError('fileBaseName is required');
  }
  const baseName = params.fileBaseName;
  const logFilePath = logger.transports.file.getFile().path;
  const logDir = path.dirname(logFilePath);
  const logFiles = await fsPromises.readdir(logDir);

  const zipName = `${baseName}.zip`;
  const tempDir = path.join(app.getPath('temp'), '@onekeyhq-desktop-logs');
  await fsPromises.mkdir(tempDir, { recursive: true });
  const zipPath = path.join(tempDir, zipName);

  const zip = new AdmZip();
  logFiles
    .filter((fileName) => fileName.endsWith('.log'))
    .forEach((fileName) => {
      zip.addLocalFile(path.join(logDir, fileName), '', fileName);
    });
  zip.writeZip(zipPath);

  const fileBuffer = await fsPromises.readFile(zipPath);
  const sizeBytes = fileBuffer.length;
  const sha256Hex = createHash('sha256').update(fileBuffer).digest('hex');

  return {
    filePath: zipPath,
    fileName: zipName,
    mimeType: 'application/zip',
    sizeBytes,
    sha256: sha256Hex,
  };
}
```
🧹 Nitpick | 🔵 Trivial

Consider validating non-empty log collection. If no `.log` files exist in the directory, the method creates an empty zip. Consider checking that at least one log file was added before returning, or throwing a more specific error.

Example validation after the filter-and-add loop:

```diff
     zip.addLocalFile(path.join(logDir, fileName), '', fileName);
   });
+if (zip.getEntries().length === 0) {
+  throw new OneKeyLocalError('No log files found to collect');
+}
 zip.writeZip(zipPath);
```
```ts
useEffect(() => {
  const handler = ({ stage, progressPercent, message }: any) => {
    console.log(
      '[LoggerDemo][upload-progress]',
      stage,
      progressPercent,
      message,
    );
  };
  appEventBus.on(EAppEventBusNames.ClientLogUploadProgress, handler);
  return () => {
    appEventBus.off(EAppEventBusNames.ClientLogUploadProgress, handler);
  };
}, []);
```
🧹 Nitpick | 🔵 Trivial

Type the progress event payload. Replace `any` with a typed payload for clarity and safety.

Apply this diff:

```diff
+import type { ELogUploadStage } from '@onekeyhq/shared/src/logger/types';
+
+type UploadProgressPayload = {
+  stage: ELogUploadStage;
+  progressPercent?: number;
+  message?: string;
+};
 ...
-  useEffect(() => {
-    const handler = ({ stage, progressPercent, message }: any) => {
+  useEffect(() => {
+    const handler = ({ stage, progressPercent, message }: UploadProgressPayload) => {
```
🤖 Prompt for AI Agents
In packages/kit/src/views/Developer/pages/Gallery/Components/stories/Logger.tsx
around lines 27 to 40, the handler currently types its parameter as any; replace
it with a concrete payload type (either import the existing event payload type
for EAppEventBusNames.ClientLogUploadProgress or declare a local interface) that
at minimum types stage (string or the specific enum), progressPercent (number)
and message (optional string). Update the handler signature to use that type,
adjust any usages accordingly, and add the necessary import if pulling the event
payload type from the event-bus types.
```ts
const uploadLog = useCallback(async () => {
  const digest = await collectLogDigest('onekey_logs');
  console.log('Log Digest:', digest);
  const token = await backgroundApiProxy.serviceLogger.requestUploadToken({
    sizeBytes: digest.sizeBytes,
    sha256: digest.sha256,
  });
  console.log('Upload token:', token);

  const res = await uploadLogBundle({
    uploadToken: token.uploadToken,
    digest,
  });
  console.log('Upload result:', res);
}, []);
```
🧹 Nitpick | 🔵 Trivial

Harden the upload flow with error handling. Wrap it in try/catch and surface failures (Toast or alert) to prevent silent failures in demos.

Apply this diff:

```diff
-  const uploadLog = useCallback(async () => {
-    const digest = await collectLogDigest('onekey_logs');
-    console.log('Log Digest:', digest);
-    const token = await backgroundApiProxy.serviceLogger.requestUploadToken({
-      sizeBytes: digest.sizeBytes,
-      sha256: digest.sha256,
-    });
-    console.log('Upload token:', token);
-
-    const res = await uploadLogBundle({
-      uploadToken: token.uploadToken,
-      digest,
-    });
-    console.log('Upload result:', res);
-  }, []);
+  const uploadLog = useCallback(async () => {
+    try {
+      const digest = await collectLogDigest('onekey_logs');
+      const token = await backgroundApiProxy.serviceLogger.requestUploadToken({
+        sizeBytes: digest.sizeBytes,
+        sha256: digest.sha256,
+      });
+      const res = await uploadLogBundle({
+        uploadToken: token.uploadToken,
+        digest,
+      });
+      console.log('Upload result:', res);
+    } catch (e) {
+      console.error('Upload log failed:', e);
+    }
+  }, []);
```
🤖 Prompt for AI Agents
In packages/kit/src/views/Developer/pages/Gallery/Components/stories/Logger.tsx
around lines 46 to 60, the uploadLog useCallback lacks error handling and can
fail silently; wrap the async body in a try/catch, log errors to console, and
surface failures to the user by showing a Toast or alert with a clear message
(e.g., "Upload failed: <error message>"); ensure you still await the same calls
(collectLogDigest, requestUploadToken, uploadLogBundle) inside try, and in catch
show the Toast/alert and optionally rethrow or return so callers know the
operation failed.
```ts
const digest = await collectLogDigest('onekey_logs');
console.log('Log Digest:', digest);
```
Avoid logging the full digest. The digest may include file paths and metadata. Log only high-level info, or remove the log entirely.

Apply this diff:

```diff
-    console.log('Log Digest:', digest);
+    // console.log('Log Digest collected')
```
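If some logging is still wanted, one option is a redacted summary. `summarizeDigest` below is a hypothetical helper, and the digest shape is assumed from the surrounding review:

```typescript
// Hypothetical digest shape and a redacted summary safe for console logs.
type LogDigestLike = {
  sizeBytes: number;
  sha256: string;
  filePath?: string; // potentially sensitive; deliberately never logged
};

function summarizeDigest(digest: LogDigestLike): string {
  // Only the size and a short hash prefix: enough to correlate, nothing more.
  return `logDigest size=${digest.sizeBytes}B sha256=${digest.sha256.slice(0, 8)}`;
}
```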
🤖 Prompt for AI Agents
In packages/kit/src/views/Developer/pages/Gallery/Components/stories/Logger.tsx
around lines 47 to 48, the code currently logs the full digest (which may
contain file paths and sensitive metadata); instead, remove the full console.log
or replace it with logging only high‑level info (e.g., count of entries,
earliest/latest timestamp, or a redacted summary). Update the code to compute a
minimal summary from digest and log that, or simply omit logging the digest
entirely.
```ts
const token = await backgroundApiProxy.serviceLogger.requestUploadToken({
  sizeBytes: digest.sizeBytes,
  sha256: digest.sha256,
});
console.log('Upload token:', token);
```
Do not log secrets (upload token). Printing the upload token to the console leaks credentials. Remove this log.

Apply this diff:

```diff
     const token = await backgroundApiProxy.serviceLogger.requestUploadToken({
       sizeBytes: digest.sizeBytes,
       sha256: digest.sha256,
     });
-    console.log('Upload token:', token);
+    // avoid logging tokens or secrets
```
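If some trace is still needed for debugging, a masked indicator avoids emitting the credential. `maskSecret` is a hypothetical helper, not part of the codebase:

```typescript
// Hypothetical helper: show only a short prefix of a secret in logs.
function maskSecret(secret: string, visible = 4): string {
  if (secret.length <= visible) return '***'; // too short to reveal anything
  return `${secret.slice(0, visible)}***`;
}
```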
🤖 Prompt for AI Agents
In packages/kit/src/views/Developer/pages/Gallery/Components/stories/Logger.tsx
around lines 49 to 54, the code prints the upload token to the console which
leaks a secret; remove the console.log('Upload token:', token) line (or replace
it with a non-secret alternative such as logging a success message or a
masked/boolean indicator) so the actual token value is never emitted to logs or
console.
```ts
const fileBuffer = await RNFS.readFile(normalizedPath, 'base64');
const bytes = Buffer.from(fileBuffer, 'base64');
const hashBytes = await appCrypto.hash.sha256(bufferUtils.toBuffer(bytes));
const sha256 = bufferUtils.bytesToHex(hashBytes);
```
Avoid relying on the global Buffer in React Native. RN may not polyfill `Buffer`. Import it explicitly or avoid `Buffer` entirely.

Apply this diff:

```diff
+import { Buffer } from 'buffer';
 ...
-  const fileBuffer = await RNFS.readFile(normalizedPath, 'base64');
-  const bytes = Buffer.from(fileBuffer, 'base64');
-  const hashBytes = await appCrypto.hash.sha256(bufferUtils.toBuffer(bytes));
+  const fileBase64 = await RNFS.readFile(normalizedPath, 'base64');
+  const bytes = Buffer.from(fileBase64, 'base64');
+  const hashBytes = await appCrypto.hash.sha256(bufferUtils.toBuffer(bytes));
```

If `buffer` isn't in deps, add it. Alternatively, switch to a base64-to-bytes helper in `bufferUtils` if available.
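Avoiding `Buffer` entirely is also feasible: a dependency-free base64 decoder is small. This is a hedged sketch; check first whether `bufferUtils` already ships such a helper:

```typescript
// Hypothetical Buffer-free base64 decoder (standard alphabet, strips padding).
const B64_ALPHABET =
  'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';

function base64ToBytes(b64: string): Uint8Array {
  const clean = b64.replace(/=+$/, '');
  const bytes: number[] = [];
  let acc = 0;
  let bits = 0;
  for (const ch of clean) {
    acc = (acc << 6) | B64_ALPHABET.indexOf(ch); // shift in 6 bits per char
    bits += 6;
    if (bits >= 8) {
      bits -= 8;
      bytes.push((acc >> bits) & 0xff); // emit the top complete byte
    }
  }
  return Uint8Array.from(bytes);
}
```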
🤖 Prompt for AI Agents
In packages/kit/src/views/Setting/pages/Tab/exportLogs/index.native.ts around
lines 76-80, the code uses the global Buffer which may not exist in React
Native; replace the global Buffer usage by either (A) importing Buffer
explicitly from the 'buffer' package and ensuring 'buffer' is added to
dependencies, then use Buffer.from(fileBuffer, 'base64'), or (B) avoid Buffer
completely by converting base64 to bytes via an existing helper in bufferUtils
(e.g., bufferUtils.base64ToBytes or a similar function) and pass that result to
appCrypto.hash.sha256; ensure imports are updated accordingly and only one
approach is used project-wide.
```ts
export const collectLogDigest = async (
  fileBaseName?: string,
): Promise<ILogDigest> => {
  appEventBus.emit(EAppEventBusNames.ClientLogUploadProgress, {
    stage: ELogUploadStage.Collecting,
    progressPercent: 0,
  });
  const baseName = fileBaseName ?? buildDefaultFileBaseName();
  defaultLogger.setting.device.logDeviceInfo();
  await waitAsync(1000);
  const messages = await backgroundApiProxy.serviceLogger.getAllMsg();
  const content = messages.join('');
  const blob = new Blob(messages, {
    type: LOG_MIME_TYPE,
    endings: 'native',
  });
  const arrayBuffer = await blob.arrayBuffer();
  const byteBuffer = bufferUtils.toBuffer(new Uint8Array(arrayBuffer));
  const sizeBytes = byteBuffer.length;
  const hashHex =
    sizeBytes > 0
      ? bufferUtils.bytesToHex(await appCrypto.hash.sha256(byteBuffer))
      : EMPTY_SHA256_HEX;
  return {
    sizeBytes,
    sha256: hashHex,
    bundle: {
      type: 'text',
      fileName: `${baseName}.${LOG_FILE_EXTENSION}`,
      mimeType: LOG_MIME_TYPE,
      blob,
      content,
    },
  };
};
```
Add error handling and a completion event. The function emits a progress event at the start but lacks:
- a completion event when collection finishes (before returning)
- error handling to emit `ELogUploadStage.Error` if collection fails

This is inconsistent with `uploadLogBundle`, which has comprehensive error handling.

Apply this diff to add error handling:

```diff
 export const collectLogDigest = async (
   fileBaseName?: string,
 ): Promise<ILogDigest> => {
+  try {
     appEventBus.emit(EAppEventBusNames.ClientLogUploadProgress, {
       stage: ELogUploadStage.Collecting,
       progressPercent: 0,
     });
     const baseName = fileBaseName ?? buildDefaultFileBaseName();
     defaultLogger.setting.device.logDeviceInfo();
     await waitAsync(1000);
     const messages = await backgroundApiProxy.serviceLogger.getAllMsg();
     const content = messages.join('');
     const blob = new Blob(messages, {
       type: LOG_MIME_TYPE,
       endings: 'native',
     });
     const arrayBuffer = await blob.arrayBuffer();
     const byteBuffer = bufferUtils.toBuffer(new Uint8Array(arrayBuffer));
     const sizeBytes = byteBuffer.length;
     const hashHex =
       sizeBytes > 0
         ? bufferUtils.bytesToHex(await appCrypto.hash.sha256(byteBuffer))
         : EMPTY_SHA256_HEX;
-    return {
+    const digest = {
       sizeBytes,
       sha256: hashHex,
       bundle: {
         type: 'text',
         fileName: `${baseName}.${LOG_FILE_EXTENSION}`,
         mimeType: LOG_MIME_TYPE,
         blob,
         content,
       },
     };
+    appEventBus.emit(EAppEventBusNames.ClientLogUploadProgress, {
+      stage: ELogUploadStage.Collecting,
+      progressPercent: 100,
+    });
+    return digest;
+  } catch (error) {
+    appEventBus.emit(EAppEventBusNames.ClientLogUploadProgress, {
+      stage: ELogUploadStage.Error,
+      message: error instanceof Error ? error.message : String(error),
+    });
+    throw error;
+  }
 };
```
```ts
export const exportLogs = async (filename?: string) => {
  const digest = await collectLogDigest(filename);
  if (digest.bundle.type !== 'text') {
    throw new OneKeyLocalError('Cannot export non-text log bundle');
  }
  const element = document.createElement('a');
  element.href = URL.createObjectURL(digest.bundle.blob);
  element.download = digest.bundle.fileName;
  document.body.appendChild(element); // Required for this to work in FireFox
  element.click();
  element.remove();
};
```
Fix the memory leak from the object URL. `URL.createObjectURL()` at line 67 creates a URL that remains in memory until revoked. Revoke it after the download.

Apply this diff:

```diff
 export const exportLogs = async (filename?: string) => {
   const digest = await collectLogDigest(filename);
   if (digest.bundle.type !== 'text') {
     throw new OneKeyLocalError('Cannot export non-text log bundle');
   }
   const element = document.createElement('a');
-  element.href = URL.createObjectURL(digest.bundle.blob);
+  const url = URL.createObjectURL(digest.bundle.blob);
+  element.href = url;
   element.download = digest.bundle.fileName;
   document.body.appendChild(element); // Required for this to work in FireFox
   element.click();
   element.remove();
+  URL.revokeObjectURL(url);
 };
```
🤖 Prompt for AI Agents
In packages/kit/src/views/Setting/pages/Tab/exportLogs/index.ts around lines 61
to 72, revoke the object URL created by URL.createObjectURL to avoid a memory
leak: capture the URL returned by createObjectURL in a local variable, use that
variable for element.href, and call URL.revokeObjectURL(url) after triggering
the download (e.g., immediately after element.click(), or inside a short
setTimeout) so the blob URL is released from memory.
```ts
const MAX_RETRIES = 3;
const TOTAL_ATTEMPTS = MAX_RETRIES;
```
Fix the off-by-one in the retry attempt display. Include the initial try in the total and stop subtracting 1. This keeps the UI honest.

Apply this diff:

```diff
-const MAX_RETRIES = 3;
-const TOTAL_ATTEMPTS = MAX_RETRIES;
+const MAX_RETRIES = 3;
+const TOTAL_ATTEMPTS = MAX_RETRIES + 1; // initial attempt + retries
 ...
   onFailedAttempt: (error: FailedAttemptError) => {
     const originalError = resolveError(error);
     const message =
       originalError.message ||
-      `Log upload failed (attempt ${
-        error.attemptNumber - 1
-      }/${TOTAL_ATTEMPTS}). Retrying...`;
-    setCurrentAttempt(error.attemptNumber - 1);
+      `Log upload failed (attempt ${error.attemptNumber}/${TOTAL_ATTEMPTS}). Retrying...`;
+    setCurrentAttempt(error.attemptNumber);
     setStage(ELogUploadStage.Error);
     setErrorMessage(message);
   },
```

Also applies to: 134-144
🤖 Prompt for AI Agents
In packages/kit/src/views/Setting/pages/Tab/exportLogs/showExportLogsDialog.tsx
around lines 23-25 and also review lines 134-144, the UI currently shows an
off-by-one in retry attempt counts by excluding the initial try; change
TOTAL_ATTEMPTS to include the initial attempt (set TOTAL_ATTEMPTS = MAX_RETRIES
+ 1) and update any places in lines 134-144 that compute/display attempts or
remaining tries to stop subtracting 1 (remove the "- 1" adjustments) so the
displayed totals and remaining attempts reflect the true number of attempts.
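Under the assumption (matching p-retry's documented behavior) that `attemptNumber` counts the initial try as 1, the corrected display math can be sketched as:

```typescript
// Sketch of the corrected attempt accounting, assuming p-retry semantics
// where attemptNumber === 1 for the initial (non-retry) attempt.
const MAX_RETRIES = 3;
const TOTAL_ATTEMPTS = MAX_RETRIES + 1; // initial attempt + retries

function formatAttempt(attemptNumber: number): string {
  return `attempt ${attemptNumber}/${TOTAL_ATTEMPTS}`;
}
```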
```ts
  title: 'Logs uploaded successfully',
});
```
🧹 Nitpick | 🔵 Trivial
Localize hardcoded strings.
Use intl messages for: “Logs uploaded successfully”, “Copy Instance ID”, and progress labels.
Also applies to: 217-219, 195-199
🤖 Prompt for AI Agents
packages/kit/src/views/Setting/pages/Tab/exportLogs/showExportLogsDialog.tsx
around lines 154-155 (and also update lines 195-199 and 217-219): currently
several UI strings are hardcoded ("Logs uploaded successfully", "Copy Instance
ID", and progress labels). Replace these with intl message lookups (e.g., use
props.intl.formatMessage or the project's useIntl hook/messages pattern) and add
corresponding keys to the component's message definitions; update the
notification title, the copy button label, and all progress label usages to use
the intl messages, passing any dynamic values as message variables. Ensure new
message keys are added to the relevant messages file or the component's messages
object and used consistently across the three line ranges.
Summary by CodeRabbit
Release Notes
New Features