Compare commits
No commits in common. "master" and "pages" have entirely different histories.
.gitignore
@@ -1,53 +0,0 @@
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg

# Virtual environments
.venv/
venv/
ENV/

# IDE
.idea/
.vscode/
*.swp
*.swo

# OS
.DS_Store
Thumbs.db

# Testing
.pytest_cache/
.coverage
htmlcov/
.tox/
.nox/

# mypy
.mypy_cache/

# Local config
.env
*.local.yaml

# Documentation symlink (points to project-docs)
docs
CLAUDE.md
@@ -1,90 +0,0 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

**Live Two-Way Chat** - Real-time conversational AI that simulates natural human conversation, moving beyond forum-style turn-taking.

### Vision

Current chatbots work like forums: wait for input, generate a response, repeat. This project aims for natural conversation:

- **Continuous transcription** - Voice is transcribed in small chunks, not after waiting for silence
- **Predictive responses** - The AI pre-prepares replies, revising them as context arrives
- **Natural interruption** - The AI decides when to speak (interrupt with an important point, or wait for a question)
- **Bidirectional listening** - The AI listens even while speaking and handles interruptions gracefully
- **Shared context window** - A drag-and-drop workspace for images, code, and documents

### Shared Context Window

A visual workspace both human and AI can see and edit:

- Images: displayed and analyzed
- Code: displayed, editable by both
- Split view: multiple files at once
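The shared workspace's data model could be as simple as the following sketch. All names here (`ContextItem`, `ContextWindow`, `drop`) are hypothetical, since the module is only planned; the UI layer on top would be PyQt6.

```python
from dataclasses import dataclass, field


@dataclass
class ContextItem:
    kind: str          # "image" | "code" | "document"
    path: str
    editable: bool = False


@dataclass
class ContextWindow:
    items: list[ContextItem] = field(default_factory=list)

    def drop(self, path: str) -> ContextItem:
        """Register a dropped file, inferring its kind from the extension."""
        ext = path.rsplit(".", 1)[-1].lower()
        if ext in {"png", "jpg", "jpeg", "gif"}:
            kind, editable = "image", False
        elif ext in {"py", "js", "ts", "rs"}:
            kind, editable = "code", True   # code is editable by both parties
        else:
            kind, editable = "document", False
        item = ContextItem(kind, path, editable)
        self.items.append(item)
        return item
```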

## Development Commands

```bash
# Install for development
pip install -e ".[dev]"

# Run tests
pytest

# Run the demo
python -m live_two_way_chat.demo
```

## Architecture

### Components

1. **Streaming ASR** - Real-time speech-to-text (Whisper or similar)
2. **Response Engine** - Predictive response generation with incremental updates
3. **Turn-Taking Model** - Decides when to speak/wait/interrupt
4. **TTS Output** - Text-to-speech with ducking for interruptions
5. **Context Window** - Shared visual workspace (PyQt6)
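A minimal sketch of what the Turn-Taking Model's decision step might look like. Everything here is hypothetical (the module is planned, not implemented): `DialogueState`, the thresholds, and the scoring rule are illustrative stand-ins.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    WAIT = auto()
    SPEAK = auto()
    INTERRUPT = auto()


@dataclass
class DialogueState:
    user_speaking: bool    # is the user currently talking?
    silence_ms: int        # milliseconds since the user last spoke
    response_ready: bool   # has the response engine prepared a reply?
    importance: float      # 0..1 urgency of the prepared reply


def decide(state: DialogueState,
           silence_threshold_ms: int = 700,
           interrupt_threshold: float = 0.9) -> Action:
    """Decide whether the agent should speak, wait, or interrupt."""
    if state.user_speaking:
        # Only cut in for genuinely urgent points.
        if state.response_ready and state.importance >= interrupt_threshold:
            return Action.INTERRUPT
        return Action.WAIT
    # User is silent: take the turn once the pause is long enough.
    if state.response_ready and state.silence_ms >= silence_threshold_ms:
        return Action.SPEAK
    return Action.WAIT
```

A real implementation would replace the fixed thresholds with a learned model, but the interface (dialogue state in, speak/wait/interrupt out) would stay the same shape.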

### Key Modules (planned)

- `src/live_two_way_chat/asr.py` - Streaming speech recognition
- `src/live_two_way_chat/response.py` - Predictive response engine
- `src/live_two_way_chat/turn_taking.py` - Conversation flow control
- `src/live_two_way_chat/tts.py` - Text-to-speech output
- `src/live_two_way_chat/context_window.py` - Shared workspace UI
- `src/live_two_way_chat/main.py` - Application entry point

### Key Paths

- **Source code**: `src/live_two_way_chat/`
- **Tests**: `tests/`
- **Documentation**: `docs/` (symlink to project-docs)

## Technical Challenges

1. Low-latency streaming ASR
2. Incremental response generation (partial responses that update)
3. Turn-taking model (when to speak/wait/interrupt)
4. Context threading during interruptions
5. Audio ducking for simultaneous speech
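Challenge 2 can be pictured as a speculative-generation loop: keep a draft reply for the transcript so far, and rebuild it whenever a new chunk changes the context. This is an illustrative sketch, not the planned engine; `draft_reply` stands in for a streaming LLM call.

```python
def incremental_reply(chunks, draft_reply):
    """Yield an updated draft reply after each new transcript chunk.

    `chunks` is an iterable of transcribed text fragments; `draft_reply`
    maps the transcript-so-far to a candidate reply (a stand-in here for
    a real streaming model call).
    """
    transcript = []
    for chunk in chunks:
        transcript.append(chunk)
        # Speculatively regenerate from the updated context; a real engine
        # would reuse the unchanged prefix instead of starting over.
        yield draft_reply(" ".join(transcript))


# Stand-in "model": echo what was heard so far.
drafts = list(incremental_reply(["hello", "can you", "help me"],
                                lambda t: f"You said: {t}"))
```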

## Documentation

Documentation for this project lives in the centralized docs system:

- **Source**: `~/PycharmProjects/project-docs/docs/projects/live-two-way-chat/`
- **Public URL**: `https://pages.brrd.tech/rob/live-two-way-chat/`

When updating documentation:

1. Edit files in `docs/` (the symlink) or the full path above
2. Use `public: true` frontmatter for public-facing docs
3. Use `<!-- PRIVATE_START -->` / `<!-- PRIVATE_END -->` to hide sections
4. Run `~/PycharmProjects/project-docs/scripts/build-public-docs.sh live-two-way-chat --deploy` to publish
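A doc page that follows conventions 2 and 3 might look like this (illustrative content; only the `public: true` key and the PRIVATE markers come from the workflow above):

```markdown
---
public: true
---

# Overview

Public-facing content goes here.

<!-- PRIVATE_START -->
Internal notes, stripped from the public build.
<!-- PRIVATE_END -->
```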

Do NOT create documentation files directly in this repository.

## Related Projects

- **Ramble** - Voice transcription (could provide the ASR component)
- **Artifact Editor** - Could power the shared context window
README.md
@@ -1,39 +0,0 @@
# Live Two-Way Chat

Real-time conversational AI with natural speech flow.

## Installation

```bash
pip install -e .
```

## Usage

*TODO: Add usage instructions*

## Documentation

Full documentation is available at: https://pages.brrd.tech/rob/live-two-way-chat/

## Development

```bash
# Clone the repository
git clone https://gitea.brrd.tech/rob/live-two-way-chat.git
cd live-two-way-chat

# Create virtual environment
python -m venv .venv
source .venv/bin/activate

# Install for development
pip install -e ".[dev]"

# Run tests
pytest
```

## License

*TODO: Add license*
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@@ -0,0 +1 @@
"use strict";(globalThis.webpackChunkproject_public_docs=globalThis.webpackChunkproject_public_docs||[]).push([[237],{2237(e,t,i){i.r(t),i.d(t,{default:()=>l});i(6540);var o=i(1312),n=i(5500),s=i(1656),r=i(3363),a=i(4848);function l(){const e=(0,o.T)({id:"theme.NotFound.title",message:"Page Not Found"});return(0,a.jsxs)(a.Fragment,{children:[(0,a.jsx)(n.be,{title:e}),(0,a.jsx)(s.A,{children:(0,a.jsx)(r.A,{})})]})}},3363(e,t,i){i.d(t,{A:()=>a});i(6540);var o=i(4164),n=i(1312),s=i(1107),r=i(4848);function a({className:e}){return(0,r.jsx)("main",{className:(0,o.A)("container margin-vert--xl",e),children:(0,r.jsx)("div",{className:"row",children:(0,r.jsxs)("div",{className:"col col--6 col--offset-3",children:[(0,r.jsx)(s.A,{as:"h1",className:"hero__title",children:(0,r.jsx)(n.A,{id:"theme.NotFound.title",description:"The title of the 404 page",children:"Page Not Found"})}),(0,r.jsx)("p",{children:(0,r.jsx)(n.A,{id:"theme.NotFound.p1",description:"The first paragraph of the 404 page",children:"We could not find what you were looking for."})}),(0,r.jsx)("p",{children:(0,r.jsx)(n.A,{id:"theme.NotFound.p2",description:"The 2nd paragraph of the 404 page",children:"Please contact the owner of the site that linked you to the original URL and let them know their link is broken."})})]})})})}}}]);
@@ -0,0 +1 @@
"use strict";(globalThis.webpackChunkproject_public_docs=globalThis.webpackChunkproject_public_docs||[]).push([[647],{7121(e,c,s){s.r(c),s.d(c,{default:()=>t});s(6540);var r=s(4164),u=s(7559),a=s(5500),l=s(2831),o=s(1656),p=s(4848);function t(e){return(0,p.jsx)(a.e3,{className:(0,r.A)(u.G.wrapper.docsPages),children:(0,p.jsx)(o.A,{children:(0,l.v)(e.route.routes)})})}}}]);
@@ -0,0 +1 @@
"use strict";(globalThis.webpackChunkproject_public_docs=globalThis.webpackChunkproject_public_docs||[]).push([[894],{7836(e,t,s){s.r(t),s.d(t,{assets:()=>c,contentTitle:()=>o,default:()=>u,frontMatter:()=>a,metadata:()=>i,toc:()=>r});const i=JSON.parse('{"id":"goals","title":"Goals","description":"Active","source":"@site/docs/goals.md","sourceDirName":".","slug":"/goals","permalink":"/rob/live-two-way-chat/goals","draft":false,"unlisted":false,"tags":[],"version":"current","frontMatter":{"type":"goals","project":"live-two-way-chat","updated":"2026-01-07T00:00:00.000Z"},"sidebar":"docs","previous":{"title":"Todos","permalink":"/rob/live-two-way-chat/todos"},"next":{"title":"Milestones","permalink":"/rob/live-two-way-chat/milestones"}}');var n=s(4848),l=s(8453);const a={type:"goals",project:"live-two-way-chat",updated:new Date("2026-01-07T00:00:00.000Z")},o="Goals",c={},r=[{value:"Active",id:"active",level:2},{value:"Future",id:"future",level:2},{value:"Non-Goals",id:"non-goals",level:2}];function d(e){const t={h1:"h1",h2:"h2",header:"header",input:"input",li:"li",strong:"strong",ul:"ul",...(0,l.R)(),...e.components};return(0,n.jsxs)(n.Fragment,{children:[(0,n.jsx)(t.header,{children:(0,n.jsx)(t.h1,{id:"goals",children:"Goals"})}),"\n",(0,n.jsx)(t.h2,{id:"active",children:"Active"}),"\n",(0,n.jsxs)(t.ul,{className:"contains-task-list",children:["\n",(0,n.jsxs)(t.li,{className:"task-list-item",children:[(0,n.jsx)(t.input,{type:"checkbox",disabled:!0})," ","Enable real-time bidirectional conversation between human and AI #high"]}),"\n",(0,n.jsxs)(t.li,{className:"task-list-item",children:[(0,n.jsx)(t.input,{type:"checkbox",disabled:!0})," ","Support natural interruption and course-correction mid-conversation #high"]}),"\n",(0,n.jsxs)(t.li,{className:"task-list-item",children:[(0,n.jsx)(t.input,{type:"checkbox",disabled:!0})," ","Implement continuous speech transcription with minimal latency 
#high"]}),"\n",(0,n.jsxs)(t.li,{className:"task-list-item",children:[(0,n.jsx)(t.input,{type:"checkbox",disabled:!0})," ","Create a turn-taking model that feels natural, not robotic #high"]}),"\n",(0,n.jsxs)(t.li,{className:"task-list-item",children:[(0,n.jsx)(t.input,{type:"checkbox",disabled:!0})," ","Allow AI to listen even while speaking (bidirectional listening) #medium"]}),"\n",(0,n.jsxs)(t.li,{className:"task-list-item",children:[(0,n.jsx)(t.input,{type:"checkbox",disabled:!0})," ","Build predictive response preparation that adapts as context arrives #medium"]}),"\n"]}),"\n",(0,n.jsx)(t.h2,{id:"future",children:"Future"}),"\n",(0,n.jsxs)(t.ul,{className:"contains-task-list",children:["\n",(0,n.jsxs)(t.li,{className:"task-list-item",children:[(0,n.jsx)(t.input,{type:"checkbox",disabled:!0})," ","Create shared context window for visual collaboration #medium"]}),"\n",(0,n.jsxs)(t.li,{className:"task-list-item",children:[(0,n.jsx)(t.input,{type:"checkbox",disabled:!0})," ","Support drag-and-drop of images, code, and documents #medium"]}),"\n",(0,n.jsxs)(t.li,{className:"task-list-item",children:[(0,n.jsx)(t.input,{type:"checkbox",disabled:!0})," ","Enable AI to view and edit shared files #low"]}),"\n",(0,n.jsxs)(t.li,{className:"task-list-item",children:[(0,n.jsx)(t.input,{type:"checkbox",disabled:!0})," ","Implement split view for multiple simultaneous files #low"]}),"\n",(0,n.jsxs)(t.li,{className:"task-list-item",children:[(0,n.jsx)(t.input,{type:"checkbox",disabled:!0})," ","Integrate with existing tools (Ramble for ASR, Artifact Editor for context) #low"]}),"\n"]}),"\n",(0,n.jsx)(t.h2,{id:"non-goals",children:"Non-Goals"}),"\n",(0,n.jsxs)(t.ul,{children:["\n",(0,n.jsxs)(t.li,{children:[(0,n.jsx)(t.strong,{children:"Full video conferencing"})," - This project focuses on audio conversation, not video"]}),"\n",(0,n.jsxs)(t.li,{children:[(0,n.jsx)(t.strong,{children:"Multi-participant group conversations"})," - Initial scope is 1:1 human-AI 
interaction"]}),"\n",(0,n.jsxs)(t.li,{children:[(0,n.jsx)(t.strong,{children:"Replacing traditional text chat"})," - Complementary to, not replacement for text interfaces"]}),"\n",(0,n.jsxs)(t.li,{children:[(0,n.jsx)(t.strong,{children:"Real-time translation"})," - Language translation is a separate concern"]}),"\n",(0,n.jsxs)(t.li,{children:[(0,n.jsx)(t.strong,{children:"Voice cloning or custom AI voices"})," - Focus on conversation flow, not voice quality"]}),"\n"]})]})}function u(e={}){const{wrapper:t}={...(0,l.R)(),...e.components};return t?(0,n.jsx)(t,{...e,children:(0,n.jsx)(d,{...e})}):d(e)}},8453(e,t,s){s.d(t,{R:()=>a,x:()=>o});var i=s(6540);const n={},l=i.createContext(n);function a(e){const t=i.useContext(l);return i.useMemo(function(){return"function"==typeof e?e(t):{...t,...e}},[t,e])}function o(e){let t;return t=e.disableParentContext?"function"==typeof e.components?e.components(n):e.components||n:a(e.components),i.createElement(l.Provider,{value:t},e.children)}}}]);
@@ -0,0 +1 @@
"use strict";(globalThis.webpackChunkproject_public_docs=globalThis.webpackChunkproject_public_docs||[]).push([[574],{921(e,t,n){n.r(t),n.d(t,{assets:()=>c,contentTitle:()=>a,default:()=>h,frontMatter:()=>o,metadata:()=>r,toc:()=>l});const r=JSON.parse('{"id":"milestones","title":"Milestones","description":"Active","source":"@site/docs/milestones.md","sourceDirName":".","slug":"/milestones","permalink":"/rob/live-two-way-chat/milestones","draft":false,"unlisted":false,"tags":[],"version":"current","frontMatter":{"type":"milestones","project":"live-two-way-chat","updated":"2026-01-07T00:00:00.000Z"},"sidebar":"docs","previous":{"title":"Goals","permalink":"/rob/live-two-way-chat/goals"}}');var i=n(4848),s=n(8453);const o={type:"milestones",project:"live-two-way-chat",updated:new Date("2026-01-07T00:00:00.000Z")},a="Milestones",c={},l=[{value:"Active",id:"active",level:2},{value:"M1: Core Architecture",id:"m1-core-architecture",level:4},{value:"M2: Natural Interaction",id:"m2-natural-interaction",level:4},{value:"Future",id:"future",level:2},{value:"M3: Shared Context",id:"m3-shared-context",level:4},{value:"Completed",id:"completed",level:2}];function d(e){const t={h1:"h1",h2:"h2",h4:"h4",header:"header",hr:"hr",p:"p",strong:"strong",...(0,s.R)(),...e.components};return(0,i.jsxs)(i.Fragment,{children:[(0,i.jsx)(t.header,{children:(0,i.jsx)(t.h1,{id:"milestones",children:"Milestones"})}),"\n",(0,i.jsx)(t.h2,{id:"active",children:"Active"}),"\n",(0,i.jsx)(t.h4,{id:"m1-core-architecture",children:"M1: Core Architecture"}),"\n",(0,i.jsxs)(t.p,{children:[(0,i.jsx)(t.strong,{children:"Target"}),": Q2 2026\n",(0,i.jsx)(t.strong,{children:"Status"}),": Not Started"]}),"\n",(0,i.jsx)(t.p,{children:"Establish the foundational real-time conversation infrastructure including streaming speech recognition, incremental response generation, and the basic turn-taking model for natural dialogue 
flow."}),"\n",(0,i.jsx)(t.hr,{}),"\n",(0,i.jsx)(t.h4,{id:"m2-natural-interaction",children:"M2: Natural Interaction"}),"\n",(0,i.jsxs)(t.p,{children:[(0,i.jsx)(t.strong,{children:"Target"}),": Q3 2026\n",(0,i.jsx)(t.strong,{children:"Status"}),": Not Started"]}),"\n",(0,i.jsx)(t.p,{children:"Enable natural conversational dynamics including interruption handling, context threading for mid-conversation changes, and bidirectional listening so the AI can hear while speaking."}),"\n",(0,i.jsx)(t.hr,{}),"\n",(0,i.jsx)(t.h2,{id:"future",children:"Future"}),"\n",(0,i.jsx)(t.h4,{id:"m3-shared-context",children:"M3: Shared Context"}),"\n",(0,i.jsxs)(t.p,{children:[(0,i.jsx)(t.strong,{children:"Target"}),": Q4 2026\n",(0,i.jsx)(t.strong,{children:"Status"}),": Not Started"]}),"\n",(0,i.jsx)(t.p,{children:"Build the shared visual workspace where users can drag-and-drop images, code, and documents for the AI to see, reference, and edit collaboratively."}),"\n",(0,i.jsx)(t.hr,{}),"\n",(0,i.jsx)(t.h2,{id:"completed",children:"Completed"}),"\n",(0,i.jsx)(t.p,{children:"(No milestones completed yet - project in concept phase)"})]})}function h(e={}){const{wrapper:t}={...(0,s.R)(),...e.components};return t?(0,i.jsx)(t,{...e,children:(0,i.jsx)(d,{...e})}):d(e)}},8453(e,t,n){n.d(t,{R:()=>o,x:()=>a});var r=n(6540);const i={},s=r.createContext(i);function o(e){const t=r.useContext(s);return r.useMemo(function(){return"function"==typeof e?e(t):{...t,...e}},[t,e])}function a(e){let t;return t=e.disableParentContext?"function"==typeof e.components?e.components(i):e.components||i:o(e.components),r.createElement(s.Provider,{value:t},e.children)}}}]);
@@ -0,0 +1 @@
"use strict";(globalThis.webpackChunkproject_public_docs=globalThis.webpackChunkproject_public_docs||[]).push([[98],{1723(n,e,s){s.r(e),s.d(e,{default:()=>d});s(6540);var r=s(5500);function o(n,e){return`docs-${n}-${e}`}var c=s(3025),t=s(2831),i=s(1463),u=s(4848);function l(n){const{version:e}=n;return(0,u.jsxs)(u.Fragment,{children:[(0,u.jsx)(i.A,{version:e.version,tag:o(e.pluginId,e.version)}),(0,u.jsx)(r.be,{children:e.noIndex&&(0,u.jsx)("meta",{name:"robots",content:"noindex, nofollow"})})]})}function a(n){const{version:e,route:s}=n;return(0,u.jsx)(r.e3,{className:e.className,children:(0,u.jsx)(c.n,{version:e,children:(0,t.v)(s.routes)})})}function d(n){return(0,u.jsxs)(u.Fragment,{children:[(0,u.jsx)(l,{...n}),(0,u.jsx)(a,{...n})]})}}}]);
File diff suppressed because one or more lines are too long
@@ -0,0 +1 @@
"use strict";(globalThis.webpackChunkproject_public_docs=globalThis.webpackChunkproject_public_docs||[]).push([[742],{7093(c){c.exports=JSON.parse('{"name":"docusaurus-plugin-content-docs","id":"default"}')}}]);
@@ -0,0 +1 @@
"use strict";(globalThis.webpackChunkproject_public_docs=globalThis.webpackChunkproject_public_docs||[]).push([[600],{2686(e){e.exports=JSON.parse('{"version":{"pluginId":"default","version":"current","label":"Next","banner":null,"badge":false,"noIndex":false,"className":"docs-version-current","isLast":true,"docsSidebars":{"docs":[{"type":"link","href":"/rob/live-two-way-chat/","label":"Live Two-Way Chat","docId":"overview","unlisted":false},{"type":"link","href":"/rob/live-two-way-chat/todos","label":"Todos","docId":"todos","unlisted":false},{"type":"link","href":"/rob/live-two-way-chat/goals","label":"Goals","docId":"goals","unlisted":false},{"type":"link","href":"/rob/live-two-way-chat/milestones","label":"Milestones","docId":"milestones","unlisted":false}]},"docs":{"goals":{"id":"goals","title":"Goals","description":"Active","sidebar":"docs"},"milestones":{"id":"milestones","title":"Milestones","description":"Active","sidebar":"docs"},"overview":{"id":"overview","title":"Live Two-Way Chat","description":"Real-time conversational AI with natural speech flow - moving beyond forum-style turn-taking.","sidebar":"docs"},"todos":{"id":"todos","title":"Todos","description":"High Priority","sidebar":"docs"}}}}')}}]);
File diff suppressed because one or more lines are too long
@@ -0,0 +1,61 @@
/* NProgress, (c) 2013, 2014 Rico Sta. Cruz - http://ricostacruz.com/nprogress
 * @license MIT */

/**
 * @license React
 * react-dom-client.production.js
 *
 * Copyright (c) Meta Platforms, Inc. and affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

/**
 * @license React
 * react-dom.production.js
 *
 * Copyright (c) Meta Platforms, Inc. and affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

/**
 * @license React
 * react-jsx-runtime.production.js
 *
 * Copyright (c) Meta Platforms, Inc. and affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

/**
 * @license React
 * react.production.js
 *
 * Copyright (c) Meta Platforms, Inc. and affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

/**
 * @license React
 * scheduler.production.js
 *
 * Copyright (c) Meta Platforms, Inc. and affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

/** @license React v16.13.1
 * react-is.production.min.js
 *
 * Copyright (c) Facebook, Inc. and its affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */
@@ -0,0 +1 @@
(()=>{"use strict";var e,r,t,a,o,n={},i={};function d(e){var r=i[e];if(void 0!==r)return r.exports;var t=i[e]={id:e,loaded:!1,exports:{}};return n[e].call(t.exports,t,t.exports,d),t.loaded=!0,t.exports}d.m=n,d.c=i,e=[],d.O=(r,t,a,o)=>{if(!t){var n=1/0;for(f=0;f<e.length;f++){for(var[t,a,o]=e[f],i=!0,c=0;c<t.length;c++)(!1&o||n>=o)&&Object.keys(d.O).every(e=>d.O[e](t[c]))?t.splice(c--,1):(i=!1,o<n&&(n=o));if(i){e.splice(f--,1);var l=a();void 0!==l&&(r=l)}}return r}o=o||0;for(var f=e.length;f>0&&e[f-1][2]>o;f--)e[f]=e[f-1];e[f]=[t,a,o]},d.n=e=>{var r=e&&e.__esModule?()=>e.default:()=>e;return d.d(r,{a:r}),r},t=Object.getPrototypeOf?e=>Object.getPrototypeOf(e):e=>e.__proto__,d.t=function(e,a){if(1&a&&(e=this(e)),8&a)return e;if("object"==typeof e&&e){if(4&a&&e.__esModule)return e;if(16&a&&"function"==typeof e.then)return e}var o=Object.create(null);d.r(o);var n={};r=r||[null,t({}),t([]),t(t)];for(var i=2&a&&e;("object"==typeof i||"function"==typeof i)&&!~r.indexOf(i);i=t(i))Object.getOwnPropertyNames(i).forEach(r=>n[r]=()=>e[r]);return n.default=()=>e,d.d(o,n),o},d.d=(e,r)=>{for(var t in r)d.o(r,t)&&!d.o(e,t)&&Object.defineProperty(e,t,{enumerable:!0,get:r[t]})},d.f={},d.e=e=>Promise.all(Object.keys(d.f).reduce((r,t)=>(d.f[t](e,r),r),[])),d.u=e=>"assets/js/"+({48:"a94703ab",98:"a7bd4aaa",393:"1db78e9f",401:"17896441",413:"1db64337",574:"817f7194",600:"d79d0090",647:"5e95c892",742:"aba21aa0",894:"5eebbccf"}[e]||e)+"."+{48:"b8c77466",98:"3ba34601",237:"447ba118",393:"1d6cd5dc",401:"a2525508",413:"9be262b3",574:"7e9989e6",600:"800c3e75",647:"a3b66919",742:"4a552a5c",894:"359c28b9"}[e]+".js",d.miniCssF=e=>{},d.o=(e,r)=>Object.prototype.hasOwnProperty.call(e,r),a={},o="project-public-docs:",d.l=(e,r,t,n)=>{if(a[e])a[e].push(r);else{var i,c;if(void 0!==t)for(var l=document.getElementsByTagName("script"),f=0;f<l.length;f++){var 
u=l[f];if(u.getAttribute("src")==e||u.getAttribute("data-webpack")==o+t){i=u;break}}i||(c=!0,(i=document.createElement("script")).charset="utf-8",d.nc&&i.setAttribute("nonce",d.nc),i.setAttribute("data-webpack",o+t),i.src=e),a[e]=[r];var s=(r,t)=>{i.onerror=i.onload=null,clearTimeout(b);var o=a[e];if(delete a[e],i.parentNode&&i.parentNode.removeChild(i),o&&o.forEach(e=>e(t)),r)return r(t)},b=setTimeout(s.bind(null,void 0,{type:"timeout",target:i}),12e4);i.onerror=s.bind(null,i.onerror),i.onload=s.bind(null,i.onload),c&&document.head.appendChild(i)}},d.r=e=>{"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},d.p="/rob/live-two-way-chat/",d.gca=function(e){return e={17896441:"401",a94703ab:"48",a7bd4aaa:"98","1db78e9f":"393","1db64337":"413","817f7194":"574",d79d0090:"600","5e95c892":"647",aba21aa0:"742","5eebbccf":"894"}[e]||e,d.p+d.u(e)},(()=>{var e={354:0,869:0};d.f.j=(r,t)=>{var a=d.o(e,r)?e[r]:void 0;if(0!==a)if(a)t.push(a[2]);else if(/^(354|869)$/.test(r))e[r]=0;else{var o=new Promise((t,o)=>a=e[r]=[t,o]);t.push(a[2]=o);var n=d.p+d.u(r),i=new Error;d.l(n,t=>{if(d.o(e,r)&&(0!==(a=e[r])&&(e[r]=void 0),a)){var o=t&&("load"===t.type?"missing":t.type),n=t&&t.target&&t.target.src;i.message="Loading chunk "+r+" failed.\n("+o+": "+n+")",i.name="ChunkLoadError",i.type=o,i.request=n,a[1](i)}},"chunk-"+r,r)}},d.O.j=r=>0===e[r];var r=(r,t)=>{var a,o,[n,i,c]=t,l=0;if(n.some(r=>0!==e[r])){for(a in i)d.o(i,a)&&(d.m[a]=i[a]);if(c)var f=c(d)}for(r&&r(t);l<n.length;l++)o=n[l],d.o(e,o)&&e[o]&&e[o][0](),e[o]=0;return d.O(f)},t=globalThis.webpackChunkproject_public_docs=globalThis.webpackChunkproject_public_docs||[];t.forEach(r.bind(null,0)),t.push=r.bind(null,t.push.bind(t))})()})();
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
pyproject.toml
@@ -1,23 +0,0 @@
[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "live-two-way-chat"
version = "0.1.0"
description = "Real-time conversational AI with natural speech flow"
readme = "README.md"
requires-python = ">=3.10"
dependencies = []

[project.optional-dependencies]
dev = [
    "pytest>=7.0",
    "pytest-cov>=4.0",
]

[tool.setuptools.packages.find]
where = ["src"]

[tool.pytest.ini_options]
testpaths = ["tests"]
@@ -0,0 +1 @@
<?xml version="1.0" encoding="UTF-8"?><urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:news="http://www.google.com/schemas/sitemap-news/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml" xmlns:image="http://www.google.com/schemas/sitemap-image/1.1" xmlns:video="http://www.google.com/schemas/sitemap-video/1.1"><url><loc>https://pages.brrd.tech/rob/live-two-way-chat/goals</loc><changefreq>weekly</changefreq><priority>0.5</priority></url><url><loc>https://pages.brrd.tech/rob/live-two-way-chat/milestones</loc><changefreq>weekly</changefreq><priority>0.5</priority></url><url><loc>https://pages.brrd.tech/rob/live-two-way-chat/todos</loc><changefreq>weekly</changefreq><priority>0.5</priority></url><url><loc>https://pages.brrd.tech/rob/live-two-way-chat/</loc><changefreq>weekly</changefreq><priority>0.5</priority></url></urlset>
File diff suppressed because one or more lines are too long