PeakeBot - Python AI
🧠 PeakeBot: The Memory-Powered AI Now Trading and Learning Live

🧬 Introducing PeakeBot
PeakeBot is an on-chain, memory-augmented AI powered by a local neural language model and running autonomously from my Raspberry Pi cluster.
It:
- 💬 Learns from every prompt and response
- 🧠 Remembers past conversations across time and categories
- 🔄 Syncs entries to GeoCities at ftp.geocities.ws/peakecoin/peakebot
- 🤖 Trades multiple Hive Engine tokens using dynamic market strategies
- 🪹 Posts directly to Hive using `beem` via `peake.matic`
What Makes PeakeBot Unique?
"An AI that remembers what you asked it yesterday. And trades for you tomorrow."
♻️ Memory, Web & Neural Contexts
PeakeBot doesn't just generate responses in isolation; it pulls:
- 🧠 Recent memory logs (`peakebot_memory.json` + FTP history)
- 🌐 Web context using DuckDuckGo search if a prompt asks for real-time data
- 💡 Internal model context, generated on-device via `NeuralLanguageModel`
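The way these sources get merged into one prompt prefix can be sketched standalone (the `build_context` name is mine for illustration; the full listing below does this inline inside `generate_response`):

```python
def build_context(memory_snippets, web_info):
    """Combine memory and web context into a single prompt prefix,
    mirroring how PeakeBot assembles context before calling the model."""
    full_context = ""
    if memory_snippets:
        full_context += "From my memory: " + "\n".join(memory_snippets) + "\n"
    if web_info:
        full_context += "From the web: " + web_info + "\n"
    return full_context

# Only the sources that actually returned something end up in the prefix
prefix = build_context(["RC was low yesterday"], None)
```

Sources that come back empty simply drop out, so the model never sees a dangling "From the web:" header.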
💾 Everything Gets Stored to FTP
All interactions are:
- Categorized by the first 3 keywords in your prompt
- Saved to `/peakebot/[category]/entry-[timestamp].json`
- Included in a public-facing HTML journal at: https://geocities.ws/peakecoin/peakebot/index.html
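Concretely, the first three words of a prompt become the category folder. A minimal sketch of that mapping (`entry_path` is an illustrative helper name, not necessarily the one in the codebase):

```python
import re
from datetime import datetime

def categorize_prompt(prompt):
    # First three keywords of the prompt become the category folder name
    keywords = re.findall(r"\b\w+\b", prompt.lower())
    return "_".join(keywords[:3]) if keywords else "general"

def entry_path(prompt, when):
    # Mirrors the /peakebot/[category]/entry-[timestamp].json layout
    stamp = when.strftime("%Y%m%d-%H%M%S")
    return f"/peakebot/{categorize_prompt(prompt)}/entry-{stamp}.json"

print(entry_path("How do I trade PEK?", datetime(2025, 1, 2, 3, 4, 5)))
# /peakebot/how_do_i/entry-20250102-030405.json
```

An empty prompt falls back to a `general` category rather than producing an empty folder name.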
Want to teach it something new?
You: train
🔁 Training mode ON. Type `hello | hi`
You: I'm losing trades | Stop buying when RC is low
✅ Learned: 'I'm losing trades' → 'Stop buying when RC is low'
📍 Where It Lives
- Journal: https://geocities.ws/peakecoin/peakebot/index.html
- 🧠 Memory: FTP-synced under `/peakebot`
- 🧑‍💻 Codebase: `language_model.py`, `peakebot_memory.json`, `dashboard.py`, and the `uni_xxx.py` trading stack
- 🪹 Hive Posting: `peake.matic` & `peakecoin.*` sub-accounts
🔥 Next Up
We're integrating:
- 🔐 Encrypted RC-aware failsafes
- ♻️ Feedback-loop retraining from trade performance
- 🧑‍🏫 Collaborative memory sharing between bots
- 🧱 Optional Hive-based memory via `custom_json` posts
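The on-chain memory idea could look roughly like this. This is a hypothetical sketch, assuming a beem `Hive` instance authorized with the posting key; the `peakebot_memory` operation id and both helper names are illustrative choices, not the project's actual design:

```python
from datetime import datetime

def memory_payload(prompt, response):
    # Same shape as the FTP-stored journal entries
    return {
        "timestamp": datetime.now().isoformat(),
        "prompt": prompt,
        "response": response,
    }

def broadcast_memory(hive, prompt, response):
    # beem's Hive instance exposes custom_json() for broadcasting
    # arbitrary JSON operations signed with the posting key
    return hive.custom_json(
        "peakebot_memory",            # illustrative operation id
        memory_payload(prompt, response),
        required_posting_auths=["peake.matic"],
    )
```

Because `custom_json` ops are public and permanent, any bot could replay another bot's memory stream from the chain, which is what would make the collaborative-memory item above feasible.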
📣 Get Involved
Want to help test or train PeakeBot? Just drop a prompt at:
https://geocities.ws/peakecoin/peakebot/index.html
Or tag @peake.matic and @paulmoon410 on Hive!
```python
import json
import os
import re
from datetime import datetime
from ftplib import FTP

import requests
from beem import Hive

from language_model import NeuralLanguageModel

# Load the language model
model = NeuralLanguageModel()
model.load_model("language_model.pkl")

HISTORY_FILE = "peakebot_memory.json"
KEY_FILE = "hive_keys.json"
FTP_HOST = "ftp.yoursite.com"
FTP_USER = "peakecoin"
FTP_PASS = "password"
FTP_BASE_DIR = "/peakebot"


# Initialize Hive with the posting key
def init_hive():
    if not os.path.exists(KEY_FILE):
        raise Exception("Missing hive_keys.json with posting key.")
    with open(KEY_FILE) as f:
        keys = json.load(f)
    return Hive(keys=[keys["posting_key"]])


# Create a safe category directory name from the first three keywords
def categorize_prompt(prompt):
    keywords = re.findall(r"\b\w+\b", prompt.lower())
    important = keywords[:3] if keywords else ["general"]
    return "_".join(important)


# Fetch all previous entries from GeoCities FTP
def fetch_all_ftp_memory():
    try:
        ftp = FTP(FTP_HOST)
        ftp.login(FTP_USER, FTP_PASS)
        ftp.set_pasv(True)
        ftp.cwd(FTP_BASE_DIR)
        entries = []
        for category in ftp.nlst():
            try:
                ftp.cwd(f"{FTP_BASE_DIR}/{category}")
                for fname in sorted(ftp.nlst()):
                    if fname.endswith(".json"):
                        local_path = f"/tmp/{fname}"
                        with open(local_path, "wb") as f:
                            ftp.retrbinary(f"RETR {fname}", f.write)
                        with open(local_path, "r") as f:
                            entries.append(json.load(f))
            except Exception:
                continue
        ftp.quit()
        return entries
    except Exception as e:
        print("❌ Could not fetch memory from GeoCities:", str(e))
        return []


# Ensure the local memory file exists
if not os.path.exists(HISTORY_FILE):
    with open(HISTORY_FILE, "w") as f:
        json.dump([], f)


def load_memory(n=5):
    ftp_memory = fetch_all_ftp_memory()
    return ftp_memory[-n:]


def search_memory(query):
    matches = []
    for entry in fetch_all_ftp_memory():
        if query.lower() in entry["prompt"].lower() or query.lower() in entry["response"].lower():
            matches.append(entry)
    return matches


def search_web(query):
    """Search the web using the DuckDuckGo Instant Answer API (no API key required)."""
    try:
        headers = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"}
        url = "https://api.duckduckgo.com/"  # Instant Answer API endpoint
        params = {"q": query, "format": "json"}
        response = requests.get(url, params=params, headers=headers, timeout=5)
        if response.status_code == 200:
            data = response.json()
            results = []
            # Extract the abstract text, if any
            if data.get("AbstractText"):
                results.append(data["AbstractText"][:500])  # First 500 chars
            return " ".join(results[:3]) if results else None
    except Exception as e:
        print(f"⚠️ Web search failed: {str(e)}")
    return None


def remember(prompt, response):
    entry = {
        "timestamp": datetime.now().isoformat(),
        "prompt": prompt,
        "response": response,
    }
    with open(HISTORY_FILE, "r") as f:
        memory = json.load(f)
    memory.append(entry)
    with open(HISTORY_FILE, "w") as f:
        json.dump(memory[-50:], f, indent=2)
    save_to_geocities(entry)
    generate_webpage(memory[-50:])


def generate_response(prompt):
    """Generate a response using the language model, memory, and web search."""
    # Check whether the prompt looks like a request for real-time data
    web_trigger = any(phrase in prompt.lower() for phrase in
                      ["search", "look up", "find out", "what is", "who is", "when did", "how do"])
    # Build context from memory
    memory_snippets = fetch_all_ftp_memory()
    relevant = [m for m in memory_snippets
                if any(w in prompt.lower()
                       for w in m["prompt"].lower().split() + m["response"].lower().split())]
    memory_context = "\n".join([m["response"] for m in relevant[-3:]])
    # Search the web if needed
    web_info = ""
    if web_trigger:
        web_info = search_web(prompt)
        if web_info:
            print(f"🌐 Web result: {web_info[:100]}...")
    # Combine context for the model
    full_context = ""
    if memory_context:
        full_context += "From my memory: " + memory_context + "\n"
    if web_info:
        full_context += "From the web: " + web_info + "\n"
    # Generate a response with the language model
    try:
        response = model.generate_response(prompt + "\nContext: " + full_context, max_length=100)
        if not response:
            response = "I'm thinking about that..."
    except Exception as e:
        print(f"⚠️ Model error: {str(e)}")
        response = "I encountered an issue generating a response."
    # Remember this interaction
    remember(prompt, response)
    return response


def post_to_hive(title, body):
    hive = init_hive()
    tags = ["peakecoin", "ai", "bot"]
    permlink = "peakebot-" + datetime.now().strftime("%Y%m%d%H%M%S")
    try:
        # beem exposes post() on the Hive instance itself
        hive.post(title, body, author="peake.matic", permlink=permlink, tags=tags)
        print(f"✅ Posted to Hive as {permlink}")
    except Exception as e:
        print("❌ Failed to post to Hive:", str(e))


def save_to_geocities(entry):
    category = categorize_prompt(entry["prompt"])
    filename = f"entry-{datetime.now().strftime('%Y%m%d-%H%M%S')}.json"
    local_path = f"/tmp/{filename}"
    with open(local_path, "w") as f:
        json.dump(entry, f, indent=2)
    try:
        ftp = FTP(FTP_HOST)
        ftp.login(FTP_USER, FTP_PASS)
        ftp.set_pasv(True)
        try:
            ftp.cwd(f"{FTP_BASE_DIR}/{category}")
        except Exception:
            # Category directory doesn't exist yet; create it
            ftp.cwd(FTP_BASE_DIR)
            ftp.mkd(category)
            ftp.cwd(f"{FTP_BASE_DIR}/{category}")
        with open(local_path, "rb") as file:
            ftp.storbinary(f"STOR {filename}", file)
        ftp.quit()
        print(f"🌐 Uploaded memory to GeoCities under /{category}: {filename}")
    except Exception as e:
        print("❌ FTP upload failed:", str(e))


def generate_webpage(entries):
    html_content = """
<html>
<head><title>PeakeBot Journal</title></head>
<body>
<h1>🧠 PeakeBot Public Journal</h1>
<p>AI entries generated by PeakeBot running on Raspberry Pi and published to GeoCities</p>
<hr>
"""
    for entry in reversed(entries):
        html_content += (f"<div><h3>{entry['timestamp']}</h3>"
                         f"<p><b>You:</b> {entry['prompt']}<br>"
                         f"<b>PeakeBot:</b> {entry['response']}</p><hr></div>")
    html_content += "</body></html>"
    local_path = "/tmp/index.html"
    with open(local_path, "w") as f:
        f.write(html_content)
    try:
        ftp = FTP(FTP_HOST)
        ftp.login(FTP_USER, FTP_PASS)
        ftp.set_pasv(True)
        ftp.cwd(FTP_BASE_DIR)
        with open(local_path, "rb") as file:
            ftp.storbinary("STOR index.html", file)
        ftp.quit()
        print("🌐 Updated PeakeBot journal webpage on GeoCities.")
    except Exception as e:
        print("❌ FTP upload (HTML) failed:", str(e))


def interactive_loop():
    print("🧠 PeakeBot ready. Type your message below:")
    print("Commands: 'train' to teach new responses, 'save' to save learning, 'exit' to quit\n")
    training_mode = False
    while True:
        prompt = input("You: ").strip()
        if not prompt:
            continue
        if prompt.lower() in ("exit", "quit"):
            print("👋 Goodbye!")
            break
        if prompt.lower() == "train":
            training_mode = not training_mode
            if training_mode:
                print("🔁 Training mode ON. Enter pairs of [input] [response] separated by | "
                      "(e.g., 'hello there | hi, how are you?')")
                print("Type 'done' when finished.\n")
            else:
                print("🔁 Training mode OFF\n")
            continue
        if prompt.lower() == "save":
            model.save_model("language_model.pkl")
            print("💾 Model saved!\n")
            continue
        if training_mode:
            if prompt.lower() == "done":
                training_mode = False
                print("🔁 Training complete. Responses will improve over time!\n")
                continue
            # Parse training data of the form "input | response"
            if "|" in prompt:
                parts = prompt.split("|")
                if len(parts) == 2:
                    user_input = parts[0].strip()
                    desired_response = parts[1].strip()
                    # Train the model on this example
                    model.train_on_text([user_input + " " + desired_response], epochs=1)
                    print(f"✅ Learned: '{user_input}' -> '{desired_response}'\n")
                    continue
            print("❌ Format: input | response\n")
            continue
        # Normal conversation mode
        response = generate_response(prompt)
        print("PeakeBot:", response)
        print()
        if "post this" in prompt.lower():
            post_to_hive("🧠 PeakeBot Insight", response)


if __name__ == "__main__":
    interactive_loop()
```
🪙 PeakeCoin Ecosystem
💱 PeakeCoin USDT Bridge (Hive → Polygon/MATIC)
Bridge SWAP.USDT from Hive Engine to USDT on Polygon (MATIC).
Whitelist access, documentation, and bridge status updates:
🔗 https://geocities.ws/peakecoin
HiveP.I.M.P.: PeakeCoin Intelligent Market Protector
Operated by @hivepimp, P.I.M.P. stabilizes PEK markets and supports liquidity on Hive Engine.
Community liquidity participation strengthens long-term market health.
Open-source code, bots, and documentation:
🔗 https://github.com/paulmoon410
🎰 PeakeSino: The PeakeCoin Casino
Blockchain-powered games using PEK as the native in-game currency.
Built on Hive with a focus on provable fairness and community-driven growth.
Casino code, UI, and game logic:
🔗 https://github.com/paulmoon410
🙏 Acknowledgements
Thanks to the following, and please give them a follow:
@enginewitty @ecoinstant @neoxian @txracer @thecrazygm @holdonia @aggroed
For their continued support, guidance, and help expanding the PeakeCoin ecosystem.
I'm also thinking of having it search the Hive blockchain for information.