Compare commits

...

25 Commits

Author SHA1 Message Date
spinline
8a9905fc56 fix: removed the faulty manual optimization step that was corrupting the WASM file
All checks were successful
Build MIPS Binary / build (push) Successful in 6m45s
2026-02-08 18:21:40 +03:00
spinline
1e39cbb0c5 perf: made Swagger UI optional and optimized the build command to reduce the backend binary size
All checks were successful
Build MIPS Binary / build (push) Successful in 4m31s
2026-02-08 18:16:45 +03:00
spinline
40be58f2fc perf: moved the backend build to the workspace root so workspace optimizations take effect
All checks were successful
Build MIPS Binary / build (push) Successful in 4m30s
2026-02-08 18:10:09 +03:00
spinline
3f08b5b54a perf: added WASM size tracking to the build logs and resolved profile conflicts
All checks were successful
Build MIPS Binary / build (push) Successful in 4m27s
2026-02-08 18:03:52 +03:00
spinline
bfec99ae35 fix: resolved the flag mismatch by using the --all-features flag for wasm-opt
All checks were successful
Build MIPS Binary / build (push) Successful in 4m28s
2026-02-08 17:58:14 +03:00
spinline
d9afd3aa81 fix: added the nontrapping-float-to-int-conversions feature for wasm-opt
Some checks failed
Build MIPS Binary / build (push) Failing after 1m2s
2026-02-08 17:56:28 +03:00
spinline
e72113d91d perf: added manual WASM optimization and stabilized the build process
Some checks failed
Build MIPS Binary / build (push) Failing after 1m2s
2026-02-08 17:54:43 +03:00
spinline
7c4ff619c1 fix: corrected a typo in .cargo/config.toml
Some checks failed
Build MIPS Binary / build (push) Failing after 1m2s
2026-02-08 17:52:15 +03:00
spinline
9c4217f450 feat: enabled the bulk-memory feature for WASM
Some checks failed
Build MIPS Binary / build (push) Failing after 3s
2026-02-08 17:51:04 +03:00
spinline
cc09002171 trigger: restarted the build
Some checks failed
Build MIPS Binary / build (push) Failing after 1m3s
2026-02-08 17:49:06 +03:00
spinline
5d8cdd7760 build: updated the build environment (added Trunk v0.21.14 and binaryen), re-enabled optimizations
Some checks failed
Build MIPS Binary / build (push) Failing after 1m2s
2026-02-08 16:51:40 +03:00
spinline
145436eefc fix: temporarily disabled wasm-opt to work around the build error
All checks were successful
Build MIPS Binary / build (push) Successful in 4m30s
2026-02-08 16:44:29 +03:00
spinline
10c95c5ff3 fix: updated rustc and wasm-opt version settings for the wasm-opt build error
Some checks failed
Build MIPS Binary / build (push) Failing after 1m8s
2026-02-08 16:42:13 +03:00
spinline
329654cc4e fix: disabled the bulk-memory feature to address the wasm-opt build error
Some checks failed
Build MIPS Binary / build (push) Failing after 1m31s
2026-02-08 16:37:45 +03:00
spinline
22b592a652 fix: updated the wasm-opt level to 'z'
Some checks failed
Build MIPS Binary / build (push) Failing after 1m35s
2026-02-08 16:33:46 +03:00
spinline
817dc49db2 fix: added the --enable-bulk-memory flag for the wasm-opt build error
Some checks failed
Build MIPS Binary / build (push) Failing after 3s
2026-02-08 16:29:33 +03:00
spinline
b2a60d3d1e cleanup: removed the unused get_vapid_public_key function
Some checks failed
Build MIPS Binary / build (push) Failing after 1m6s
2026-02-08 16:26:16 +03:00
spinline
520903fa3f perf: optimized push notifications with parallel sending and env var caching
Some checks failed
Build MIPS Binary / build (push) Has been cancelled
2026-02-08 16:25:44 +03:00
spinline
c45f2f50e9 fix: updated wasm-opt to v117 for the ARM64 build error
Some checks failed
Build MIPS Binary / build (push) Has been cancelled
2026-02-08 16:25:02 +03:00
spinline
791eabe9bd fix: improved SQLite deadlock handling and busy_timeout management
Some checks failed
Build MIPS Binary / build (push) Failing after 1m2s
2026-02-08 16:20:55 +03:00
spinline
12f93dd640 perf: enabled Trunk WASM optimization (removed data-wasm-opt=0)
Some checks failed
Build MIPS Binary / build (push) Failing after 1m2s
2026-02-08 16:18:50 +03:00
spinline
7306db8c2f fix: made the torrent diff algorithm hash-based, removing the ordering dependency
Some checks failed
Build MIPS Binary / build (push) Has been cancelled
2026-02-08 16:17:30 +03:00
spinline
ce0ecd62af fix: added a timeout (15s) and an error message to the index.html loading screen
Some checks failed
Build MIPS Binary / build (push) Has been cancelled
2026-02-08 16:13:20 +03:00
spinline
f2379b67d8 docs: removed the outdated password-update comment in main.rs
Some checks failed
Build MIPS Binary / build (push) Has been cancelled
2026-02-08 16:11:18 +03:00
spinline
755f35c94c security: removed the real .env file from version control and updated .env.example
Some checks failed
Build MIPS Binary / build (push) Has been cancelled
2026-02-08 16:07:26 +03:00
14 changed files with 217 additions and 95 deletions

View File

@@ -29,16 +29,18 @@ jobs:
           # Run Tailwind manually first
           npx @tailwindcss/cli -i input.css -o public/tailwind.css
           trunk build --release
+          echo "Build complete (WASM optimization handled by Rust compiler via opt-level=z)"

       - name: Build Backend (MIPS)
         env:
-          # Ensure we are building a fully static binary
-          # -C link-self-contained=no: Let Zig (the linker) handle CRT objects (crt1.o, etc.)
-          RUSTFLAGS: "-C target-feature=+crt-static -C link-self-contained=no -C link-arg=-msoft-float"
+          # -s: strips symbols, -w: strips DWARF debug info (massively reduces binary size)
+          RUSTFLAGS: "-C target-feature=+crt-static -C link-self-contained=no -C link-arg=-msoft-float -C link-arg=-s -C link-arg=-w"
           CFLAGS_mips_unknown_linux_musl: "-msoft-float"
         run: |
-          cd backend
-          cargo zigbuild --target mips-unknown-linux-musl --release -Z build-std=std,panic_abort
+          # Build from the workspace root so the workspace profile settings (LTO, opt-level=z, strip) apply
+          # Enable only the push-notifications feature (Swagger UI is left out, shrinking the binary)
+          cargo zigbuild -p backend --target mips-unknown-linux-musl --release -Z build-std=std,panic_abort --no-default-features --features push-notifications
           file target/mips-unknown-linux-musl/release/backend

       - name: Rename Binary

.gitignore
View File

@@ -6,3 +6,5 @@ result.xml
 frontend/dist
 backend.log
 .runner
+.env
+backend/.env

View File

@@ -5,10 +5,11 @@ resolver = "2"
 # Optimize for size (aggressive)
 [profile.release]
 opt-level = "z"
-lto = true
+lto = "fat"          # Full LTO (best result)
 codegen-units = 1
 panic = "abort"
-strip = true
+strip = "symbols"    # Strip symbols
+incremental = false  # Disable incremental builds (better for size)

 [patch.crates-io]
 coarsetime = { path = "third_party/coarsetime" }

View File

@@ -1,8 +0,0 @@
-# Database
-DATABASE_URL=sqlite:vibetorrent.db
-
-# VAPID Keys for Push Notifications
-# Generate new keys for production using: cargo run --bin web-push --features web-push -- generate-vapid-keys
-VAPID_PUBLIC_KEY=BEdPj6XQR7MGzM28Nev9wokF5upHoydNDahouJbQ9ZdBJpEFAN1iNfANSEvY0ItasNY5zcvvqN_tjUt64Rfd0gU
-VAPID_PRIVATE_KEY=aUcCYJ7kUd9UClCaWwad0IVgbYJ6svwl19MjSX7GH10
-VAPID_EMAIL=mailto:admin@vibetorrent.app

View File

@@ -3,3 +3,12 @@ RTORRENT_SOCKET=/tmp/rtorrent.sock
 # Backend Listen Port
 PORT=3000
+
+# Database URL
+DATABASE_URL=sqlite:vibetorrent.db
+
+# VAPID Keys for Push Notifications
+# Generate new keys for production using: npx web-push generate-vapid-keys
+VAPID_PUBLIC_KEY=YOUR_PUBLIC_VAPID_KEY
+VAPID_PRIVATE_KEY=YOUR_PRIVATE_VAPID_KEY
+VAPID_EMAIL=mailto:your-email@example.com
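
Note: the backend reads these values at startup via dotenvy (declared in backend/Cargo.toml below). A minimal, self-contained sketch of the usual loading pattern, assuming the dotenvy crate and the variable names from .env.example:

use std::env;

fn main() {
    // Load .env into the process environment; skip silently if the file is absent.
    dotenvy::dotenv().ok();

    let database_url = env::var("DATABASE_URL")
        .unwrap_or_else(|_| "sqlite:vibetorrent.db".to_string());
    let vapid_public_key = env::var("VAPID_PUBLIC_KEY")
        .expect("VAPID_PUBLIC_KEY must be set in .env");

    println!("db: {database_url}, VAPID key length: {}", vapid_public_key.len());
}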

View File

@@ -6,6 +6,7 @@ edition = "2021"
 [features]
 default = ["push-notifications"]
 push-notifications = ["web-push", "openssl"]
+swagger = ["utoipa-swagger-ui"]

 [dependencies]
 axum = { version = "0.8", features = ["macros", "ws"] }
@@ -29,7 +30,7 @@ shared = { path = "../shared" }
 thiserror = "2.0.18"
 dotenvy = "0.15.7"
 utoipa = { version = "5.4.0", features = ["axum_extras"] }
-utoipa-swagger-ui = { version = "9.0.2", features = ["axum"] }
+utoipa-swagger-ui = { version = "9.0.2", features = ["axum"], optional = true }
 web-push = { version = "0.10", default-features = false, features = ["hyper-client"], optional = true }
 base64 = "0.22"
 openssl = { version = "0.10", features = ["vendored"], optional = true }

View File

@@ -1,6 +1,7 @@
-use sqlx::{sqlite::SqlitePoolOptions, Pool, Sqlite, Row};
+use sqlx::{sqlite::SqlitePoolOptions, Pool, Sqlite, Row, sqlite::SqliteConnectOptions};
 use std::time::Duration;
 use anyhow::Result;
+use std::str::FromStr;

 #[derive(Clone)]
 pub struct Db {
@@ -9,10 +10,16 @@ pub struct Db {

 impl Db {
     pub async fn new(db_url: &str) -> Result<Self> {
+        let options = SqliteConnectOptions::from_str(db_url)?
+            .create_if_missing(true)
+            .busy_timeout(Duration::from_secs(10)) // Raised the wait time to 10 seconds
+            .journal_mode(sqlx::sqlite::SqliteJournalMode::Wal)
+            .synchronous(sqlx::sqlite::SqliteSynchronous::Normal);
+
         let pool = SqlitePoolOptions::new()
             .max_connections(5)
-            .acquire_timeout(Duration::from_secs(3))
-            .connect(db_url)
+            .acquire_timeout(Duration::from_secs(10))
+            .connect_with(options)
             .await?;

         let db = Self { pool };
@@ -21,21 +28,6 @@ impl Db {
     }

     async fn run_migrations(&self) -> Result<()> {
-        // WAL mode - enables concurrent reads while writing
-        sqlx::query("PRAGMA journal_mode=WAL")
-            .execute(&self.pool)
-            .await?;
-
-        // NORMAL synchronous - faster than FULL, still safe enough
-        sqlx::query("PRAGMA synchronous=NORMAL")
-            .execute(&self.pool)
-            .await?;
-
-        // 5 second busy timeout - reduces "database locked" errors
-        sqlx::query("PRAGMA busy_timeout=5000")
-            .execute(&self.pool)
-            .await?;
-
         sqlx::migrate!("./migrations").run(&self.pool).await?;
         Ok(())
     }
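
Note: a self-contained sketch of the new connection setup, assuming sqlx with the sqlite and tokio-runtime features; the connection URL is illustrative:

use std::str::FromStr;
use std::time::Duration;

use sqlx::sqlite::{SqliteConnectOptions, SqliteJournalMode, SqlitePoolOptions, SqliteSynchronous};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Same idea as the Db::new change above: WAL, NORMAL synchronous and a 10 s
    // busy_timeout are set as connect options instead of post-connect PRAGMAs,
    // so every pooled connection starts with them applied.
    let options = SqliteConnectOptions::from_str("sqlite:vibetorrent.db")?
        .create_if_missing(true)
        .busy_timeout(Duration::from_secs(10))
        .journal_mode(SqliteJournalMode::Wal)
        .synchronous(SqliteSynchronous::Normal);

    let pool = SqlitePoolOptions::new()
        .max_connections(5)
        .acquire_timeout(Duration::from_secs(10))
        .connect_with(options)
        .await?;

    sqlx::query("SELECT 1").execute(&pool).await?;
    Ok(())
}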

View File

@@ -1,3 +1,4 @@
+use std::collections::HashMap;
 use shared::{AppEvent, NotificationLevel, SystemNotification, Torrent, TorrentUpdate};

 #[derive(Debug)]
@@ -8,24 +9,32 @@ pub enum DiffResult {
 }

 pub fn diff_torrents(old: &[Torrent], new: &[Torrent]) -> DiffResult {
-    // 1. Structural Check (Length or Order changed)
+    // 1. Structural check: if the number of torrents changed (added or removed),
+    // we send a FullUpdate for now, for simplicity.
     if old.len() != new.len() {
         return DiffResult::FullUpdate;
     }

-    for (i, t) in new.iter().enumerate() {
-        if old[i].hash != t.hash {
+    // 2. Hash set comparison:
+    // the order may have changed, but are the torrents the same?
+    let old_map: HashMap<&str, &Torrent> = old.iter().map(|t| (t.hash.as_str(), t)).collect();
+
+    // If a hash from the new list is missing from the old list, the structure has changed.
+    for new_t in new {
+        if !old_map.contains_key(new_t.hash.as_str()) {
             return DiffResult::FullUpdate;
         }
     }

-    // 2. Field Updates
+    // 3. Field updates (partial updates)
+    // At this point we know the torrents in the old and new lists are the same (by hash);
+    // only their order may differ or their fields may have been updated.
     let mut events = Vec::new();

-    for (i, new_t) in new.iter().enumerate() {
-        let old_t = &old[i];
+    for new_t in new {
+        // Look up the matching torrent in old_map by hash (order-independent)
+        let old_t = old_map.get(new_t.hash.as_str()).unwrap();

-        // Initialize with all None
         let mut update = TorrentUpdate {
             hash: new_t.hash.clone(),
             name: None,
@@ -42,7 +51,7 @@ pub fn diff_torrents(old: &[Torrent], new: &[Torrent]) -> DiffResult {
         let mut has_changes = false;

         // Compare fields
         if old_t.name != new_t.name {
             update.name = Some(new_t.name.clone());
             has_changes = true;
@@ -63,7 +72,7 @@ pub fn diff_torrents(old: &[Torrent], new: &[Torrent]) -> DiffResult {
             update.percent_complete = Some(new_t.percent_complete);
             has_changes = true;

-            // Check for torrent completion: reached 100%
+            // Check for torrent completion
             if old_t.percent_complete < 100.0 && new_t.percent_complete >= 100.0 {
                 tracing::info!("Torrent completed: {} ({})", new_t.name, new_t.hash);
                 events.push(AppEvent::Notification(SystemNotification {
@@ -83,8 +92,7 @@ pub fn diff_torrents(old: &[Torrent], new: &[Torrent]) -> DiffResult {
         if old_t.status != new_t.status {
             update.status = Some(new_t.status.clone());
             has_changes = true;
-            // Log status changes for debugging
             tracing::debug!(
                 "Torrent status changed: {} ({}) {:?} -> {:?}",
                 new_t.name, new_t.hash, old_t.status, new_t.status
@@ -110,4 +118,4 @@ pub fn diff_torrents(old: &[Torrent], new: &[Torrent]) -> DiffResult {
         tracing::debug!("Generated {} partial updates", events.len());
         DiffResult::Partial(events)
     }
 }
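
Note: the essence of the change is looking torrents up by hash instead of by index, so reordering alone no longer forces a full update. A minimal sketch of that idea with a stand-in Item type (hypothetical, not the project's Torrent struct):

use std::collections::HashMap;

struct Item {
    hash: String,
    progress: f64,
}

// Return the hashes whose progress changed, regardless of list order.
fn changed_hashes(old: &[Item], new: &[Item]) -> Vec<String> {
    let old_map: HashMap<&str, &Item> = old.iter().map(|i| (i.hash.as_str(), i)).collect();
    new.iter()
        .filter(|n| old_map.get(n.hash.as_str()).map_or(true, |o| o.progress != n.progress))
        .map(|n| n.hash.clone())
        .collect()
}

fn main() {
    let old = vec![
        Item { hash: "a".into(), progress: 10.0 },
        Item { hash: "b".into(), progress: 50.0 },
    ];
    // Same items in a different order, one field updated.
    let new = vec![
        Item { hash: "b".into(), progress: 75.0 },
        Item { hash: "a".into(), progress: 10.0 },
    ];
    assert_eq!(changed_hashes(&old, &new), vec!["b".to_string()]);
}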

View File

@@ -690,8 +690,10 @@ pub async fn handle_timeout_error(err: BoxError) -> (StatusCode, &'static str) {
         (status = 200, description = "VAPID public key", body = String)
     )
 )]
-pub async fn get_push_public_key_handler() -> impl IntoResponse {
-    let public_key = push::get_vapid_public_key();
+pub async fn get_push_public_key_handler(
+    State(state): State<AppState>,
+) -> impl IntoResponse {
+    let public_key = state.push_store.get_public_key();
     (StatusCode::OK, Json(serde_json::json!({ "publicKey": public_key }))).into_response()
 }

View File

@@ -33,6 +33,7 @@ use tower_http::{
     trace::TraceLayer,
 };
 use utoipa::OpenApi;
+#[cfg(feature = "swagger")]
 use utoipa_swagger_ui::SwaggerUi;

 #[derive(Clone)]
@@ -98,6 +99,7 @@ struct Args {
     reset_password: Option<String>,
 }

+#[cfg(feature = "swagger")]
 #[cfg(feature = "push-notifications")]
 #[derive(OpenApi)]
 #[openapi(
@@ -146,6 +148,7 @@ struct Args {
 )]
 struct ApiDoc;

+#[cfg(feature = "swagger")]
 #[cfg(not(feature = "push-notifications"))]
 #[derive(OpenApi)]
 #[openapi(
@@ -255,9 +258,7 @@ async fn main() {
             }
         };

-        // Update in DB (using a direct query since db.rs doesn't have update_password yet)
-        // We should add `update_password` to db.rs for cleaner code, but for now direct query is fine or we can extend Db.
-        // Let's extend Db.rs first to be clean.
+        // Update in DB
         if let Err(e) = db.update_password(user_id, &password_hash).await {
             tracing::error!("Failed to update password in DB: {}", e);
             std::process::exit(1);
@@ -464,9 +465,13 @@ async fn main() {
         }
     });

-    let app = Router::new()
-        .merge(SwaggerUi::new("/swagger-ui").url("/api-docs/openapi.json", ApiDoc::openapi()))
-        // Setup & Auth Routes
+    let app = Router::new();
+
+    #[cfg(feature = "swagger")]
+    let app = app.merge(SwaggerUi::new("/swagger-ui").url("/api-docs/openapi.json", ApiDoc::openapi()));
+
+    // Setup & Auth Routes
+    let app = app
         .route("/api/setup/status", get(handlers::setup::get_setup_status_handler))
         .route("/api/setup", post(handlers::setup::setup_handler))
         .route(
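
Note: with utoipa-swagger-ui now optional, the router merge is gated on the swagger feature. A minimal sketch of the same conditional-merge pattern in plain axum (handler and route names are illustrative, not the project's):

use axum::{routing::get, Router};

async fn health() -> &'static str { "ok" }

fn build_router() -> Router {
    let app = Router::new();

    // This statement is only compiled in when the crate is built with --features swagger.
    #[cfg(feature = "swagger")]
    let app = app.route("/swagger-ui", get(|| async { "swagger ui placeholder" }));

    app.route("/health", get(health))
}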

View File

@@ -5,6 +5,7 @@ use utoipa::ToSchema;
 use web_push::{
     HyperWebPushClient, SubscriptionInfo, VapidSignatureBuilder, WebPushClient, WebPushMessageBuilder,
 };
+use futures::StreamExt;

 use crate::db::Db;

@@ -20,17 +21,34 @@ pub struct PushKeys {
     pub auth: String,
 }

+#[derive(Clone)]
+pub struct VapidConfig {
+    pub private_key: String,
+    pub public_key: String,
+    pub email: String,
+}
+
 #[derive(Clone)]
 pub struct PushSubscriptionStore {
     db: Option<Db>,
     subscriptions: Arc<RwLock<Vec<PushSubscription>>>,
+    vapid_config: VapidConfig,
 }

 impl PushSubscriptionStore {
     pub fn new() -> Self {
+        let private_key = std::env::var("VAPID_PRIVATE_KEY").expect("VAPID_PRIVATE_KEY must be set in .env");
+        let public_key = std::env::var("VAPID_PUBLIC_KEY").expect("VAPID_PUBLIC_KEY must be set in .env");
+        let email = std::env::var("VAPID_EMAIL").expect("VAPID_EMAIL must be set in .env");
+
         Self {
             db: None,
             subscriptions: Arc::new(RwLock::new(Vec::new())),
+            vapid_config: VapidConfig {
+                private_key,
+                public_key,
+                email,
+            },
         }
     }

@@ -47,9 +65,18 @@ impl PushSubscriptionStore {
         }

         tracing::info!("Loaded {} push subscriptions from database", subscriptions_vec.len());
+
+        let private_key = std::env::var("VAPID_PRIVATE_KEY").expect("VAPID_PRIVATE_KEY must be set in .env");
+        let public_key = std::env::var("VAPID_PUBLIC_KEY").expect("VAPID_PUBLIC_KEY must be set in .env");
+        let email = std::env::var("VAPID_EMAIL").expect("VAPID_EMAIL must be set in .env");
+
         Ok(Self {
             db: Some(db.clone()),
             subscriptions: Arc::new(RwLock::new(subscriptions_vec)),
+            vapid_config: VapidConfig {
+                private_key,
+                public_key,
+                email,
+            },
         })
     }

@@ -91,6 +118,10 @@ impl PushSubscriptionStore {
     pub async fn get_all_subscriptions(&self) -> Vec<PushSubscription> {
         self.subscriptions.read().await.clone()
     }
+
+    pub fn get_public_key(&self) -> &str {
+        &self.vapid_config.public_key
+    }
 }

 /// Send push notification to all subscribed clients
@@ -116,50 +147,68 @@ pub async fn send_push_notification(
         "tag": "vibetorrent"
     });

-    let client = HyperWebPushClient::new();
-
-    let vapid_private_key = std::env::var("VAPID_PRIVATE_KEY").expect("VAPID_PRIVATE_KEY must be set in .env");
-    let vapid_email = std::env::var("VAPID_EMAIL").expect("VAPID_EMAIL must be set in .env");
-
-    for subscription in subscriptions {
-        let subscription_info = SubscriptionInfo {
-            endpoint: subscription.endpoint.clone(),
-            keys: web_push::SubscriptionKeys {
-                p256dh: subscription.keys.p256dh.clone(),
-                auth: subscription.keys.auth.clone(),
-            },
-        };
-
-        let mut sig_builder = VapidSignatureBuilder::from_base64(
-            &vapid_private_key,
-            web_push::URL_SAFE_NO_PAD,
-            &subscription_info,
-        )?;
-        sig_builder.add_claim("sub", vapid_email.as_str());
-        sig_builder.add_claim("aud", subscription.endpoint.as_str());
-        let signature = sig_builder.build()?;
-
-        let mut builder = WebPushMessageBuilder::new(&subscription_info);
-        builder.set_vapid_signature(signature);
-
-        let payload_str = payload.to_string();
-        builder.set_payload(web_push::ContentEncoding::Aes128Gcm, payload_str.as_bytes());
-
-        match client.send(builder.build()?).await {
-            Ok(_) => {
-                tracing::debug!("Push notification sent to: {}", subscription.endpoint);
-            }
-            Err(e) => {
-                tracing::error!("Failed to send push notification: {}", e);
-                // TODO: Remove invalid subscriptions
-            }
-        }
-    }
-
-    Ok(())
-}
-
-pub fn get_vapid_public_key() -> String {
-    std::env::var("VAPID_PUBLIC_KEY").expect("VAPID_PUBLIC_KEY must be set in .env")
-}
+    let client = Arc::new(HyperWebPushClient::new());
+    let vapid_config = store.vapid_config.clone();
+    let payload_str = payload.to_string();
+
+    // Send notifications concurrently
+    futures::stream::iter(subscriptions)
+        .for_each_concurrent(10, |subscription| {
+            let client = client.clone();
+            let vapid_config = vapid_config.clone();
+            let payload_str = payload_str.clone();
+
+            async move {
+                let subscription_info = SubscriptionInfo {
+                    endpoint: subscription.endpoint.clone(),
+                    keys: web_push::SubscriptionKeys {
+                        p256dh: subscription.keys.p256dh.clone(),
+                        auth: subscription.keys.auth.clone(),
+                    },
+                };
+
+                let sig_res = VapidSignatureBuilder::from_base64(
+                    &vapid_config.private_key,
+                    web_push::URL_SAFE_NO_PAD,
+                    &subscription_info,
+                );
+
+                match sig_res {
+                    Ok(mut sig_builder) => {
+                        sig_builder.add_claim("sub", vapid_config.email.as_str());
+                        sig_builder.add_claim("aud", subscription.endpoint.as_str());
+
+                        match sig_builder.build() {
+                            Ok(signature) => {
+                                let mut builder = WebPushMessageBuilder::new(&subscription_info);
+                                builder.set_vapid_signature(signature);
+                                builder.set_payload(web_push::ContentEncoding::Aes128Gcm, payload_str.as_bytes());
+
+                                match builder.build() {
+                                    Ok(msg) => {
+                                        match client.send(msg).await {
+                                            Ok(_) => {
+                                                tracing::debug!("Push notification sent to: {}", subscription.endpoint);
+                                            }
+                                            Err(e) => {
+                                                tracing::error!("Failed to send push notification to {}: {}", subscription.endpoint, e);
+                                            }
+                                        }
+                                    }
+                                    Err(e) => tracing::error!("Failed to build push message: {}", e),
+                                }
+                            }
+                            Err(e) => tracing::error!("Failed to build VAPID signature: {}", e),
+                        }
+                    }
+                    Err(e) => tracing::error!("Failed to create VAPID signature builder: {}", e),
+                }
+            }
+        })
+        .await;
+
+    Ok(())
+}
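
Note: a self-contained sketch of the bounded-concurrency send pattern introduced above, assuming the futures and tokio crates; send_one and the endpoints are illustrative stand-ins for the web-push calls:

use futures::StreamExt;

// Stand-in for delivering one push message.
async fn send_one(endpoint: &str) -> Result<(), String> {
    println!("sending to {endpoint}");
    Ok(())
}

#[tokio::main]
async fn main() {
    let endpoints: Vec<String> = (0..25).map(|i| format!("https://push.example/{i}")).collect();

    // At most 10 sends are in flight at once; failures are logged per endpoint, not propagated.
    futures::stream::iter(endpoints)
        .for_each_concurrent(10, |endpoint| async move {
            if let Err(e) = send_one(&endpoint).await {
                eprintln!("failed to send to {endpoint}: {e}");
            }
        })
        .await;
}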

View File

@@ -20,6 +20,8 @@ RUN apt-get update && apt-get install -y \
     jq \
     # Needed for some crate compilations
     protobuf-compiler \
+    # Install binaryen to have wasm-opt available system-wide
+    binaryen \
     && rm -rf /var/lib/apt/lists/*

 # 2. Install Node.js v20 (Manual install to support multi-arch cleanly)
@@ -70,7 +72,7 @@ RUN . "$HOME/.cargo/env" && \
     ARCH=$(dpkg --print-architecture) && \
     if [ "$ARCH" = "amd64" ]; then TRUNK_ARCH="x86_64-unknown-linux-gnu"; \
     elif [ "$ARCH" = "arm64" ]; then TRUNK_ARCH="aarch64-unknown-linux-gnu"; fi && \
-    wget -qO- "https://github.com/trunk-rs/trunk/releases/download/v0.21.5/trunk-$TRUNK_ARCH.tar.gz" | tar -xzf - -C /root/.cargo/bin/ && \
+    wget -qO- "https://github.com/trunk-rs/trunk/releases/download/v0.21.14/trunk-$TRUNK_ARCH.tar.gz" | tar -xzf - -C /root/.cargo/bin/ && \
     chmod +x /root/.cargo/bin/trunk && \
     # Install wasm-bindgen-cli (Compiling from source to avoid glibc issues, doing it ONCE here)
     cargo install wasm-bindgen-cli --version 0.2.108

View File

@@ -51,4 +51,4 @@ web-sys = { version = "0.3", features = [
 ] }
 shared = { path = "../shared" }
 tailwind_fuse = "0.3.2"
-js-sys = "0.3.85"
+js-sys = "0.3.85"

View File

@@ -86,12 +86,15 @@
         id="app-loading"
         style="
           display: flex;
+          flex-direction: column;
           justify-content: center;
           align-items: center;
           height: 100vh;
+          font-family: sans-serif;
         "
       >
         <div
+          id="app-loading-spinner"
           style="
             width: 40px;
             height: 40px;
@@ -102,6 +105,32 @@
             opacity: 0.5;
           "
         ></div>
+        <div
+          id="app-loading-error"
+          style="display: none; text-align: center; margin-top: 20px; padding: 0 20px"
+        >
+          <p style="color: #ef4444; font-weight: bold; margin-bottom: 8px">
+            Uygulama yüklenemedi
+          </p>
+          <p style="font-size: 14px; opacity: 0.7">
+            Bağlantınız yavaş olabilir veya bir sistem hatası oluşmuş olabilir.
+          </p>
+          <button
+            onclick="location.reload()"
+            style="
+              margin-top: 16px;
+              padding: 8px 16px;
+              background: #3b82f6;
+              color: white;
+              border: none;
+              border-radius: 6px;
+              cursor: pointer;
+              font-weight: 500;
+            "
+          >
+            Sayfayı Yenile
+          </button>
+        </div>
       </div>

       <style>
         @keyframes spin {
@@ -114,6 +143,34 @@
           display: none !important;
         }
       </style>
+
+      <script>
+        // App loading timeout handler
+        (function () {
+          var timeout = setTimeout(function () {
+            if (!document.body.classList.contains("app-loaded")) {
+              var spinner = document.getElementById("app-loading-spinner");
+              var error = document.getElementById("app-loading-error");
+              if (spinner) spinner.style.display = "none";
+              if (error) error.style.display = "block";
+            }
+          }, 15000); // 15 seconds timeout
+
+          // Clean up timeout if app loads
+          var observer = new MutationObserver(function (mutations) {
+            mutations.forEach(function (mutation) {
+              if (
+                mutation.attributeName === "class" &&
+                document.body.classList.contains("app-loaded")
+              ) {
+                clearTimeout(timeout);
+                observer.disconnect();
+              }
+            });
+          });
+          observer.observe(document.body, { attributes: true });
+        })();
+      </script>
+
       <!-- Service Worker Registration & PWA Setup -->
       <script>