// List every note created between the previous daily note and this one.
const currentPage = dv.current().file;
// All daily notes, sorted by name (names are dates, e.g. "2023-05-10").
const dailyPages = dv.pages('"0-Daily"').sort((k) => k.file.name, "asc");
const currentPageName = currentPage.name;
const index = dailyPages.findIndex((e) => e.file.name === currentPageName);
if (index < 1) {
  // Not found (-1) or the earliest daily note (0): nothing to compare against.
  dv.table(["File", "Created", "Size"], []);
} else {
  const lastPage = dailyPages[index - 1].file;
  const allPages = dv.pages().values;
  const searchPages = [];
  // Daily-note names are dates, so dv.parse() yields comparable date values.
  const lastTime = dv.parse(lastPage.name);
  const currentTime = dv.parse(currentPage.name);
  // Keep pages created after the previous daily note and up to this one.
  for (const page of allPages) {
    const pageFile = page.file;
    if (pageFile.cday > lastTime && pageFile.cday <= currentTime) {
      searchPages.push(pageFile);
    }
  }
  dv.table(
    ["File", "Created", "Size"],
    searchPages
      .sort((a, b) => (a.ctime > b.ctime ? 1 : -1))
      .map((p) => [p.link, p.ctime, p.size])
  );
}
Exploring models with longer context
While surfing the web, I came across MosaicML, whose official introduction says:
Licensed for commercial use (unlike LLaMA).
Trained on a large amount of data (1T tokens like LLaMA vs. 300B for Pythia, 300B for OpenLLaMA, and 800B for StableLM).
Prepared to handle extremely long inputs thanks to ALiBi (we trained on up to 65k inputs and can handle up to 84k vs. 2k-4k for other open source models).
Optimized for fast training and inference (via FlashAttention and FasterTransformer).
Equipped with highly efficient open-source training code.
So it can take inputs of up to 65k tokens, far more than today's usual 4k. The ALiBi mechanism behind this is sketched below.
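To make the ALiBi claim concrete, here is a minimal sketch of the idea in JavaScript. This is not MosaicML's implementation: instead of position embeddings, each attention head adds a fixed linear penalty proportional to query-key distance, with per-head slopes taken from the geometric sequence in the ALiBi paper (exact when the head count is a power of two):

// Minimal ALiBi sketch (not MosaicML's code): a fixed, distance-based
// penalty is added to attention scores before softmax instead of using
// position embeddings.
function alibiSlopes(numHeads) {
  // Geometric sequence from the ALiBi paper; for 8 heads: 1/2, 1/4, ..., 1/256.
  return Array.from({ length: numHeads }, (_, h) => 2 ** (-8 * (h + 1) / numHeads));
}

function alibiBias(seqLen, slope) {
  // bias[i][j] = -slope * (i - j) for j <= i; future positions stay masked.
  const bias = [];
  for (let i = 0; i < seqLen; i++) {
    bias.push([]);
    for (let j = 0; j < seqLen; j++) {
      bias[i].push(j <= i ? -slope * (i - j) : -Infinity);
    }
  }
  return bias;
}

// Example: 4 positions, head 0 of an 8-head model.
console.log(alibiBias(4, alibiSlopes(8)[0]));

Because the penalty is the same linear function at every position, the model can be run on sequences longer than those it was trained on, which is how training at 65k extrapolates to 84k at inference.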
Since Claude has no public API yet, this seemed worth a try.
Trying it out
MPT-7B
https://huggingface.co/spaces/mosaicml/mpt-7b-chat
No response at all.
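With the Space unresponsive, one possible fallback is the hosted Hugging Face Inference API. This is an untested sketch: it assumes the model is actually deployed there (it may not be) and that an HF_TOKEN environment variable is set.

// Untested sketch: query MPT-7B chat via the hosted HF Inference API.
// Assumes the model is served there and HF_TOKEN is set; Node 18+ for
// global fetch, run as an ES module for top-level await.
const res = await fetch(
  "https://api-inference.huggingface.co/models/mosaicml/mpt-7b-chat",
  {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.HF_TOKEN}`,
      "Content-Type": "application/json",
    },
    // Placeholder input; the note doesn't record the actual test text.
    body: JSON.stringify({ inputs: "Summarize this in one sentence: ..." }),
  }
);
console.log(await res.json());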
ChatGPT
Script
https://colab.research.google.com/drive/1nDVSUEoW5lsmjiVCpHokP15ozc4Jqj-i?usp=share_link
Response
A quick test gave poor results: it couldn't even manage this kind of simple summarization.
Summary
Commercial models still win out; GPT-3.5-turbo should perform far better than these open-source models.
I'll wait for Claude and verify this later.
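For reference, the GPT-3.5-turbo side of that comparison would be a call like the following. This sketch was not run for this note; it assumes an OPENAI_API_KEY environment variable, and the prompt text is a placeholder.

// Sketch only: the same one-sentence summarization test against
// gpt-3.5-turbo via the OpenAI chat completions endpoint.
// Node 18+ for fetch; run as an ES module for top-level await.
const res = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo",
    messages: [
      // Placeholder prompt; the original test text isn't in the note.
      { role: "user", content: "Summarize this in one sentence: ..." },
    ],
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content);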