Ris2Xml conversion tool

Download:

Ris2Xml
WinRis2Xml
.NET Framework 2.0

Source code:

Ris2Xml
WinRis2Xml



The RIS format is used to represent bibliographic references and is common in scientific publishing. For example, all article references on www.sciencedirect.com are provided in RIS format. Raw RIS files are inconvenient to read, so I decided to simplify the process by writing a converter to a clear, human-readable XML format, with the option of generating HTML reference lists.
There are currently two versions of the converter: a console version and a Windows Forms version. Both were written for my own needs and are in active use.

Requirements:
The program is written in C# for the .NET 2.0 platform, so the .NET Framework 2.0 must be installed to run it.
Main features:
  • Conversion of a file (or a group of files) from RIS format to XML (see the sketch after this list).
  • Optional generation of an HTML reference list.
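
The conversion itself is straightforward: every RIS line has the form "XX  - value" (a two-letter tag, two spaces, a dash, a space, and the field value); "TY" opens a record, "ER" closes it, and each known tag maps to an XML element. The following C# fragment is a hypothetical sketch of this step, not the tool's actual source; the element names are inferred from the sample XML output below.

using System.Collections.Generic;
using System.IO;
using System.Xml;

class Ris2XmlSketch
{
    // Assumed tag-to-element mapping, read off the sample XML below.
    static readonly Dictionary<string, string> tagMap = new Dictionary<string, string>();

    static Ris2XmlSketch()
    {
        tagMap.Add("TY", "TypeOfReference");
        tagMap.Add("T1", "TitlePrimary");
        tagMap.Add("JO", "Journal");
        tagMap.Add("VL", "Volume");
        tagMap.Add("IS", "Issue");
        tagMap.Add("SP", "StartPage");
        tagMap.Add("EP", "EndPage");
        tagMap.Add("PY", "PublicationYear");
        tagMap.Add("AU", "Authors");
        tagMap.Add("UR", "Url");
        tagMap.Add("AB", "Abstract");
    }

    static void Main(string[] args)
    {
        // args[0] = input .ris file, args[1] = output .xml file
        XmlTextWriter w = new XmlTextWriter(args[1], null);
        w.Formatting = Formatting.Indented;
        w.WriteStartDocument();
        w.WriteStartElement("Records");

        bool inRecord = false;
        foreach (string line in File.ReadAllLines(args[0]))
        {
            // Multi-line field continuations are ignored in this sketch.
            if (line.Length < 2) continue;
            string tag = line.Substring(0, 2);
            string value = line.Length > 6 ? line.Substring(6).Trim() : "";

            if (tag == "TY")                   // "TY" starts a new record
            {
                w.WriteStartElement("Record");
                inRecord = true;
            }
            else if (tag == "ER")              // "ER" ends the current record
            {
                if (inRecord) w.WriteEndElement();
                inRecord = false;
                continue;
            }

            string element;
            if (inRecord && tagMap.TryGetValue(tag, out element))
                w.WriteElementString(element, value);  // escapes <, > and & automatically
        }

        w.WriteEndElement();                   // </Records>
        w.WriteEndDocument();
        w.Close();
    }
}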
Example:

Source RIS file (fragment):

TY - JOUR
T1 - Memory capacity in neural network models: Rigorous lower bounds
JO - Neural Networks
VL - 1
IS - 3
SP - 223
EP - 238
PY - 1988
AU - Newman, Charles M.
UR - http://www.sciencedirect.com/science/article/B6T08-482R8NT-4/2/666a550edeb8f860f5965e02fe0075a9
AB - We consider certain simple neural network models of associative memory with N binary neurons and symmetric lth order synaptic connections, in which m randomly chosen N-bit patterns are to be stored and retrieved with a small fraction [delta] of bit errors allowed. Rigorous proofs of the following are presented both for l = 2 and l >= 3: 1. 1. m can grow as fast as [alpha]Nl-1.2. 2. [alpha] can be as large as Bl/ln(1/[delta]) as [delta] --> 0.3. 3. Retrieved memories overlapping with several initial patterns persist for (very) small [alpha].These phenomena were previously supported by numerical simulations or nonrigorous calculations. The quantity (l!) [middle dot] [alpha] represents the number of stored bits per distinct synapse. The constant (l!) [middle dot] Bl is determined explicitly; it decreases monotonically with l and tends to zero exponentially fast as l --> [infinity]. We obtain rigorous lower bounds for the threshold value (l!) [middle dot] [alpha]c (the maximum possible value of (l!) [middle dot] [alpha] with [delta] unconstrained): 0.11 for l = 2 (compared to the actual value between 0.28 and 0.30 as estimated by Hopfield and by Amit, Gutfreund, and Sompolinsky), 0.22 for l = 3 and 0.16 for l = 4; as l --> [infinity], the bound tends to zero as fast as (l!) [middle dot] Bl.
ER -

Resulting XML file (fragment):

<?xml version="1.0"?>
<Records>
<Record>
<TypeOfReference>JOUR</TypeOfReference>
<TitlePrimary>Memory capacity in neural network models: Rigorous lower bounds</TitlePrimary>
<Journal>Neural Networks</Journal>
<Volume>1</Volume>
<Issue>3</Issue>
<StartPage>223</StartPage>
<EndPage>238</EndPage>
<PublicationYear>1988</PublicationYear>
<Authors>Newman, Charles M.</Authors>
<Url>http://www.sciencedirect.com/science/article/B6T08-482R8NT-4/2/666a550edeb8f860f5965e02fe0075a9</Url>
<Abstract>We consider certain simple neural network models of associative memory with N binary neurons and symmetric lth order synaptic connections, in which m randomly chosen N-bit patterns are to be stored and retrieved with a small fraction [delta] of bit errors allowed. Rigorous proofs of the following are presented both for l = 2 and l &gt;= 3: 1. 1. m can grow as fast as [alpha]Nl-1.2. 2. [alpha] can be as large as Bl/ln(1/[delta]) as [delta] --&gt; 0.3. 3. Retrieved memories overlapping with several initial patterns persist for (very) small [alpha].These phenomena were previously supported by numerical simulations or nonrigorous calculations. The quantity (l!) [middle dot] [alpha] represents the number of stored bits per distinct synapse. The constant (l!) [middle dot] Bl is determined explicitly; it decreases monotonically with l and tends to zero exponentially fast as l --&gt; [infinity]. We obtain rigorous lower bounds for the threshold value (l!) [middle dot] [alpha]c (the maximum possible value of (l!) [middle dot] [alpha] with [delta] unconstrained): 0.11 for l = 2 (compared to the actual value between 0.28 and 0.30 as estimated by Hopfield and by Amit, Gutfreund, and Sompolinsky), 0.22 for l = 3 and 0.16 for l = 4; as l --&gt; [infinity], the bound tends to zero as fast as (l!) [middle dot] Bl. </Abstract>
</Record>
</Records>

Resulting HTML file (fragment):

Newman, Charles M. Memory capacity in neural network models: Rigorous lower bounds. Vol. 1, Issue # 3, 1988, pp. 223-238
URL: http://www.sciencedirect.com/science/article/B6T08-482R8NT-4/2/666a550edeb8f860f5965e02fe0075a9
Abstract:
We consider certain simple neural network models of associative memory with N binary neurons and symmetric lth order synaptic connections, in which m randomly chosen N-bit patterns are to be stored and retrieved with a small fraction [delta] of bit errors allowed. Rigorous proofs of the following are presented both for l = 2 and l >= 3: 1. 1. m can grow as fast as [alpha]Nl-1.2. 2. [alpha] can be as large as Bl/ln(1/[delta]) as [delta] --> 0.3. 3. Retrieved memories overlapping with several initial patterns persist for (very) small [alpha].These phenomena were previously supported by numerical simulations or nonrigorous calculations. The quantity (l!) [middle dot] [alpha] represents the number of stored bits per distinct synapse. The constant (l!) [middle dot] Bl is determined explicitly; it decreases monotonically with l and tends to zero exponentially fast as l --> [infinity]. We obtain rigorous lower bounds for the threshold value (l!) [middle dot] [alpha]c (the maximum possible value of (l!) [middle dot] [alpha] with [delta] unconstrained): 0.11 for l = 2 (compared to the actual value between 0.28 and 0.30 as estimated by Hopfield and by Amit, Gutfreund, and Sompolinsky), 0.22 for l = 3 and 0.16 for l = 4; as l --> [infinity], the bound tends to zero as fast as (l!) [middle dot] Bl.
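
The HTML list above is one possible rendering of the intermediate XML. On .NET 2.0 a natural way to produce it is an XSLT transform; whether the tool actually works this way is an assumption, and the stylesheet name in this sketch is made up for illustration.

using System.Xml.Xsl;

class Xml2HtmlSketch
{
    static void Main(string[] args)
    {
        XslCompiledTransform xslt = new XslCompiledTransform();
        xslt.Load("ris2html.xslt");        // hypothetical stylesheet
        xslt.Transform(args[0], args[1]);  // e.g. records.xml -> records.html
    }
}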

[Screenshot: ris2xml, console version]

[Screenshot: winris2xml, Windows Forms version]
Copyright (c) 2005 Bashir Magomedov
http://shico.blogspot.com