Search results for: browser forensics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 112

22 Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories

Authors: Mariana Lima, Rodrigo Silva, Victor Stange, Teodiano Bastos

Abstract:

Due to the high reliability achieved by DNA tests, since the 1980s this kind of test has enabled identification in a growing number of criminal cases, including old unsolved cases that now have a chance to be solved with this technology. Currently, the use of genetic profiling databases is a standard method to broaden the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles for a growing number of samples, which requires time and considerable storage capacity. It is therefore essential to develop software tools capable of organizing the workflow and minimizing the time spent on both biological sample processing and the analysis of genetic profiles. The present work thus aims to develop a software system for forensic genetics laboratories that supports sample, criminal case, and local database management, minimizing the time spent in the workflow and helping to compare genetic profiles. For its development, all data related to the storage and processing of samples, as well as the workflows and requirements the system must incorporate, were considered. The system uses web technologies: HTML, CSS, and JavaScript, with Node.js as the server platform, which handles data input and output efficiently. The data are stored in a relational database (MySQL), which is free, favoring acceptance by users. The software system developed here brings more agility to the sample workflow and analysis, contributing to the rapid insertion of genetic profiles into the national database and to increased crime resolution. The next step of this research is validation, so that the system operates in accordance with current Brazilian national legislation.
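A minimal sketch of the kind of sample-registration endpoint such a system could expose, using Node.js with Express and the mysql2 client (the abstract names Node.js and MySQL; the route, table, and field names below are hypothetical illustrations, not the authors' published schema):

```typescript
// Hypothetical sketch only: a Node.js/Express endpoint registering a biological
// sample against a criminal case in MySQL. Table and column names are assumed.
import express from "express";
import mysql, { ResultSetHeader } from "mysql2/promise";

const app = express();
app.use(express.json());

const pool = mysql.createPool({ host: "localhost", user: "lab", database: "forensic_lab" });

// Register a sample and link it to an existing criminal case.
app.post("/samples", async (req, res) => {
  const { caseId, sampleType, collectedAt } = req.body;
  const [result] = await pool.execute<ResultSetHeader>(
    "INSERT INTO samples (case_id, sample_type, collected_at) VALUES (?, ?, ?)",
    [caseId, sampleType, collectedAt]
  );
  res.status(201).json({ sampleId: result.insertId });
});

app.listen(3000);
```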

Keywords: database, forensic genetics, genetic analysis, sample management, software solution

Procedia PDF Downloads 370
21 Preparedness for Microbial Forensics Evidence Collection on Best Practice

Authors: Victor Ananth Paramananth, Rashid Muniginin, Mahaya Abd Rahman, Siti Afifah Ismail

Abstract:

Safety issues, scene protection, and appropriate evidence collection must be handled at any biocrime scene. In any bio-incident or biocrime event, one or more scenes must be cordoned off for investigation. Evidence collection is critical in determining the type of microorganism or toxin, its lethality, and its source. Consequently, a proper sampling method is required from the start of the investigation. The most significant challenges for the crime scene officer are deciding where to obtain samples, choosing the best sampling method, and determining the sample sizes needed. Since evidence at a crime scene may be in liquid, viscous, or powder form, crime scene officers have difficulty determining which tools to use for sampling, and appropriate sampling tools are necessary to maximize sample collection. This study aims to assist the crime scene officer in collecting liquid, viscous, and powder biological samples in sufficient quantity while preserving sample quality. Observational tests on the collection of liquid, viscous, and powder samples, assessing quantity and quality, were performed using UV light. The density of the light emission varies with the collection method and sample type, so the best tools for collecting sufficient amounts of liquid, viscous, and powdered samples can be identified under UV light. Instead of active microorganisms, an invisible powder was used to assess the adequacy of sample collection with various collection tools during a simulated crime scene investigation. The liquid, powdered, and viscous samples collected using different tools were then analyzed by Fourier transform infrared - attenuated total reflectance (FTIR-ATR) spectroscopy, which is commonly used for rapid discrimination, classification, and identification of intact microbial cells. The liquid, viscous, and powdered samples collected using various tools were successfully observed using UV light. Furthermore, FTIR-ATR analysis showed that the collected samples were sufficient in quantity while preserving their quality.

Keywords: biological sample, crime scene, collection tool, UV light, forensic

Procedia PDF Downloads 195
20 A Web-Based Systems Immunology Toolkit Allowing the Visualization and Comparative Analysis of Publicly Available Collective Data to Decipher Immune Regulation in Early Life

Authors: Mahbuba Rahman, Sabri Boughorbel, Scott Presnell, Charlie Quinn, Darawan Rinchai, Damien Chaussabel, Nico Marr

Abstract:

Collections of large-scale datasets made available in public repositories can be used to identify and fill gaps in biomedical knowledge. But first, these data need to be made readily accessible to researchers for analysis and interpretation. Here, a collection of transcriptome datasets was made available to investigate the functional programming of human hematopoietic cells in early life. Thirty-two datasets were retrieved from the NCBI Gene Expression Omnibus (GEO) and loaded into a custom, interactive web application called the Gene Expression Browser (GXB), designed for visualization and query of integrated large-scale data. Multiple sample groupings and gene rank lists were created based on the study design and variables in each dataset. Web links to customized graphical views can be generated by users and subsequently used to present data graphically in manuscripts for publication. The GXB tool also enables browsing of a single gene across datasets, which can provide information on the role of a given molecule across biological systems. The dataset collection is available online. As a proof of principle, one of the datasets (GSE25087) was re-analyzed to identify genes that are differentially expressed by regulatory T cells in early life. Re-analysis of this dataset and a cross-study comparison using multiple other datasets in the above-mentioned collection revealed that PMCH, a gene encoding a precursor of melanin-concentrating hormone (MCH), a cyclic neuropeptide, is highly expressed in a variety of other hematopoietic cell types, including neonatal erythroid cells as well as plasmacytoid dendritic cells upon viral infection. Our findings suggest an as-yet-unrecognized role of MCH in immune regulation, thereby highlighting the unique potential of the curated dataset collection and systems biology approach to generate new hypotheses that can be tested in future mechanistic studies.
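To make the cross-dataset comparison concrete, the sketch below ranks genes by log2 fold change between two sample groups, the kind of group comparison GXB exposes through its interface; the expression values are hypothetical toy numbers, not data from GSE25087:

```typescript
// Illustrative sketch: rank genes by absolute log2 fold change between two
// sample groups. Values are hypothetical, not taken from GSE25087.
type GeneRow = { gene: string; groupA: number[]; groupB: number[] };

const mean = (xs: number[]): number => xs.reduce((s, x) => s + x, 0) / xs.length;

function rankByFoldChange(rows: GeneRow[]): { gene: string; log2fc: number }[] {
  return rows
    .map(({ gene, groupA, groupB }) => ({
      gene,
      log2fc: Math.log2(mean(groupB) / mean(groupA)),
    }))
    .sort((a, b) => Math.abs(b.log2fc) - Math.abs(a.log2fc));
}

// A PMCH-like signal: strongly up in group B, while a housekeeping gene is flat.
console.log(rankByFoldChange([
  { gene: "PMCH", groupA: [12, 10, 14], groupB: [95, 102, 88] },
  { gene: "ACTB", groupA: [500, 480, 510], groupB: [495, 505, 490] },
]));
```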

Keywords: early-life, GEO datasets, PMCH, interactive query, systems biology

Procedia PDF Downloads 296
19 Off-Line Text-Independent Arabic Writer Identification Using Optimum Codebooks

Authors: Ahmed Abdullah Ahmed

Abstract:

The task of recognizing the writer of a handwritten text has been an attractive research problem in the document analysis and recognition community, with applications in handwriting forensics, paleography, document examination, and handwriting recognition. This research presents an automatic method for writer recognition from digitized images of unconstrained writings. Although previous studies have made great efforts to devise various methods, their performance, especially in terms of accuracy, has fallen short, and room for improvement remains wide open. The proposed technique employs optimal-codebook-based writer characterization, where each writing sample is represented by a set of features computed from two codebooks: beginning and ending. Unlike most classical codebook-based approaches, which segment the writing into graphemes, this study fragments particular areas of the writing, namely the beginning and ending strokes. The proposed method starts with contour detection to extract significant information from the handwriting; curve fragmentation is then employed to divide the beginning and ending zones into small fragments. Similar fragments of beginning strokes are grouped together to create a beginning cluster and, similarly, the ending strokes are grouped to create an ending cluster. These two clusters lead to two codebooks (beginning and ending), built by choosing the center of each group of similar fragments. Writings under study are then represented by computing the probability of occurrence of codebook patterns, and this probability distribution is used to characterize each writer. Two writings are compared by computing distances between their respective probability distributions. Evaluations were carried out on the standard ICFHR dataset of 206 writers using the beginning and ending codebooks separately. The ending codebook achieved the highest identification rate, 98.23%, which is the best result so far on the ICFHR dataset.
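The core of the comparison step can be illustrated in a few lines: each writing becomes a probability distribution over codebook entries, and writers are matched by a distance between distributions. A minimal sketch, with hypothetical nearest-codebook-entry indices and a chi-square-style distance as one plausible choice (the abstract does not specify which distance the authors used):

```typescript
// Sketch of codebook-occurrence characterization. Fragment-to-codebook indices
// are hypothetical; the distance function is one common choice, assumed here.
function occurrenceDistribution(fragmentIds: number[], codebookSize: number): number[] {
  const counts = new Array(codebookSize).fill(0);
  for (const id of fragmentIds) counts[id] += 1;        // id = nearest codebook entry
  return counts.map((c) => c / fragmentIds.length);     // normalize to probabilities
}

// Chi-square-style distance between two probability distributions.
function distributionDistance(p: number[], q: number[]): number {
  return p.reduce((d, pi, i) => {
    const s = pi + q[i];
    return s === 0 ? d : d + (pi - q[i]) ** 2 / s;
  }, 0);
}

const writerA = occurrenceDistribution([0, 2, 2, 5, 7], 8);
const writerB = occurrenceDistribution([1, 2, 3, 5, 5], 8);
console.log(distributionDistance(writerA, writerB));    // smaller = more similar
```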

Keywords: off-line text-independent writer identification, feature extraction, codebook, fragments

Procedia PDF Downloads 512
18 The Impact of COVID-19 on Cybercrime in Hungary and Possible Solutions for Prevention

Authors: László Schmidt

Abstract:

Technological and digital innovation is constantly and dynamically evolving, which poses an enormous challenge to both lawmaking and law enforcement. To lawmaking, because artificial intelligence permeates many areas of people's daily lives that the legislator must regulate; one can see how challenging it is to regulate, for example, self-driving cars, taxis, and trucks, not to mention cryptocurrencies and ChatGPT, the use of which also requires legislative intervention. Artificial intelligence also poses an extraordinary challenge to law enforcement. In criminal cases, police and prosecutors can make great use of AI in investigations, e.g., in forensics, DNA analysis, reconstruction, identification, etc. It can also be of great help in the detection of crimes committed in cyberspace. Cybercrime can be viewed, on the one hand, as a new type of crime that can only be committed with the help of information systems and that has a specific protected legal object, such as an information system or data. On the other hand, it also includes traditional crimes that are much easier to commit with the help of new tools. According to Section 375 (1) of the Hungarian Criminal Code, any person who, for unlawful financial gain, introduces data into an information system, or alters or deletes data processed therein, or renders data inaccessible, or otherwise interferes with the functioning of the information system, and thereby causes damage, is guilty of a felony punishable by imprisonment not exceeding three years. The COVID-19 coronavirus epidemic has had a significant impact on our daily lives, and it was no different in the world of crime. With people staying at home for months, schools, restaurants, theatres, and cinemas closed, and no travel, criminals had to change their ways and committed crimes online in even greater numbers than before. These crimes were very diverse, ranging from false fundraising and the collection and misuse of personal data to extortion and fraud on various online marketplaces. The most vulnerable age groups (minors and the elderly) could be made more aware and prevented from becoming victims of this type of crime through targeted programmes. The aim of the study is to present Hungarian judicial practice in relation to cybercrime and possible preventive solutions.

Keywords: cybercrime, COVID-19, Hungary, criminal law

Procedia PDF Downloads 60
17 A Study of Anthropometric Correlation between Upper and Lower Limb Dimensions in Sudanese Population

Authors: Altayeb Abdalla Ahmed

Abstract:

Skeletal phenotype is a product of a balanced interaction between genetics and environmental factors throughout different life stages; therefore, interlimb proportions vary between populations. Although interlimb proportion indices have been used in anthropology to assess the influence of various environmental factors on limbs, an extensive literature review revealed a paucity of published research assessing correlations between limb parts and the possibility of reconstruction. Hence, this study aims to assess the relationships between upper and lower limb parts and to develop regression formulae to reconstruct the parts from one another. The left upper arm length, ulnar length, wrist breadth, hand length, hand breadth, tibial length, bimalleolar breadth, foot length, and foot breadth of 376 right-handed subjects, comprising 187 males and 189 females (aged 25-35 years), were measured. Initially, the data were analyzed using basic univariate analysis and independent t-tests; then sex-specific simple and multiple linear regression models were used to estimate upper limb parts from lower limb parts and vice versa. The results indicated significant sexual dimorphism for all variables and a significant correlation between the upper and lower limb parts (p < 0.01). Linear and multiple (stepwise) regression equations were developed to reconstruct the limb parts in the presence of a single dimension or multiple dimensions from the other limb; multiple stepwise regression equations generated better reconstructions than simple equations. These results are significant in forensics, as they can aid in the identification of multiple isolated limb parts, particularly during mass disasters and criminal dismemberment. Although DNA analysis is the most reliable tool for identification, its use faces multiple limitations in developing countries, e.g., cost, facility availability, and trained personnel. The findings also have important implications for plastic and orthopedic reconstructive surgery. This is the only reported study assessing the correlation and prediction capabilities between many of the upper and lower limb dimensions. The present study demonstrates a significant correlation between the interlimb parts in both sexes, which indicates the possibility of reconstruction using regression equations.
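As a toy illustration of the simple regression equations the study derives, the sketch below fits one limb dimension from another by ordinary least squares; the measurements are made-up values in millimetres, not the study's data:

```typescript
// Ordinary least-squares fit of y from x (simple linear regression).
// All numbers below are invented for illustration, not the study's data.
function fitLine(x: number[], y: number[]): { slope: number; intercept: number } {
  const n = x.length;
  const mx = x.reduce((s, v) => s + v, 0) / n;
  const my = y.reduce((s, v) => s + v, 0) / n;
  let sxy = 0;
  let sxx = 0;
  for (let i = 0; i < n; i++) {
    sxy += (x[i] - mx) * (y[i] - my);
    sxx += (x[i] - mx) ** 2;
  }
  const slope = sxy / sxx;
  return { slope, intercept: my - slope * mx };
}

// e.g. estimating tibial length from ulnar length (hypothetical values, mm):
const ulna = [255, 262, 270, 248, 281];
const tibia = [362, 370, 381, 355, 395];
const { slope, intercept } = fitLine(ulna, tibia);
console.log(`tibia ~ ${slope.toFixed(2)} * ulna + ${intercept.toFixed(1)}`);
```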

Keywords: anthropometry, correlation, limb, Sudanese

Procedia PDF Downloads 295
16 Repeatable Surface Enhanced Raman Spectroscopy Substrates from SERSitive for a Wide Range of Chemical and Biological Substances

Authors: Monika Ksiezopolska-Gocalska, Pawel Albrycht, Robert Holyst

Abstract:

Surface Enhanced Raman Spectroscopy (SERS) is a technique used to analyze very low concentrations of substances in solutions, even aqueous solutions, which is its advantage over IR. The technique can be used in pharmacy (to check the purity of products), forensics (to determine whether illegal substances were present at a crime scene), medicine (as a medical test), and much more. Given the high potential of this technique and its increasing popularity in analytical laboratories, and, at the same time, the absence of appropriate platforms enhancing the SERS signal (crucial for observing the Raman effect at low analyte concentrations in solution, around 1 ppm), we decided to develop our own SERS platforms. As the enhancing layer, we chose gold and silver nanoparticles, because these two have the best SERS properties and each has an affinity for a different range of analytes, which increases the range of research capabilities. The next step was commercialization, which resulted in the creation of the company 'SERSitive.eu', focused on the production of highly sensitive (EF = 10⁵-10⁶), homogeneous, and reproducible (70-80%) substrates. SERSitive SERS substrates are made by electrodeposition of silver or silver-gold nanoparticles. Through detailed analysis of studies optimizing parameters such as deposition time, temperature of the reaction solution, applied potential, reducing agent, and reagent concentrations, using a standard compound, p-mercaptobenzoic acid (PMBA), at a concentration of 10⁻⁶ M, we developed a high-performance process for depositing noble metal nanoparticles on the surface of ITO glass. To verify the quality of the SERSitive platforms, we examined a wide range of chemical compounds and biological substances. Apart from analytes with a strong affinity for metal surfaces (e.g., PMBA), we obtained very good results for those less suited to SERS measurements. We obtained intense and, more importantly, highly repeatable spectra for amino acids (phenylalanine, 10⁻³ M), drugs (amphetamine, 10⁻⁴ M), designer drugs (cathinone derivatives, 10⁻³ M), and medicines, as well as for bacteria (Listeria, Salmonella, Escherichia coli) and fungi.

Keywords: nanoparticles, Raman spectroscopy, SERS, SERS applications, SERS substrates, SERSitive

Procedia PDF Downloads 151
15 Exploiting the Potential of Fabric Phase Sorptive Extraction for Forensic Food Safety: Analysis of Food Samples in Cases of Drug Facilitated Crimes

Authors: Bharti Jain, Rajeev Jain, Abuzar Kabir, Torki Zughaibi, Shweta Sharma

Abstract:

Drug-facilitated crimes (DFCs) entail the use of a single drug or a mixture of drugs to incapacitate a victim. Traditionally, biological samples have been gathered from victims and analyzed to establish evidence of drug administration. Nevertheless, the rapid metabolism of various drugs and delays in analysis can impede the identification of such substances. Therefore, the present article describes a rapid, sustainable, highly efficient, and miniaturized protocol for the identification and quantification of three sedative-hypnotic drugs, namely diazepam (DZ), chlordiazepoxide (CDP), and ketamine (KET), in alcoholic beverages and complex food samples (cream biscuits, flavored milk, juice, cake, tea, sweets, and chocolate). The methodology uses fabric phase sorptive extraction (FPSE) to extract the three drugs; the extracts are then analyzed by gas chromatography-mass spectrometry (GC-MS). Several parameters, including the type of membrane, pH, agitation time and speed, ionic strength, sample volume, elution volume and time, and type of elution solvent, were screened and thoroughly optimized. Sol-gel Carbowax 20M (CW-20M) demonstrated the most effective extraction efficiency for the target analytes among all evaluated membranes. Under optimal conditions, the method displayed linearity within the range of 0.3-10 µg mL⁻¹ (or µg g⁻¹), with a coefficient of determination (R²) ranging from 0.996-0.999. The limits of detection (LODs) and limits of quantification (LOQs) for liquid samples were 0.020-0.069 µg mL⁻¹ and 0.066-0.22 µg mL⁻¹, respectively. Correspondingly, the LODs for solid samples ranged from 0.056-0.090 µg g⁻¹, while the LOQs ranged from 0.18-0.29 µg g⁻¹. The method showed good precision, with repeatability below 5% and reproducibility below 10%. Furthermore, the FPSE-GC-MS method proved effective in determining DZ in forensic food samples connected to DFCs. Additionally, the proposed method was evaluated for its whiteness using the RGB12 algorithm.
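For context on how figures like these are conventionally obtained, the sketch below applies the standard ICH-style estimators LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the standard deviation of the response and S the slope of the calibration curve; the abstract reports the limits but not the estimator actually used, so this is an assumption:

```typescript
// Conventional ICH-style LOD/LOQ from a calibration line (assumed estimator;
// the abstract does not state which one the authors applied). Toy values.
function lodLoq(sigmaResponse: number, slope: number): { lod: number; loq: number } {
  return {
    lod: (3.3 * sigmaResponse) / slope, // limit of detection
    loq: (10 * sigmaResponse) / slope,  // limit of quantification
  };
}

// e.g. residual SD of 0.012 a.u. and calibration slope of 0.55 a.u. per µg/mL:
console.log(lodLoq(0.012, 0.55)); // { lod: ~0.072, loq: ~0.218 } µg/mL
```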

Keywords: drug facilitated crime, fabric phase sorptive extraction, food forensics, white analytical chemistry

Procedia PDF Downloads 70
14 Estimating the Ladder Angle and the Camera Position From a 2D Photograph Based on Applications of Projective Geometry and Matrix Analysis

Authors: Inigo Beckett

Abstract:

In forensic investigations, it is often the case that the most potentially useful recorded evidence derives from coincidental imagery, recorded immediately before or during an incident, and that during the incident (e.g., a 'failure' or fire event) the evidence is changed or destroyed. To an image analysis expert involved in photogrammetric analysis for civil or criminal proceedings, traditional computer vision methods involving calibrated cameras are often not appropriate because image metadata cannot be relied upon. This paper presents an approach for resolving this problem, considering in particular, by way of a case study, the angle of a simple ladder shown in a photograph. The UK Health and Safety Executive (HSE) guidance document published in 2014 (INDG455) advises that a leaning ladder should be erected at 75 degrees to the horizontal axis. Personal injury cases can arise in the construction industry because a ladder is too steep or too shallow, and ad-hoc photographs of such ladders in their incident position provide a basis for analysis of their angle. This paper presents a direct approach for ascertaining the position of the camera and the angle of the ladder simultaneously from the photograph(s), via a workflow that encompasses a novel application of projective geometry and matrix analysis. Mathematical analysis shows that, for a given pixel ratio of directly measured collinear points (i.e., features that lie on the same line segment) in the 2D digital photograph with respect to a given viewing point, we can constrain the 3D camera position to the surface of a sphere in the scene. Depending on what we know about the ladder, we can enforce another independent constraint on the possible camera positions, which narrows them down even further. Experiments were conducted using synthetic and real-world data. The synthetic data modeled a ladder standing on a horizontally flat plane and resting against a vertical wall. The real-world data were captured using an Apple iPhone 13 Pro and 3D laser scan survey data, whereby a ladder was placed at a known location and angle to the vertical axis. For each case, we calculated camera positions and ladder angles using this method and cross-compared them against their respective 'true' values.
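As a small illustration of the final verification step: once the ladder's foot and top contact points have been recovered in scene coordinates (the hard part the paper addresses through projective geometry), the lean angle follows from elementary trigonometry. A sketch with hypothetical coordinates:

```typescript
// Lean angle of a ladder from its endpoints in scene coordinates (metres).
// Coordinates are hypothetical; recovering them is the paper's contribution.
type Point3 = { x: number; y: number; z: number };

function ladderAngleDegrees(foot: Point3, top: Point3): number {
  const rise = top.z - foot.z;                             // vertical component
  const run = Math.hypot(top.x - foot.x, top.y - foot.y);  // horizontal component
  return (Math.atan2(rise, run) * 180) / Math.PI;          // angle to the horizontal
}

const angle = ladderAngleDegrees({ x: 0, y: 0, z: 0 }, { x: 1.2, y: 0, z: 4.5 });
console.log(`${angle.toFixed(1)} degrees; HSE INDG455 advises about 75`);
```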

Keywords: image analysis, projective geometry, homography, photogrammetry, ladders, forensics, mathematical modeling, planar geometry, matrix analysis, collinear, cameras, photographs

Procedia PDF Downloads 52
13 STML: Service Type-Checking Markup Language for Services of Web Components

Authors: Saqib Rasool, Adnan N. Mian

Abstract:

Web components were introduced in the HTML5 standard for writing modular web interfaces, ensuring maintainability through their isolated scope. Reusability can also be achieved by sharing plug-and-play web components that other developers can use off the shelf. A web component encapsulates all the required HTML, CSS, and JavaScript code as a standalone package, which must be imported to integrate the component into an existing web interface; the component is then integrated with web services for dynamically populating its content. Since web components are reusable off-the-shelf components, they must be equipped with some mechanism for ensuring their proper integration with web services. The consistency of a service's behavior can be verified through type checking, one of the popular solutions for improving code quality in many programming languages. However, HTML does not provide type checking, as it is a markup language and not a programming language. The contribution of this work is a new extension of HTML called Service Type-checking Markup Language (STML), which adds type-checking support in HTML for JSON-based REST services. STML can be used to define the expected data types of responses from JSON-based REST services that will populate the content of a web component's HTML elements. Although JSON has five data types, viz. string, number, boolean, object, and array, STML supports only string, number, and boolean, since both objects and arrays are treated as strings when populated in HTML elements. To define the data type of any HTML element, a developer just needs to add the custom STML attributes st-string, st-number, or st-boolean for string, number, and boolean, respectively. These STML annotations are added by the developer writing a web component, and they enable other developers to use automated type checking to ensure the proper integration of their REST services with the same web component. Two utilities have been written for developers using STML-based web components. The first performs automated type checking during the development phase: it uses the browser console to show an error description if an integrated web service does not return a response with the expected data type. The second is a Gulp-based command-line utility for removing the STML attributes before going into production, ensuring the delivery of STML-free web pages in the production environment. Both utilities have been tested for type checking of REST services through STML-based web components, and the results confirm the feasibility of evaluating service behavior through HTML alone. Currently, STML is designed for automated type checking of integrated REST services, but it could be extended into a complete HTML-only service testing suite, transforming STML from a Service Type-checking Markup Language into a Service Testing Markup Language.
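A rough sketch of the checking logic such a development-time utility might apply, assuming the st-* attributes described above (the actual STML implementation is not published in the abstract):

```typescript
// Assumed checking logic for STML annotations (st-string, st-number, st-boolean
// are from the abstract; everything else here is an illustrative guess).
function checkStmlType(el: Element, value: unknown): void {
  const expected = el.hasAttribute("st-string") ? "string"
    : el.hasAttribute("st-number") ? "number"
    : el.hasAttribute("st-boolean") ? "boolean"
    : null;
  if (expected !== null && typeof value !== expected) {
    console.error(`STML: expected ${expected}, got ${typeof value}`, el);
  } else {
    el.textContent = String(value); // populate the element as usual
  }
}

// Hypothetical usage against a JSON REST response:
// fetch("/api/user").then(r => r.json()).then(user => {
//   checkStmlType(document.querySelector("#age[st-number]")!, user.age);
// });
```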

Keywords: REST, STML, type checking, web component

Procedia PDF Downloads 254
12 Persistent Ribosomal In-Frame Mis-Translation of Stop Codons as Amino Acids in Multiple Open Reading Frames of a Human Long Non-Coding RNA

Authors: Leonard Lipovich, Pattaraporn Thepsuwan, Anton-Scott Goustin, Juan Cai, Donghong Ju, James B. Brown

Abstract:

Two-thirds of human genes do not encode any known proteins. Aside from long non-coding RNA (lncRNA) genes with recently-discovered functions, the ~40,000 non-protein-coding human genes remain poorly understood, and a role for their transcripts as de-facto unconventional messenger RNAs has not been formally excluded. Ribosome profiling (Riboseq) predicts translational potential, but without independent evidence of proteins from lncRNA open reading frames (ORFs), ribosome binding of lncRNAs does not prove translation. Previously, we mass-spectrometrically documented translation of specific lncRNAs in human K562 and GM12878 cells. We now examined lncRNA translation in human MCF7 cells, integrating strand-specific Illumina RNAseq, Riboseq, and deep mass spectrometry in biological quadruplicates performed at two core facilities (BGI, China; City of Hope, USA). We excluded known-protein matches. UCSC Genome Browser-assisted manual annotation of imperfect (tryptic-digest-peptides)-to-(lncRNA-three-frame-translations) alignments revealed three peptides hypothetically explicable by 'stop-to-nonstop' in-frame replacement of stop codons by amino acids in two ORFs of the lncRNA MMP24-AS1. To search for this phenomenon genomewide, we designed and implemented a novel pipeline, matching tryptic-digest spectra to wildcard-instead-of-stop versions of repeat-masked, six-frame, whole-genome translations. Along with singleton putative stop-to-nonstop events affecting four other lncRNAs, we identified 24 additional peptides with stop-to-nonstop in-frame substitutions from multiple positive-strand MMP24-AS1 ORFs. Only UAG and UGA, never UAA, stop codons were impacted. All MMP24-AS1-matching spectra met the same significance thresholds as high-confidence known-protein signatures. Targeted resequencing of MMP24-AS1 genomic DNA and cDNA from the same samples did not reveal any mutations, polymorphisms, or sequencing-detectable RNA editing. This unprecedented apparent gene-specific violation of the genetic code highlights the importance of matching peptides to whole-genome, not known-genes-only, ORFs in mass-spectrometry workflows, and suggests a new mechanism enhancing the combinatorial complexity of the proteome. Funding: NIH Director’s New Innovator Award 1DP2-CA196375 to LL.
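To make the pipeline idea concrete, the sketch below performs a six-frame translation in which every stop codon is emitted as a wildcard residue ('X' here) rather than a terminator, so that downstream peptide matching can tolerate stop-to-nonstop substitution; this is an illustrative reconstruction of the idea, not the authors' published code:

```typescript
// Six-frame translation with stop codons emitted as a wildcard ("X"), so that
// peptide matching can tolerate stop-to-nonstop substitution. Illustrative
// reconstruction of the pipeline idea, not the authors' code.
const BASES = "TCAG";
const AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG";

function translateWildcard(frame: string): string {
  let out = "";
  for (let i = 0; i + 3 <= frame.length; i += 3) {
    const idx =
      BASES.indexOf(frame[i]) * 16 +
      BASES.indexOf(frame[i + 1]) * 4 +
      BASES.indexOf(frame[i + 2]);
    const aa = AA[idx];
    out += aa === "*" ? "X" : aa; // stop codon becomes a wildcard, not a break
  }
  return out;
}

const COMP: Record<string, string> = { A: "T", T: "A", C: "G", G: "C" };
const revComp = (s: string) => [...s].reverse().map((b) => COMP[b]).join("");

function sixFrameWildcardTranslations(dna: string): string[] {
  const frames: string[] = [];
  for (const strand of [dna, revComp(dna)])
    for (let offset = 0; offset < 3; offset++)
      frames.push(translateWildcard(strand.slice(offset)));
  return frames;
}

console.log(sixFrameWildcardTranslations("ATGAAATAGGGC")); // TAG shows up as "X"
```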

Keywords: genetic code, lncRNA, long non-coding RNA, mass spectrometry, proteogenomics, ribo-seq, ribosome, RNAseq

Procedia PDF Downloads 235
11 Opportunities of Forensic Biology in the Study of Sperm Traces after Washing

Authors: Saule Musabekova

Abstract:

Achievements of modern science, especially genetics, have sharply intensified the process of proof. Traces, even those subjected to deliberate destruction, are sources of evidentiary information on the circumstances of a crime and the persons who committed it. Currently, alongside the overall growth in the number of crimes against sexual inviolability or sexual freedom, the proportion of crimes in which perpetrators use detergents to destroy traces has increased. A characteristic feature of modern synthetic detergents is the presence of biological additives, namely enzymes that break down and gradually destroy stains of protein origin. To study the influence of modern washing powders, semen stains were deposited on various kinds of fabric, prepared in advance with the sperm of men of different ABO groups. Washing machines from known household appliance manufacturers, with different production characteristics, were used to wash the various fabrics bearing semen stains. After washing, the fabrics were examined for visually preserved semen stains, for surviving sperm cells or their elements, and for the possibilities of group diagnostics by the ABO system or molecular-genetic identification. Subsequent study of these stains by the morphological method showed that reliable detection of morphologically intact sperm cells is not possible. Further examination of these traces gave weakly positive results in 30% of cases with the immunoassay test PSA SEMIQUANT. The percentage of positive results obtained from semen traces deposited on natural-fiber fabrics was higher than from traces on synthetic fabrics. For 3% of the semen traces confirmed by the PSA test, it was possible to establish a genetic profile and obtain positive findings from the molecular-genetic examination; in the other cases, there was not a sufficient amount of material for DNA identification. The results of this research and of practical expert studies show that, in most cases, identification from washed sperm traces is not possible. This is a consequence of the action of the biological additives contained in modern detergents, together with other external factors, under which the DNA undergoes irreversible degradation. Molecular-genetic methods can nonetheless partially solve the problems arising in the study of laundered physical evidence for the disclosure and investigation of crimes.

Keywords: study of sperm, modern detergents, washing powders, forensic medicine

Procedia PDF Downloads 298
10 Automated System: Managing the Production and Distribution of Radiopharmaceuticals

Authors: Shayma Mohammed, Adel Trabelsi

Abstract:

Radiopharmacy is the art of preparing high-quality radioactive medicinal products for use in diagnosis and therapy. Unlike normal medicines, radiopharmaceuticals have a dual aspect (radioactive and medicinal) that makes their management highly critical. One of the most convincing applications of modern technology is the ability to delegate repetitive tasks to programming scripts; automation has found its way into even highly skilled jobs, improving overall performance by allowing human workers to focus on more important tasks than document filling. This project aims to implement a comprehensive system ensuring rigorous management of radiopharmaceuticals through a platform that links the Nuclear Medicine Service Management System to the Nuclear Radiopharmacy Management System, in accordance with the recommendations of the World Health Organization (WHO) and the International Atomic Energy Agency (IAEA). We build a web application targeting radiopharmacies; the platform is built atop the inherently compatible web stack, which allows it to work in virtually any environment. Different technologies are used in this project (PHP, Symfony, MySQL Workbench, Bootstrap, Angular 7, Visual Studio Code, and TypeScript). The operating principle of the platform is based on two parts: a radiopharmaceutical back office for the radiopharmacist, who is responsible for preparing radiopharmaceutical products and delivering them, and a medical back office for the doctor, who holds the authorization for the possession and use of radionuclides and is responsible for ordering radioactive products. The application consists of seven modules: Production, Quality Control/Quality Assurance, Release, General Management, References, Transport, and Stock Management. It supports eight classes of users: Production Manager (PM), Quality Control Manager (QCM), Stock Manager (SM), General Manager (GM), Client (Doctor), Parking and Transport Manager (PTM), Qualified Person (QP), and Technical and Production Staff. As a digital platform bringing together all players involved in the use of radiopharmaceuticals and integrating the stages of preparation, production, and distribution, web technologies in particular promise to offer all the benefits of automation while requiring no more than a web browser as the user client - a strength, since the web stack is by nature multi-platform. The platform provides a traceability system for radiopharmaceutical products to ensure the safety and radioprotection of both practitioners and patients. The new integrated platform is an alternative to writing all the boilerplate paperwork manually, a tedious and error-prone task; it minimizes manual human manipulation, which has proven to be the main source of error in nuclear medicine. A codified electronic transfer of information from radiopharmaceutical preparation to delivery further reduces the risk of maladministration.
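As a sketch of the kind of traceability record such a platform might pass between the production and medical back offices (the stages mirror modules named in the abstract, but every type and field name below is hypothetical):

```typescript
// Hypothetical traceability chain for a radiopharmaceutical batch. Stage names
// mirror the abstract's modules; fields and values are illustrative only.
type Stage = "PRODUCTION" | "QUALITY_CONTROL" | "RELEASE" | "TRANSPORT" | "DELIVERED";

interface BatchEvent {
  batchId: string;
  stage: Stage;
  actor: string;       // e.g. "PM" (Production Manager), "QP" (Qualified Person)
  activityMBq: number; // measured radioactivity at this step
  timestamp: string;   // ISO 8601; needed later for decay correction
}

const trail: BatchEvent[] = [];
const record = (ev: BatchEvent): void => {
  trail.push(ev); // append-only log forms the traceability chain
};

record({ batchId: "TC99M-0421", stage: "PRODUCTION", actor: "PM", activityMBq: 750, timestamp: "2024-05-02T06:10:00Z" });
record({ batchId: "TC99M-0421", stage: "RELEASE", actor: "QP", activityMBq: 742, timestamp: "2024-05-02T07:05:00Z" });
console.log(trail);
```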

Keywords: automated system, management, radiopharmacy, technical papers

Procedia PDF Downloads 156
9 Forensic Nursing in the Emergency Department: The Overlooked Roles

Authors: E. Tugba Topcu

Abstract:

The emergency services are usually the first places to encounter forensic cases. Hence, it is important to consider forensics from the perspective of emergency services staff, together with the physiological and psychological consequences that may arise from a person's own behaviour or that of another. Accurate and detailed documentation of the condition in which the patient first arrives at the emergency service, and preservation of the forensic findings, are pivotal for the subsequent forensic investigation. The first step in determining whether a forensic case exists is the medical examination of the patient, and for each individual suspected to be part of a forensic case, police officers should be informed at the same time as the medical examination is being conducted. Violent events are increasing every year, and with the increase in the number of forensic cases, emergency service workers carry growing responsibility and consequently play a key role in protecting, collecting, and arranging forensic evidence. In addition, because emergency service workers involved in forensic events typically have information about the accused and/or the victim, as well as evidence related to the events and the cause of injuries, police officers often require their testimony. However, neither nurses nor other health care personnel typically have adequate expertise in forensic medicine. Emergency nurses should take an active role in determining whether a patient admitted to the emergency service with an injury is a clinical forensic patient whose case may entail criminal proceedings; knowing their roles and responsibilities in this area provides legal protection for the nurse as well as protection of the judicial process. Particularly in emergency services, where rapid patient turnover and high workloads exist, patient registration and case reporting may be incomplete; in such instances, the witnesses, typically the nurses, are often consulted for information. Knowledge of forensic medical matters plays a vital role in achieving justice. According to the Criminal Procedure Law, Article 75, Paragraph 3, 'an internal body examination or the taking of blood or other biological samples from the body can be performed only by a doctor or other health professional member'. In line with this provision, the clinic nurse and doctor are mainly responsible for evaluating forensic cases in emergency departments, performing the examination, collecting evidence, and storing and reporting data. The courts place considerable importance on determining whether a suspect is the victim or the accused, and thus, in terms of illuminating events, it is crucial that any evidence is gathered carefully and appropriately. All evidence related to the forensic case, including the forensic report, should be handed over to the police officers. In instances where forensic evidence cannot be collected elsewhere and the hospital environment is the only place to obtain it, health care personnel in emergency services need knowledge of the recognition of forensic evidence, its collection, its secure storage, and the maintenance of the chain of custody.

Keywords: emergency department, emergency nursing, forensic cases, forensic nursing

Procedia PDF Downloads 252
8 The Shape of the Sculptor: Exploring Psychologist’s Perceptions of a Model of Parenting Ability to Guide Intervention in Child Custody Evaluations in South Africa

Authors: Anthony R. Townsend, Robyn L. Fasser

Abstract:

This research project provides an interpretative phenomenological analysis of a proposed conceptual model of parenting ability designed to offer recommendations to guide intervention in child custody evaluations in South Africa. A recent review of the literature on child custody evaluations reveals that while there have been significant and valuable shifts in the capacity of the legal system, aided by mental health professionals, to understand children and family dynamics, there remains a conceptual gap regarding the nature of parenting ability. To address this paucity of a theoretical basis for considering parenting ability, this research project reviews a dimensional model for the assessment of parenting ability that conceives of parenting ability as a combination of good parenting and parental fitness. This model serves as a conceptual framework to guide child custody evaluation and refine intervention in such cases to better meet the best interests of the child, in a manner that bridges the professional gap between parties, legal entities, and mental health professionals. Using a model of good parenting as a point of theoretical departure, this model incorporates both intra-psychic and interpersonal attributes and behaviours of parents to form an impression of parenting ability and identify areas for potential enhancement. This research, therefore, hopes to achieve the following: (1) to provide nuanced descriptions of parents' parenting ability; (2) to describe parents' parenting potential; (3) to provide a parenting assessment tool for investigators in forensic family matters that will enable more useful recommendations and interventions; (4) to develop a language of consensus for investigators, attorneys, judges, and parents in forensic family matters as to what comprises parenting ability and how it can be assessed; and (5) that all of the aforementioned will serve to advance the best interests of the children involved in such litigious matters. The evaluative promise and post-assessment prospects of this model are illustrated through three interlinking data sets: (1) the results of interviews with South African psychologists about the model, (2) a retrospective analysis of care and contact evaluation reports using the model, to determine whether different conclusions or more specific recommendations are generated with its use, and (3) the results of an interview with a psychologist who piloted this model by using it in a care and contact evaluation.

Keywords: alienation, attachment, best interests of the child, care and contact evaluation, children’s act (38 of 2005), child custody evaluation, civil forensics, gatekeeping, good parenting, good-enough parenting, health professions council of South Africa, family law, forensic mental healthcare practitioners, parental fitness, parenting ability, parent management training, parenting plan, problem-determined system, psychotherapy, support of other child-parent relationship, voice of the child

Procedia PDF Downloads 115
7 Exploring 3-D Virtual Art Spaces: Engaging Student Communities Through Feedback and Exhibitions

Authors: Zena Tredinnick-Kirby, Anna Divinsky, Brendan Berthold, Nicole Cingolani

Abstract:

Faculty members from The Pennsylvania State University, Zena Tredinnick-Kirby, Ph.D., and Anna Divinsky are at the forefront of an innovative educational approach to improving access in asynchronous online art courses. Their pioneering work weaves in virtual reality (VR) technologies to construct a more equitable educational experience for students by transforming their learning and engagement. The significance of their study lies in the need to bridge the digital divide in online art courses, making them more inclusive and interactive for all distance learners. In an era where conventional classroom settings are no longer the sole means of instruction, Tredinnick-Kirby and Divinsky harness instructional technologies to break down geographical barriers by incorporating an interactive VR experience that facilitates community building within an online environment, transcending physical constraints. The methodology adopted by Tredinnick-Kirby and Divinsky is centered around integrating 3D virtual spaces into their art courses. Spatial.io, a virtual world platform, enables students to develop digital avatars and visit virtual art museums through a free browser-based program or an Oculus headset, where they can interact with other visitors and critique each other's artwork. The goal is not only to provide students with an engaging and immersive learning experience but also to give them a more profound understanding of the language of art criticism and technology. Furthermore, the study aims to cultivate critical thinking skills among students and foster a collaborative spirit. By leveraging cutting-edge VR technology, students are encouraged to explore the possibilities of their field, experimenting with innovative tools and techniques. This approach not only enriches their learning experience but also prepares them for a dynamic and ever-evolving art landscape in technology and education. One of the fundamental objectives of Tredinnick-Kirby and Divinsky is to remodel how feedback is derived through peer-to-peer art critique. Through the inclusion of 3D virtual spaces in the curriculum, students now have the opportunity to install their final artwork in a virtual gallery space and incorporate peer feedback, enabling them to exhibit their work and opening the door to a collaborative and interactive process. Students can provide constructive suggestions, engage in discussions, and integrate peer commentary into the development of their ideas and praxis. This approach not only accelerates the learning process but also promotes a sense of community and growth. In summary, the study conducted by Penn State faculty members Zena Tredinnick-Kirby and Anna Divinsky represents an innovative use of technology in their courses. By incorporating 3D virtual spaces, they are enriching the learners' experience. Through this inventive pedagogical technique, they nurture critical thinking, collaboration, and the practical application of cutting-edge technology in art. This research holds great promise for the future of online art education, transforming it into a dynamic, inclusive, and interactive experience that transcends the confines of distance learning.

Keywords: art, community building, distance learning, virtual reality

Procedia PDF Downloads 71
6 Water Monitoring Sentinel Cloud Platform: Water Monitoring Platform Based on Satellite Imagery and Modeling Data

Authors: Alberto Azevedo, Ricardo Martins, André B. Fortunato, Anabela Oliveira

Abstract:

Water is under severe threat today because of the rising population, increased agricultural and industrial needs, and the intensifying effects of climate change. Due to sea-level rise, erosion, and demographic pressure, the coastal regions are of significant concern to the scientific community. The Water Monitoring Sentinel Cloud platform (WORSICA) is a service focused on providing new tools for monitoring water in coastal and inland areas, taking advantage of remote sensing, in situ, and tidal modeling data. WORSICA can be used to determine the coastline, coastal inundation areas, and the limits of inland water bodies using remote sensing (satellites and Unmanned Aerial Vehicles - UAVs) and in situ data (from field surveys). It applies to various purposes, from determining flooded areas (from rainfall, storms, hurricanes, or tsunamis) to detecting large water leaks in major water distribution networks. The service was built on components developed in national and European projects, integrated to provide a one-stop-shop service for remote sensing information: it integrates data from the Copernicus satellites and drones/unmanned aerial vehicles, validated against existing online in situ data. Since WORSICA operates on the European Open Science Cloud (EOSC) computational infrastructure, the service can be accessed via a web browser and is freely available to all European public research groups without additional costs. The private sector will also be able to use the service, though some usage costs may apply depending on the type of computational resources needed by each application/user. Although the service has three main sub-services - i) coastline detection, ii) inland water detection, and iii) water leak detection in irrigation networks - the present study shows an application of the service to the Óbidos lagoon in Portugal, where the user can monitor the evolution of the lagoon inlet and estimate the topography of the intertidal areas without any additional costs. The service implements several distinct methodologies based on the computation of water indices (e.g., NDWI, MNDWI, AWEI, and AWEIsh) retrieved from satellite image processing. In conjunction with tidal data obtained from the FES model, the system can estimate a coastline at the corresponding water level, or even the topography of the intertidal areas, based on the Flood2Topo methodology. The outcomes of the WORSICA service can be helpful in several intervention areas, such as i) emergency response, by providing fast delineation of inundated areas to support rescue operations; ii) support of management decisions on hydraulic infrastructure operation to minimize damage downstream; iii) climate change mitigation, by minimizing water losses and reducing water mains operation costs; and iv) early detection of water leakages in difficult-to-access irrigation networks, promoting their fast repair.
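To illustrate the index-computation step, the sketch below evaluates the NDWI in McFeeters' formulation, (Green - NIR) / (Green + NIR), with a simple threshold for water masking; the reflectance values are hypothetical, and real inputs would be Sentinel-2 band rasters (e.g., B3 for green, B8 for NIR):

```typescript
// NDWI (McFeeters): (green - nir) / (green + nir); values above ~0 suggest water.
// Reflectances below are hypothetical toy pixels, not real Sentinel-2 data.
function ndwi(green: number[], nir: number[]): number[] {
  return green.map((g, i) => (g - nir[i]) / (g + nir[i]));
}

function waterMask(index: number[], threshold = 0): boolean[] {
  return index.map((v) => v > threshold);
}

const green = [0.08, 0.12, 0.03]; // e.g. Sentinel-2 band B3
const nir = [0.02, 0.04, 0.30];   // e.g. Sentinel-2 band B8
console.log(waterMask(ndwi(green, nir))); // [true, true, false]
```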

Keywords: remote sensing, coastline detection, water detection, satellite data, sentinel, Copernicus, EOSC

Procedia PDF Downloads 126
5 Functionalizing Gold Nanostars with Ninhydrin as Vehicle Molecule for Biomedical Applications

Authors: Swati Mishra

Abstract:

In recent years, there has been an explosion in gold nanoparticle (GNP) research, with a rapid increase in publications in diverse fields, including imaging, bioengineering, and molecular biology. GNPs exhibit unique physicochemical properties, including surface plasmon resonance (SPR), and bind amine and thiol groups, allowing surface modification and use in biomedical applications. Nanoparticle functionalization is the subject of intense research at present, with rapid progress towards developing biocompatible, multi-functional particles. In the present study, a photochemical method was used to functionalize variously shaped GNPs, such as nanostars, with molecules like ninhydrin. Ninhydrin is bactericidal, virucidal, fungicidal, antigen-antibody reactive, and used in fingerprint technology in forensics. GNPs efficiently functionalized with ninhydrin will bind to the amino acids on a target protein, which is of eminent importance during the pandemic, especially where long-term treatments of COVID-19 bring many drug side effects. The photochemical method was adopted because it provides low thermal load, selective reactivity, selective activation, and radiation controlled in time, space, and energy. The GNPs exhibit their characteristic spectrum, but a distinct blue or red shift in the peak is observed after UV irradiation, indicating efficient ninhydrin binding. The bound ninhydrin in the GNP carrier, upon chemically reacting with any amino acid, will then lead to the formation of Ruhemann's purple. A common method of GNP production is citrate reduction of Au(III) derivatives such as chloroauric acid (HAuCl₄) in water to Au(0), a one-step synthesis of size-tunable GNPs. The following reagents were prepared to validate the approach:
Reagent A: solution 1, i.e., 0.0175 g ninhydrin in 5 ml Millipore water
Reagent B: 30 µl of HAuCl₄·3H₂O in 3 ml of solution 1
Reagent C: 1 µl of gold nanostars in 3 ml of solution 1
Reagent D: 6 µl of cetrimonium bromide (CTAB) in 3 ml of solution 1
Reagent E: 1 µl of gold nanostars in 3 ml of ethanol
Reagent F: 30 µl of HAuCl₄·3H₂O in 3 ml of ethanol
Reagent G: 30 µl of HAuCl₄·3H₂O in 3 ml of solution 2
Reagent H: solution 2, i.e., 0.0087 g ninhydrin in 5 ml Millipore water
Reagent I: 30 µl of HAuCl₄·3H₂O in 3 ml of water
The reagents were irradiated at 254 nm for 15 minutes, followed by UV-visible spectroscopy. The wavelength was selected based on the one reported for excitation of a similar molecule, phthalimide. Solutions B and G were observed to deviate around 600 nm, while C peaks distinctly at 567.25 nm and 983.9 nm. Though it is difficult to specify the chemical reaction taking place, ATR-FTIR of the reagents will confirm that ninhydrin does not form Ruhemann's purple in the absence of amino acids. Through these experiments, we achieved the functionalization of gold nanostars with ninhydrin, corroborated by the deviation in the spectrum obtained when a mixture of GNPs and ninhydrin was irradiated with UV light. This prepares them as carrier molecules to take up amino acids for targeted delivery or germicidal action.
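The spectral check described above reduces to locating the SPR absorbance maximum before and after irradiation and reporting the direction of the shift; a trivial sketch with hypothetical wavelength and absorbance values:

```typescript
// Locate the UV-Vis absorbance peak and report the shift after irradiation.
// Wavelengths (nm) and absorbances below are hypothetical toy spectra.
function peakWavelength(wavelengthsNm: number[], absorbance: number[]): number {
  let best = 0;
  for (let i = 1; i < absorbance.length; i++) {
    if (absorbance[i] > absorbance[best]) best = i;
  }
  return wavelengthsNm[best];
}

const wl = [520, 540, 560, 567, 580, 600];
const before = [0.40, 0.55, 0.62, 0.60, 0.50, 0.41];
const after = [0.35, 0.48, 0.60, 0.66, 0.52, 0.43];
const shift = peakWavelength(wl, after) - peakWavelength(wl, before);
console.log(`${shift > 0 ? "red" : "blue"} shift of ${Math.abs(shift)} nm`);
```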

Keywords: gold nanostars, ninhydrin, photochemical method, UV-visible spectroscopy

Procedia PDF Downloads 148
4 Biodegradation Effects onto Source Identification of Diesel Fuel Contaminated Soils

Authors: Colin S. Chen, Chien-Jung Tien, Hsin-Jan Huang

Abstract:

In weathering studies, changes in the chemical constituents of diesel-contaminated soils caused by biodegradation are important factors to consider, especially over prolonged weathering periods. The objective was to evaluate biodegradation effects on the hydrocarbon fingerprints and distribution patterns of diesel fuels, fuel source screening and differentiation, source-specific marker compounds, and diagnostic ratios of diesel fuel constituents, through laboratory and field studies. Biodegradation processes in diesel-contaminated soils were evaluated in experiments lasting 15 and 12 months, respectively. The degradation of diesel fuel in top soils was affected by the organic carbon content and the biomass of microorganisms in the soils: higher depletion of total petroleum hydrocarbons (TPH), n-alkanes, and polynuclear aromatic hydrocarbons (PAHs) and their alkyl homologues was observed in soils containing higher organic carbon content and biomass. Decreased ratios of n-alkanes to selected isoprenoids (pristane (Pr) and phytane (Ph)), namely n-C17/pristane and n-C18/phytane, were observed. The pristane/phytane ratio remained consistent for a longer period, although a decrease was observed by the end of the experiment. Biomarker compounds of the bicyclic sesquiterpanes (BS) were less susceptible to the effects of biodegradation. The ratios of characteristic factors such as C15 sesquiterpane/8β(H)-drimane (BS3/BS5), C15 sesquiterpane/8β(H)-drimane (BS4/BS5), 8β(H)-drimane/8β(H)-homodrimane (BS5/BS10), and C15 sesquiterpane/8β(H)-homodrimane (BS3/BS10) could be adopted for source identification of diesel fuels in top soil. However, for biodegradation processes lasting six months but shorter than nine months, only BS3/BS5 and BS3/BS10 could distinguish the two diesel fuels. In subsoil experiments (contaminated soil located 50 cm below the surface), the ratios BS3/BS5, BS4/BS5, and BS5/BS10 remained valid for source identification of the two diesel fuels over nine months of biodegradation. At the early stage of contamination, soil biomass decreased significantly; nevertheless, 6 and 7 dominant species were found in the two soils in the top soil experiments, respectively. With less oxygen and fewer nutrients, lower microbial biomass was observed in the subsoils, where only 2 and 4 diesel-degrading species of microorganisms were identified in the two soils, respectively. Double-ratio parameters such as fluorene/C1-fluorene : C2-phenanthrene/C3-phenanthrene (C0F/C1F:C2P/C3P) in both top soil and subsoil, C2-naphthalene/C2-phenanthrene : C1-phenanthrene/C3-phenanthrene (C2N/C2P:C1P/C3P), and C1-phenanthrene/C1-fluorene : C3-naphthalene/C3-phenanthrene (C1P/C1F:C3N/C3P) in subsoil could serve as forensic indicators at diesel-contaminated sites; BS3/BS10:BS4/BS5 could be used for 6 to 9 months of biodegradation. Results of principal component analysis (PCA) indicated that source identification of diesel fuels in top soil could only be performed for weathering processes shorter than 6 months; for subsoil, identification can be conducted for weathering processes shorter than 9 months. Ratios of isoprenoids (pristane and phytane) and PAHs might be affected by biodegradation at spilled sites, whereas the ratios of bicyclic sesquiterpanes can serve as forensic indicators in diesel-contaminated soils. Finally, source identification was attempted for samples collected from different fuel-contaminated sites using the unique pattern of the sesquiterpanes. It is anticipated that the information generated by this study will be adopted by decision makers to evaluate cleanup liability at diesel-contaminated sites.
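A minimal sketch of the diagnostic-ratio comparison that underlies this kind of source screening; the peak areas stand in for GC-MS integrations and are entirely hypothetical:

```typescript
// Compare a spill sample with a suspect source on diagnostic ratios. The
// biodegradation-resistant sesquiterpane ratio (BS3/BS5) is the key indicator;
// peak areas are hypothetical GC-MS integrations.
type PeakAreas = Record<string, number>;

function diagnosticRatios(s: PeakAreas): Record<string, number> {
  return {
    "nC17/Pr": s.nC17 / s.pristane,  // depletes as biodegradation proceeds
    "Pr/Ph": s.pristane / s.phytane, // relatively stable over time
    "BS3/BS5": s.BS3 / s.BS5,        // sesquiterpanes: most weathering-resistant
  };
}

const spillSample: PeakAreas = { nC17: 0.8, pristane: 2.1, phytane: 1.9, BS3: 3.2, BS5: 4.1 };
const suspectSource: PeakAreas = { nC17: 2.4, pristane: 2.2, phytane: 2.0, BS3: 3.1, BS5: 4.0 };
console.log(diagnosticRatios(spillSample));
console.log(diagnosticRatios(suspectSource));
// Matching BS3/BS5 despite a depleted nC17/Pr points toward a common source.
```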

Keywords: biodegradation, diagnostic ratio, diesel fuel, environmental forensics

Procedia PDF Downloads 228
3 IEEE 802.15.4e Based Scheduling Mechanisms and Systems for Industrial Internet of Things

Authors: Ho-Ting Wu, Kai-Wei Ke, Bo-Yu Huang, Liang-Lin Yan, Chun-Ting Lin

Abstract:

With advances in technology, the wireless sensor network (WSN) has become one of the most promising candidates for implementing the wireless industrial internet of things (IIoT) architecture. However, legacy IEEE 802.15.4-based WSN technologies such as Zigbee cannot meet the stringent QoS requirements of low-power, real-time, and highly reliable transmission imposed by the IIoT environment. Recently, the IEEE developed the IEEE 802.15.4e Time Slotted Channel Hopping (TSCH) access mode to serve this purpose, and the IETF 6TiSCH working group has proposed standards to integrate IEEE 802.15.4e smoothly with the IPv6 protocol to form a complete protocol stack for the IIoT. In this work, we develop key network technologies for an IEEE 802.15.4e-based wireless IIoT architecture, focusing on practical design and system implementation. We realize an OpenWSN-based wireless IIoT system whose architecture is divided into three main parts: the web server, the network manager, and the sensor nodes. The web server provides the user interface, allowing the user to view the status of sensor nodes and to instruct them via a user-friendly browser. The network manager is responsible for the establishment, maintenance, and management of scheduling and topology information: it executes the centralized scheduling algorithm, sends the schedule table to each node, and manages the sensing tasks of each device. The sensor nodes complete the assigned tasks and send the sensed data. Furthermore, to prevent scheduling errors due to packet loss, a schedule inspection mechanism is implemented to verify the correctness of the schedule table, and when the network topology changes, the system generates a new schedule table based on the changed topology to ensure proper operation. To enhance performance, we further propose dynamic bandwidth allocation and distributed scheduling mechanisms. The distributed scheduling mechanism enables each individual sensor node to build, maintain, and manage dedicated link bandwidth with its parent and child nodes based on locally observed information, by exchanging Add/Delete commands via two processes, as sketched below. The first, the schedule initialization process, allows each sensor node pair to identify the available idle slots and allocate basic dedicated transmission bandwidth. The second, the schedule adjustment process, enables each sensor node pair to adjust their allocated bandwidth dynamically according to the measured traffic load. Such technology can satisfy dynamic bandwidth requirements in frequently changing environments. Last but not least, we propose a packet retransmission scheme to enhance the performance of the centralized scheduling algorithm when the packet delivery rate (PDR) is low: a multi-frame retransmission mechanism allows every network node to resend each packet at least a predefined number of times, with the multi-frame architecture built according to the number of layers in the network topology. Performance results from simulation reveal that such a retransmission scheme provides sufficiently high transmission reliability while maintaining low packet transmission latency, so that the QoS requirements of the IIoT can be achieved.
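A toy model of the two distributed processes just described: a node pair first claims idle slots for basic dedicated bandwidth, then issues Add/Delete adjustments as the measured load changes. All slotframe bookkeeping here is hypothetical:

```typescript
// Toy sketch of distributed TSCH-style slot negotiation between a node pair:
// initialization claims idle slots; adjustment tracks measured traffic load.
// The bookkeeping is hypothetical, not the paper's implementation.
class LinkSchedule {
  private allocated = new Set<number>();

  // Schedule initialization: claim the first n idle slots for this link.
  initialize(idleSlots: number[], n: number): void {
    idleSlots.slice(0, n).forEach((s) => this.allocated.add(s));
  }

  // Schedule adjustment: Add/Delete slots until the allocation matches the
  // measured load (expressed as packets per slotframe).
  adjust(measuredLoad: number, idleSlots: number[]): void {
    while (this.allocated.size < measuredLoad && idleSlots.length > 0) {
      this.allocated.add(idleSlots.shift()!); // "Add" command
    }
    while (this.allocated.size > measuredLoad) {
      const victim = [...this.allocated].pop()!; // "Delete" command
      this.allocated.delete(victim);
    }
  }

  slots(): number[] {
    return [...this.allocated].sort((a, b) => a - b);
  }
}

const link = new LinkSchedule();
link.initialize([4, 9, 17, 23, 40], 2); // basic dedicated bandwidth: slots 4, 9
link.adjust(4, [17, 23, 40]);           // load rose: add slots 17 and 23
console.log(link.slots());              // [4, 9, 17, 23]
```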

Keywords: IEEE 802.15.4e, industrial internet of things (IIoT), scheduling mechanisms, wireless sensor networks (WSN)

Procedia PDF Downloads 160
2 Understanding New Zealand’s 19th Century Timber Churches: Techniques in Extracting and Applying Underlying Procedural Rules

Authors: Samuel McLennan, Tane Moleta, Andre Brown, Marc Aurel Schnabel

Abstract:

The development of ecclesiastical buildings within New Zealand has produced some unique design characteristics that take influence from both international styles and local building methods. This research examines how procedural modelling can be used to define such common characteristics and to understand how they are shared and developed within different examples of a similar architectural style. This will be achieved through the creation of procedural digital reconstructions of the various timber Gothic churches built during the 19th century in the city of Wellington, New Zealand. 'Procedural modelling' is a digital modelling technique that has been growing in popularity, particularly within the game and film industries, as well as in fields such as industrial design and architecture. This design method entails the creation of a parametric 'ruleset' that can be easily adjusted to produce many variations of geometry, rather than the single geometry typically produced in traditional CAD software. Key precedents within this area of digital heritage include work by Haegler, Müller, and Van Gool; Nicholas Webb and Andre Brown; and, most notably, Mark Burry. What these precedents all share is that the forms of the reconstructed architecture have been generated using computational rules and an understanding of the architects' geometric reasoning. This is also true within this research, as Gothic architecture makes use of only a select range of forms (such as the pointed arch) that can be accurately replicated using the same standard geometric techniques originally used by the architect, as illustrated in the sketch below. The methodology of this research involves first establishing a sample group of similar buildings, documenting the existing samples, researching any lost samples to find evidence such as architectural plans, photos, and written descriptions, and then consolidating all the findings into a single 3D procedural asset within the software 'Houdini'. The end result will be an adjustable digital model that contains all the architectural components of the sample group, such as the various naves, buttresses, and windows. These components can then be selected and arranged to create visualisations of the sample group. Because timber Gothic churches in New Zealand share many details between designs, the created collection of architectural components can also be used to approximate similar designs not included in the sample group, such as designs found beyond the Wellington region. This creates an initial library of architectural components that can be further expanded to encapsulate as wide a sample size as desired. Such a methodology greatly improves upon the efficiency and adjustability of digital modelling compared with current practices in digital heritage reconstruction. It also gives greater accuracy to speculative design, as lost structures with little surviving evidence can be approximated using components from still-existing or better-documented examples. This research will also bring attention to the cultural significance these types of buildings have within the local area, addressing the public's general unawareness of architectural history identified in the Wellington-based research 'Moving Images in Digital Heritage' by Serdar Aydin et al.
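To illustrate what such a parametric 'ruleset' looks like in practice, the sketch below generates an equilateral pointed arch, one of the standard Gothic forms the abstract mentions, from the classic two-arc compass construction. It is a minimal standalone Python example under assumed parameter names (span, segments); the asset described in the paper is built as a Houdini network, not as this code.

```python
# Illustrative procedural rule: an equilateral pointed (Gothic) arch.
# Parameter names and the tiny 'ruleset' dict below are assumptions.
import math

def pointed_arch(span, segments=16):
    """Return 2D outline points of an equilateral pointed arch.

    Both arcs have radius equal to the span and are centred on the
    opposite springing point, so they meet in a point at the apex --
    the standard geometric construction of the Gothic arch.
    """
    pts = []
    # Left half: arc centred on the right springing point (span, 0),
    # swept from the left springer (180 deg) up to the apex (120 deg).
    for i in range(segments + 1):
        a = math.radians(180 - 60 * i / segments)
        pts.append((span + span * math.cos(a), span * math.sin(a)))
    # Right half: arc centred on the left springing point (0, 0),
    # swept from the apex (60 deg) down to the right springer (0 deg).
    for i in range(1, segments + 1):
        a = math.radians(60 - 60 * i / segments)
        pts.append((span * math.cos(a), span * math.sin(a)))
    return pts

# A tiny 'ruleset': the same rule reused with different parameters
# yields window, door, and arcade variants from one definition.
ruleset = {"window": pointed_arch(0.9),
           "door": pointed_arch(1.8),
           "arcade": pointed_arch(3.0, segments=32)}
```

Because the arch is defined by a rule rather than fixed geometry, changing span or segments regenerates every dependent component, which is exactly the adjustability the methodology relies on.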

Keywords: digital forensics, digital heritage, gothic architecture, Houdini, procedural modelling

Procedia PDF Downloads 131
1 Synthetic Method of Contextual Knowledge Extraction

Authors: Olga Kononova, Sergey Lyapin

Abstract:

The global information society requires transparency and reliability of data, as well as the ability to manage information resources independently: in particular, to search, analyze, and evaluate information, thereby obtaining new expertise. Moreover, it is the satisfaction of society's information needs that increases the efficiency of enterprise management and public administration. The study of structurally organized thematic and semantic contexts of different types, automatically extracted from unstructured data, is one of the important tasks for the application of information technologies in education, science, culture, governance, and business. The objectives of this study are the typologization of contextual knowledge and the selection or creation of effective tools for extracting and analyzing it. Explication of the various kinds and forms of contextual knowledge involves the development and use of full-text search information systems. For implementation purposes, the authors use the services of the e-library 'Humanitariana', such as contextual search, different types of queries (paragraph-oriented and frequency-ranked queries), and automatic extraction of knowledge from scientific texts. The multifunctional e-library 'Humanitariana' is realized in an Internet architecture in a WWS configuration (Web browser / Web server / SQL server). An advantage of using 'Humanitariana' is the possibility of combining the resources of several organizations: scholars and research groups may work in local network mode or in distributed IT environments, with the ability to access resources on the servers of any participating organization. The paper discusses some specific cases of contextual knowledge explication using the e-library services and focuses on the possibilities of new types of contextual knowledge. The experimental research base consists of scientific texts about 'e-government' and 'computer games'. An analysis of trends in the subject-themed texts allowed the authors to propose a content analysis methodology that combines full-text search with the automatic construction of a 'terminogramma' and expert analysis of the selected contexts. A 'terminogramma' is laid out as a table containing a column with a frequency-ranked list of words (nouns), along with columns indicating each word's absolute frequency (count) and relative frequency of occurrence (in ppm); a minimal sketch of its construction is given below. The analysis of the 'e-government' materials showed that the state takes a dominant position in the processes of electronic interaction between the authorities and society in modern Russia; the media credited the main role in these processes to the government, which provides public services through specialized portals. Factor analysis revealed two factors statistically describing the terms used: human interaction (the user) and the state (the government, as organizer of the processes); interaction management (the public officer, as performer of the processes) and technology (infrastructure). Isolation of these factors will lead to changes in the model of electronic interaction between government and society. The study also identified the dominant social problems and the prevalence of different categories of subjects of computer gaming in scientific papers from 2005 to 2015. Several types of contextual knowledge are thus identified: micro context; macro context; dynamic context; thematic collection of queries (interactive contextual knowledge expanding the composition of e-library information resources); and multimodal context (functional integration of iconographic and full-text resources through a hybrid quasi-semantic search algorithm). Further studies can be pursued both by expanding the resource base on which they are conducted and by developing appropriate tools.
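A minimal sketch of how a 'terminogramma' table might be computed is shown below. It is an assumption-laden illustration: the naive lowercase tokenizer and stopword filter stand in for whatever noun extraction the authors' pipeline performs, which the abstract does not specify.

```python
# Hypothetical sketch of building a 'terminogramma': a frequency-ranked
# word list with absolute counts and relative frequencies in ppm.
# The crude stopword filter below is an assumed stand-in for real
# noun extraction; it is not the authors' pipeline.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is",
             "that", "for", "between", "through", "which"}

def terminogramma(text, top_n=10):
    """Return (word, absolute_count, relative_ppm) rows, frequency-ranked."""
    words = re.findall(r"[a-z]+", text.lower())
    tokens = [w for w in words if w not in STOPWORDS and len(w) > 2]
    total = len(tokens)
    rows = []
    for word, count in Counter(tokens).most_common(top_n):
        ppm = count / total * 1_000_000   # relative frequency in ppm
        rows.append((word, count, round(ppm)))
    return rows

sample = ("The government provided public services through specialized "
          "portals, and the state organized electronic interaction "
          "between the authorities and society.")
for word, absolute, ppm in terminogramma(sample, top_n=5):
    print(f"{word:15s} {absolute:5d} {ppm:8d} ppm")
```

Ranking by absolute count and reporting the share of the token total in ppm reproduces the three-column layout the abstract describes, so the same routine could be run over any corpus retrieved by a paragraph-oriented or frequency-ranked query.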

Keywords: contextual knowledge, contextual search, e-library services, frequency-ranked query, paragraph-oriented query, technologies of the contextual knowledge extraction

Procedia PDF Downloads 359