Search results for: thermomechanical processing
2614 Screening of Antagonistic/Synergistic Effect between Lactic Acid Bacteria (LAB) and Yeast Strains Isolated from Kefir
Authors: Mihriban Korukluoglu, Goksen Arik, Cagla Erdogan, Selen Kocakoglu
Abstract:
Kefir is a traditional fermented refreshing beverage which is known for its valuable and beneficial properties for human health. Mainly yeast species, lactic acid bacteria (LAB) strains and a smaller number of acetic acid bacteria strains live together in a natural matrix named “kefir grain”, which is formed from various proteins and polysaccharides. Different microbial species live together in the slimy kefir grain, and it has been thought that a synergistic effect could take place between microorganisms belonging to different genera and species. In this research, yeast and LAB were isolated from kefir samples obtained from the Uludag University Food Engineering Department. The cell morphology of the isolates was screened by microscopic examination. Gram reactions of the bacterial isolates were determined by the Gram staining method, and catalase activity was also examined. After the microscopic/morphological, physical and enzymatic properties of all isolates had been observed, they were grouped as LAB or yeast according to their physicochemical responses to the applied examinations. As part of this research, the antagonistic/synergistic effects of the five identified LAB and five yeast strains on each other were determined individually by the disk diffusion method. The antagonistic or synergistic effect is one of the most important properties in a co-culture system in which different microorganisms live together. The synergistic effect should be promoted, whereas the antagonistic effect should be prevented, to provide an effective culture for the fermentation of kefir. The aim of this study was to determine the microbial interactions between the identified yeast and LAB strains, and whether their effect is antagonistic or synergistic. Thus, if there is a strain which inhibits or retards the growth of other strains found in the kefir microflora, this indicates the presence of an antagonistic effect in the medium.
Such negative influence should be prevented, whereas microorganisms which have a synergistic effect on each other should be promoted by combining them in the kefir grain. Standardisation is the most desired property for industrial production. Each microorganism found in the microbial flora of a kefir grain should be identified individually. The members of the microbial community found in the glue-like kefir grain may be redesigned as a starter culture with regard to the efficacy of each microorganism on the others in kefir processing. The main aim of this research was to shed light on more effective production of kefir grain and to contribute to the standardisation of kefir processing in the food industry.
Keywords: antagonistic effect, kefir, lactic acid bacteria (LAB), synergistic effect, yeast
Procedia PDF Downloads 280
2613 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA
Authors: Marek Dosbaba
Abstract:
Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using an SEM to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping and save an X-ray spectrum for each pixel or segment, respectively. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy-dispersive spectroscopy. Re-evaluation of the existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with support for remote control of the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models.
The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.
Keywords: Tescan, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data
Procedia PDF Downloads 111
2612 Leveraging Unannotated Data to Improve Question Answering for French Contract Analysis
Authors: Touila Ahmed, Elie Louis, Hamza Gharbi
Abstract:
State-of-the-art question answering models have recently shown impressive performance, especially in a zero-shot setting. This approach is particularly useful when confronted with a highly diverse domain such as the legal field, in which it is increasingly difficult to have a dataset covering every notion and concept. In this work, we propose a flexible generative question answering approach to contract analysis, as well as a weakly supervised procedure to leverage unannotated data and boost our models' performance in general, and their zero-shot performance in particular.
Keywords: question answering, contract analysis, zero-shot, natural language processing, generative models, self-supervision
Procedia PDF Downloads 196
2611 A Method to Evaluate and Compare Web Information Extractors
Authors: Patricia Jiménez, Rafael Corchuelo, Hassan A. Sleiman
Abstract:
Web mining is gaining importance at an increasing pace. Currently, there are many complementary research topics under this umbrella. Their common theme is that they all focus on applying knowledge discovery techniques to data that is gathered from the Web. Sometimes, these data are relatively easy to gather, chiefly when they come from server logs. Unfortunately, there are cases in which the data to be mined are the data displayed on a web document. In such cases, it is necessary to apply a pre-processing step to first extract the information of interest from the web documents. Such pre-processing steps are performed using so-called information extractors, which are software components typically configured by means of rules tailored to extracting the information of interest from a web page and structuring it according to a pre-defined schema. Paramount to getting good mining results is that the technique used to extract the source information is exact, which requires evaluating and comparing the different proposals in the literature from an empirical point of view. According to Google Scholar, about 4,200 papers on information extraction have been published during the last decade. Unfortunately, they were not evaluated within a homogeneous framework, which makes it difficult to compare them empirically. In this paper, we report on an original information extraction evaluation method. Our contribution is three-fold: a) this is the first attempt to provide an evaluation method for proposals that work on semi-structured documents; the little existing work on this topic focuses on proposals that work on free text, which has little to do with extracting information from semi-structured documents.
b) It provides a method that relies on statistically sound tests to support the conclusions drawn; the previous work does not provide clear guidelines or recommend statistically sound tests, but rather offers a survey that collects many features to take into account as well as related work. c) We provide a novel method to compute the performance measures for unsupervised proposals; otherwise, they would require the intervention of a user to compute them by using the annotations on the evaluation sets and the information extracted. Our contributions will definitely help researchers in this area make sure that they have advanced the state of the art not only conceptually, but from an empirical point of view; they will also help practitioners make informed decisions on which proposal is the most adequate for a particular problem. This conference is a good forum to discuss our ideas so that we can spread them to help improve the evaluation of information extraction proposals and gather valuable feedback from other researchers.
Keywords: web information extractors, information extraction evaluation method, Google Scholar, web
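The record-level scoring underlying such an evaluation can be sketched as follows. This is a generic, minimal illustration of how precision, recall and F1 are computed for one extractor against a gold annotation; the slot names and values are hypothetical, not taken from the paper:

```python
def evaluate_extractor(extracted, annotated):
    """Compute precision, recall and F1 for one extractor on one document.

    `extracted` and `annotated` are sets of (slot, value) pairs; a pair counts
    as correct only if both the slot and the value match exactly."""
    tp = len(extracted & annotated)                     # true positives
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(annotated) if annotated else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical example: gold annotations vs. one extractor's output.
gold = {("title", "Casablanca"), ("year", "1942"), ("director", "Curtiz")}
out = {("title", "Casablanca"), ("year", "1943"), ("director", "Curtiz")}
p, r, f = evaluate_extractor(out, gold)  # 2 of 3 records are correct
```

Per-document scores like these would then feed the statistically sound tests (e.g., paired non-parametric tests across documents) that the method advocates.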
Procedia PDF Downloads 248
2610 Linguistic Analysis of Argumentation Structures in Georgian Political Speeches
Authors: Mariam Matiashvili
Abstract:
Argumentation is an integral part of our daily communication, formal or informal. Argumentative reasoning, techniques, and language tools are used both in personal conversations and in the business environment. Verbalization of opinions requires the use of extraordinary syntactic-pragmatic structures – arguments – that add credibility to the statement. The study of argumentative structures allows us to identify the linguistic features that make a text argumentative. Knowing what elements make up an argumentative text in a particular language helps the users of that language improve their skills. Natural language processing (NLP) has also become especially relevant recently; in this context, one of the main emphases is on the computational processing of argumentative texts, which will enable the automatic recognition and analysis of large volumes of textual data. The research deals with the linguistic analysis of the argumentative structures of Georgian political speeches – particularly the linguistic structure, characteristics, and functions of the parts of the argumentative text: claims, support, and attack statements. The research aims to describe the linguistic cues that give a sentence a judgmental/controversial character and help identify the reasoning parts of an argumentative text. The empirical data come from the Georgian Political Corpus, particularly TV debates. Consequently, the texts are of a dialogical nature, representing a discussion between two or more people (most often between a journalist and a politician).
The research uses the following approaches to identify and analyze the argumentative structures: Lexical Classification and Analysis – identifying lexical items that are relevant in the process of creating argumentative texts and building a lexicon of argumentation (groups of words gathered from a semantic point of view); Grammatical Analysis and Classification – grammatical analysis of the words and phrases identified on the basis of the arguing lexicon; Argumentation Schemes – describing and identifying the argumentation schemes most likely used in Georgian political speeches. As a final step, we analyzed the relations between the above-mentioned components. For example, if an identified argument scheme is “Argument from Analogy”, the identified lexical items semantically express analogy too, and they are most likely adverbs in Georgian. As a result, we created a lexicon of the words that play a significant role in creating Georgian argumentative structures. The linguistic analysis has shown that verbs play a crucial role in creating argumentative structures.
Keywords: Georgian, argumentation schemes, argumentation structures, argumentation lexicon
Procedia PDF Downloads 74
2609 An Enhanced Support Vector Machine Based Approach for Sentiment Classification of Arabic Tweets of Different Dialects
Authors: Gehad S. Kaseb, Mona F. Ahmed
Abstract:
Arabic Sentiment Analysis (SA) is one of the most common research fields, with many open areas. Few studies apply SA to Arabic dialects. This paper proposes different pre-processing steps and a modified methodology to improve the accuracy obtained with standard Support Vector Machine (SVM) classification. The paper works on two datasets, the Arabic Sentiment Tweets Dataset (ASTD) and the Extended Arabic Tweets Sentiment Dataset (Extended-AATSD), which are publicly available for academic use. The results show that the classification accuracy approaches 86%.
Keywords: Arabic, classification, sentiment analysis, tweets
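As a hedged illustration of the kind of pre-processing such pipelines commonly apply to dialectal Arabic text (the abstract does not specify the authors' exact steps, so the rules below are generic assumptions, not their method), a minimal normalization sketch:

```python
import re

# Unicode range covering the common Arabic diacritics (tashkeel).
DIACRITICS = re.compile(r"[\u0617-\u061A\u064B-\u0652]")

def normalize_arabic(text):
    """Generic Arabic text normalization often used before classification."""
    text = DIACRITICS.sub("", text)                         # strip diacritics
    text = re.sub("[\u0622\u0623\u0625]", "\u0627", text)   # alef variants -> bare alef
    text = text.replace("\u0629", "\u0647")                 # ta marbuta -> ha
    text = text.replace("\u0649", "\u064A")                 # alef maqsura -> ya
    text = re.sub(r"(.)\1{2,}", r"\1\1", text)              # squeeze elongations
    return text
```

The normalized text would then be vectorized (e.g., TF-IDF over word n-grams) and fed to the SVM classifier.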
Procedia PDF Downloads 149
2608 Spark Plasma Sintering/Synthesis of Alumina-Graphene Composites
Authors: Nikoloz Jalabadze, Roin Chedia, Lili Nadaraia, Levan Khundadze
Abstract:
Nanocrystalline materials in powder form can be manufactured by a number of different methods; however, manufacturing composite material products in the same nanocrystalline state is still a problem, because the compaction and synthesis of nanocrystalline powders are accompanied by intensive particle growth – a process that promotes the formation of pieces in an ordinary crystalline state instead of the desirable nanocrystalline one. To date, spark plasma sintering (SPS) has been considered the most promising and energy-efficient method for producing dense bodies of composite materials. An advantage of the SPS method in comparison with other methods is mainly the low temperature and short time of the sintering procedure, which finally gives an opportunity to obtain dense material with a nanocrystalline structure. Graphene has recently garnered significant interest as a reinforcing phase in composite materials because of its excellent electrical, thermal and mechanical properties. Graphene nanoplatelets (GNPs) in particular have attracted much interest as reinforcements for ceramic matrix composites (mostly in Al2O3, Si3N4, TiO2, ZrB2, etc.). SPS has been shown to effectively densify a variety of ceramic systems, including Al2O3, often with improvements in mechanical and functional behavior. Alumina consolidated by SPS has been shown to have superior hardness, fracture toughness, plasticity and optical translucency compared to conventionally processed alumina. Knowledge of how GNPs influence sintering behavior is important for effective processing and manufacturing. In this study, the effects of GNPs on the SPS processing of Al2O3 are investigated by systematically varying sintering temperature, holding time and pressure. Our experiments showed that the SPS process is also appropriate for the synthesis of nanocrystalline powders of alumina-graphene composites.
Depending on the size of the molds, it is possible to obtain different amounts of nanopowders. The structure, physical-chemical, mechanical and performance properties of the elaborated composite materials were investigated. The results of this study provide a fundamental understanding of the effects of GNPs on sintering behavior, thereby providing a foundation for future optimization of the processing of these promising nanocomposite systems.
Keywords: alumina, ceramic matrix composites, graphene nanoplatelets, spark plasma sintering
Procedia PDF Downloads 377
2607 A Fast, Reliable Technique for Face Recognition Based on Hidden Markov Model
Authors: Sameh Abaza, Mohamed Ibrahim, Tarek Mahmoud
Abstract:
Due to the developments in digital image processing and its wide use in many applications such as medicine and security, the need for techniques that are more accurate, reliable, fast and robust is pressing. In the field of security, in particular, speed is of the essence. In this paper, a pattern recognition technique based on the use of Hidden Markov Models (HMM), K-means and the Sobel operator is developed. The proposed technique is shown to be fast with respect to some other techniques investigated for comparison. Moreover, it is capable of recognizing the central part of the face as well as the face boundary.
Keywords: HMM, K-means, Sobel, accuracy, face recognition
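A minimal sketch of the Sobel operator named above, in pure Python on a small grayscale image given as a list of lists. This illustrates only the edge-detection building block; the paper's actual pipeline combines it with HMM and K-means, whose details the abstract does not give:

```python
def sobel_magnitude(img):
    """Approximate gradient magnitude of a 2-D grayscale image using the
    Sobel operator; border pixels are left at zero for simplicity."""
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal-gradient kernel
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical-gradient kernel
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(gx_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(gy_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge: the Sobel response peaks along the edge columns.
img = [[0, 0, 10, 10]] * 4
edges = sobel_magnitude(img)
```

On the step image above, interior pixels adjacent to the 0-to-10 transition respond strongly while flat regions respond with zero.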
Procedia PDF Downloads 334
2606 Processing Mild versus Strong Violations in Music: A Pilot Study Using Event-Related Potentials
Authors: Marie-Eve Joret, Marijn Van Vliet, Flavio Camarrone, Marc M. Van Hulle
Abstract:
Event-related potentials (ERPs) provide evidence that the human brain can process and understand music at a pre-attentive level. Music-specific ERPs include the early right anterior negativity (ERAN) and a late negativity (N5). This study aims to further investigate this issue using two types of syntactic manipulations in music: mild violations, containing no out-of-key tones, and strong violations, containing out-of-key tones. We will examine whether both manipulations elicit the same ERPs.
Keywords: ERAN, ERPs, music, N5, P3
Procedia PDF Downloads 276
2605 Stems of Prunus avium: An Unexplored By-product with Great Bioactive Potential
Authors: Luís R. Silva, Fábio Jesus, Catarina Bento, Ana C. Gonçalves
Abstract:
Over the last few years, traditional medicine has gained ground at the nutritional and pharmacological levels. Natural products and their derivatives are of great importance in several drugs used in modern therapeutics. Plant-based systems continue to play an essential role in primary healthcare. Additionally, the utilization of plant parts, such as leaves, stems and flowers, in nutraceutical and pharmaceutical products can add high value in the natural products market, not just through the nutritional value due to significant levels of phytochemicals, but also through the high benefit to producers' and manufacturers' business. Stems of Prunus avium L. are a by-product of cherry processing and have been consumed over the years as infusions and decoctions for their bioactive properties, being used as sedatives, diuretics and draining agents, and for the relief of renal stones, edema and hypertension. In this work, we prepared hydroethanolic and infusion extracts from stems of P. avium collected in the Fundão region (Portugal) and evaluated their phenolic profile by LC-DAD, antioxidant capacity, α-glucosidase inhibitory activity and protection of human erythrocytes against oxidative damage. The LC-DAD analysis allowed the identification of 19 phenolic compounds, catechin and 3-O-caffeoylquinic acid being the main ones. In general, the hydroethanolic extract proved to be more active than the infusion. This extract had the best antioxidant activity against the DPPH• (IC50=22.37 ± 0.28 µg/mL) and superoxide (IC50=13.93 ± 0.30 µg/mL) radicals. Furthermore, it was the most active concerning inhibition of hemoglobin oxidation (IC50=13.73 ± 0.67 µg/mL), hemolysis (IC50=1.49 ± 0.18 µg/mL) and lipid peroxidation (IC50=26.20 ± 0.38 µg/mL) in human erythrocytes. On the other hand, the infusion revealed itself to be more efficient towards α-glucosidase inhibition (IC50=3.18 ± 0.23 µg/mL) and against the nitric oxide radical (IC50=99.99 ± 1.89 µg/mL).
The sweet cherry sector is very important in the Fundão region (Portugal), and the opportunity to turn the great wastes produced during cherry processing into added-value products, such as food supplements, cannot be ignored. Our results demonstrate that P. avium stems possess remarkable antioxidant and free-radical-scavenging properties. It is therefore suggested that P. avium stems can be used as a natural antioxidant with high potential to prevent or slow the progress of human diseases mediated by oxidative stress.
Keywords: stems, Prunus avium, phenolic compounds, biological potential
Procedia PDF Downloads 298
2604 A Metaheuristic for the Layout and Scheduling Problem in a Job Shop Environment
Authors: Hernández Eva Selene, Reyna Mary Carmen, Rivera Héctor, Barragán Irving
Abstract:
We propose an approach that jointly addresses the layout of a facility and the scheduling of a sequence of jobs. In real production, these two problems are interrelated; however, they are treated separately in the literature. Our approach is an extension of the job shop problem with transportation delay, where the locations of the machines are selected among possible sites. The model minimizes the makespan using the shortest processing time rule with two algorithms: the first considers all permutations for the location of machines, and the second uses a heuristic to select some specific permutations, which reduces computational time. Several instances are solved and compared with the literature.
Keywords: layout problem, job shop scheduling problem, concurrent scheduling and layout problem, metaheuristic
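The shortest-processing-time dispatching idea can be sketched as follows. This is a generic illustration on identical parallel machines (the job names and processing times are made up); the paper's full model additionally handles machine locations and transportation delays, which are omitted here:

```python
import heapq

def spt_schedule(jobs, machines):
    """Greedy list scheduling with the shortest-processing-time (SPT) rule.

    `jobs` maps a job name to its processing time; `machines` is the number
    of identical machines. Returns (makespan, assignment) where assignment
    maps each job to (machine, start, finish)."""
    free = [(0, m) for m in range(machines)]   # (time machine becomes free, id)
    heapq.heapify(free)
    assignment = {}
    makespan = 0
    for job, p in sorted(jobs.items(), key=lambda kv: kv[1]):  # SPT order
        t, m = heapq.heappop(free)             # earliest-available machine
        assignment[job] = (m, t, t + p)
        makespan = max(makespan, t + p)
        heapq.heappush(free, (t + p, m))
    return makespan, assignment

makespan, plan = spt_schedule({"A": 3, "B": 1, "C": 2, "D": 4}, machines=2)
```

With two machines, jobs are dispatched in the order B, C, A, D, giving a makespan of 6.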
Procedia PDF Downloads 610
2603 Modeling False Statements in Texts
Authors: Francielle A. Vargas, Thiago A. S. Pardo
Abstract:
According to the standard philosophical definition, lying is saying something that you believe to be false with the intent to deceive. For deception detection, the FBI trains its agents in a technique named statement analysis, which attempts to detect deception based on parts of speech (i.e., linguistic style). This method is employed in interrogations, where the suspects are first asked to make a written statement. In this poster, we model false statements using linguistic style. In order to achieve this, we methodically analyze linguistic features in a corpus of fake news in the Portuguese language. The results show that false statements present substantial lexical, syntactic and semantic variations, as well as punctuation and emotion distinctions.
Keywords: deception detection, linguistic style, computational linguistics, natural language processing
Procedia PDF Downloads 218
2602 Investigative Study of Consumer Perceptions to the Quality and Safety Attributes of 'Fresh' versus 'Frozen' Cassava (Manihot esculenta Crantz): A Case for Agro-Processing in Trinidad and Tobago, West Indies
Authors: Nadia Miranda Lorick, Neela Badrie, Marsha Singh
Abstract:
Cassava (Manihot esculenta Crantz), also known as 'yucca' or 'manioc', has been acknowledged as a millennium crop utilized for food security purposes. The crop provides a considerable amount of energy. The aim of the study was to assess consumers of both 'fresh' and 'frozen' cassava in terms of their perceptions of the quality and safety attributes of frozen cassava. A face-to-face questionnaire was administered to 200 consumers of cassava between April and May 2016; the criteria for inclusion in the survey were that respondents must be 15 years or over and consumers of cassava. The questionnaire included four sections: consumer demographics; consumer perceptions of the quality attributes of 'frozen' cassava; consumer knowledge, awareness and attitudes toward the food safety of 'frozen' cassava; and consumer suggestions for the improvement of frozen cassava as a value-added product. The data were analysed by descriptive statistics and chi-square tests using SPSS, and qualitative information was also captured. Only 17% of respondents purchased frozen cassava, and this was significantly (P<0.05) associated with income. Some (15%) purchasers of fresh cassava had never heard of frozen cassava products, and 7.5% perceived these products as unhealthy for consumption. More than half (51.3%) of the consumers (all from the 'fresh' cassava group) believed that there were no toxins within cassava. The 'frozen' cassava products were valued for convenience, but purchasers were least satisfied with 'value for money' (50%), 'product safety' (50%) and 'colour' (52.9%), demonstrating the highest dissatisfaction levels with the quality attribute 'value for money' (6.6% and 11.8%, respectively).
The most predominant area outlined by respondents for frozen cassava improvement was promotion/advertising/education (23%). The 'frozen' cassava purchasers were least satisfied, being most concerned that clean knives and clean surfaces might not be used in agro-processing. Fresh cassava purchasers were comparatively more knowledgeable about the potential existence of naturally occurring toxins in cassava, although only 1% of respondents were able to specifically identify the toxin as 'cyanide'. Dangerous preservatives (31%), poor hygiene (30%) and chemicals from the packaging (11%) were identified as some sources of contamination of 'frozen' cassava. Purchasers of frozen cassava indicated that the information on the packaging label was unclear (P<0.01) when compared to 'fresh' cassava consumers.
Keywords: consumer satisfaction, convenience, cyanide toxin, product safety, price, label
Procedia PDF Downloads 406
2601 A Sub-Scalar Approach to the MIPS Architecture
Authors: Kumar Sambhav Pandey, Anamika Singh
Abstract:
Continuous research in the field of computer architecture basically aims at accelerating computational speed and gaining enhanced performance. In this era, unlike the superscalar concept, the sub-scalar concept has not gained enough attention for improving computation performance. In this paper, we present a sub-scalar approach to utilize the parallelism present within the data during processing. The main idea is to split the data into individual smaller entities, which are then processed with a defined, known set of instructions. This sub-scalar approach to the MIPS architecture can bring significant improvement in computational speedup. MIPS-I is the basic design taken into consideration for the development of the sub-scalar MIPS64, increasing instruction-level parallelism (ILP) and resource utilization.
Keywords: dataword, MIPS, processor, sub-scalar
Procedia PDF Downloads 548
2600 An Efficient Clustering Technique for Copy-Paste Attack Detection
Authors: N. Chaitawittanun, M. Munlin
Abstract:
Due to the rapid advancement of powerful image processing software, digital images are easy for ordinary people to manipulate and modify. Many digital images are edited for a specific purpose and are more difficult to distinguish from their original versions. We propose a clustering method to detect copy-move image forgery in JPEG, BMP, TIFF, and PNG files. The process starts with reducing the color depth of the photos. Then, we use a clustering technique to group the measured data by Hausdorff distance. The results show that the proposed method is capable of inspecting image files and correctly identifying forgeries.
Keywords: image detection, forgery image, copy-paste, attack detection
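A minimal sketch of grouping block descriptors by Hausdorff distance, the core measure named above. The block names, coordinates and threshold are illustrative assumptions, not the authors' parameters; in a real copy-move detector the point sets would come from image blocks, and near-identical blocks in different locations would flag a pasted region:

```python
def hausdorff(a, b):
    """Symmetric Hausdorff distance between two sets of 2-D points."""
    def d(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    def directed(xs, ys):
        # Largest distance from any point in xs to its nearest point in ys.
        return max(min(d(p, q) for q in ys) for p in xs)
    return max(directed(a, b), directed(b, a))

# Hypothetical block descriptors: b1 and b2 are near-duplicates (a candidate
# copy-move pair), b3 is unrelated.
blocks = {"b1": [(0, 0), (1, 0)],
          "b2": [(0, 0.5), (1, 0.5)],
          "b3": [(10, 10), (11, 10)]}

# Pair up blocks closer than a similarity threshold (assumed to be 1.0 here).
pairs = [(i, j) for i in blocks for j in blocks
         if i < j and hausdorff(blocks[i], blocks[j]) < 1.0]
```

Only the near-duplicate pair survives the threshold, which is the seed of a copy-move cluster.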
Procedia PDF Downloads 338
2599 Path-Spin to Spin-Spin Hybrid Quantum Entanglement: A Conversion Protocol
Authors: Indranil Bayal, Pradipta Panchadhyayee
Abstract:
Path-spin hybrid entanglement generated and confined in a single spin-1/2 particle is converted to spin-spin hybrid interparticle entanglement, which finds important applications in quantum information processing. The protocol uses a beam splitter, a spin flipper, spin measurement, a classical channel, unitary transformations, etc., and requires no collective operation on the pair of particles whose spin variables share complete entanglement after the accomplishment of the protocol. The specialty of the protocol lies in the fact that the path-spin entanglement, initially possessed by a single party, is transferred to the spin degrees of freedom of two separate particles.
Keywords: entanglement, path-spin entanglement, spin-spin entanglement, CNOT operation
Procedia PDF Downloads 199
2598 Detection Characteristics of the Random and Deterministic Signals in Antenna Arrays
Authors: Olesya Bolkhovskaya, Alexey Davydov, Alexander Maltsev
Abstract:
In this paper, approaches to incoherent signal detection in a multi-element antenna array are researched and modeled. Two types of useful signals with unknown wavefronts were considered: the first deterministic (a Barker code), the second random (Gaussian distributed). The derivation of the sufficient statistics took the linearity of the antenna array into account. The performance characteristics and detection curves are modeled and compared for different useful-signal parameters and for different numbers of antenna array elements. Under some additional conditions, the results of this research can be applied to digital communication systems.
Keywords: antenna array, detection curves, performance characteristics, quadrature processing, signal detection
Procedia PDF Downloads 406
2597 Building Data Infrastructure for Public Use and Informed Decision Making in Developing Countries-Nigeria
Authors: Busayo Fashoto, Abdulhakeem Shaibu, Justice Agbadu, Samuel Aiyeoribe
Abstract:
Data has gone from just rows and columns to being an infrastructure itself. Traditionally, data infrastructure has been managed by individuals in different industries and saved on personal work tools such as laptops. This hinders data sharing and Sustainable Development Goal (SDG) 9 on infrastructure sustainability across all countries and regions. However, there has been a constant demand for data across different agencies and ministries by investors and decision-makers. The rapid development and adoption of open-source technologies that promote the collection and processing of data in new ways and in ever-increasing volumes are creating new data infrastructure in sectors such as lands and health, among others. This paper examines the process of developing data infrastructure and, by extension, a data portal to provide baseline data for sustainable development and decision making in Nigeria. The paper employs the FAIR principles (Findable, Accessible, Interoperable, and Reusable) of data management, using open-source technology tools to develop data portals for public use. eHealth Africa, an organization that uses technology to drive public health interventions in Nigeria, developed a data portal which is a typical data infrastructure that serves as a repository for various datasets on administrative boundaries, points of interest, settlements, social infrastructure, amenities, and others. This portal makes it possible for users to have access to datasets of interest at any point in time at no cost. The skeletal infrastructure of this data portal encompasses the use of open-source technologies such as the Postgres database, GeoServer, GeoNetwork, and CKAN. These tools make the infrastructure sustainable, thus promoting the achievement of SDG 9 (Industry, Innovation, and Infrastructure). As of 6th August 2021, 8,192 user accounts had been created, 2,262 datasets had been downloaded, and 817 maps had been created from the platform.
This paper shows how the rapid development and adoption of technologies facilitates data collection, processing, and publishing in new ways and in ever-increasing volumes. In addition, the paper is explicit about new data infrastructure in sectors such as health, social amenities, and agriculture. Furthermore, this paper reveals the importance of cross-sectoral data infrastructures for planning and decision making, which in turn can form a central data repository for sustainable development across developing countries.
Keywords: data portal, data infrastructure, open source, sustainability
Procedia PDF Downloads 99
2596 Grid Pattern Recognition and Suppression in Computed Radiographic Images
Authors: Igor Belykh
Abstract:
Anti-scatter grids used in radiographic imaging for contrast enhancement leave specific artifacts. Those artifacts may be visible or may cause a Moiré effect when a digital image is resized on a diagnostic monitor. In this paper, we propose an automated grid artifact detection and suppression algorithm for what is still an open problem. Grid artifact detection is based on a statistical approach in the spatial domain. Grid artifact suppression is based on the design and application of a Kaiser band-stop filter transfer function that avoids ringing artifacts. Experimental results are discussed and conclusions are drawn, with a description of the advantages over existing approaches.
Keywords: grid, computed radiography, pattern recognition, image processing, filtering
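A sketch of a Kaiser-windowed band-stop FIR design of the kind referred to above, in pure Python. The band edges (0.2 to 0.3 of the sampling rate), filter length and beta below are illustrative assumptions, not the authors' parameters; a real implementation would center the stop band on the detected grid frequency:

```python
import math

def i0(x):
    """Modified Bessel function I0 via its power series (for the Kaiser window)."""
    s, t = 1.0, 1.0
    for k in range(1, 25):
        t *= (x / (2.0 * k)) ** 2
        s += t
    return s

def kaiser_bandstop(numtaps, f1, f2, beta):
    """Windowed-sinc band-stop FIR filter; f1, f2 are fractions of the
    sampling rate. Band-stop = delta minus ideal band-pass, tapered by a
    Kaiser window to control ringing."""
    assert numtaps % 2 == 1, "odd length keeps the delta centered"
    m = numtaps - 1
    h = []
    for n in range(numtaps):
        k = n - m / 2
        if k == 0:
            ideal = 1.0 - 2.0 * (f2 - f1)
        else:
            ideal = (math.sin(2 * math.pi * f1 * k)
                     - math.sin(2 * math.pi * f2 * k)) / (math.pi * k)
        w = i0(beta * math.sqrt(1 - (2 * k / m) ** 2)) / i0(beta)  # Kaiser window
        h.append(ideal * w)
    return h

def gain(h, f):
    """Magnitude response at normalized frequency f (fraction of sample rate)."""
    re = sum(c * math.cos(2 * math.pi * f * n) for n, c in enumerate(h))
    im = sum(c * math.sin(2 * math.pi * f * n) for n, c in enumerate(h))
    return math.hypot(re, im)

h = kaiser_bandstop(101, 0.2, 0.3, beta=6.0)
```

Applied along the direction of the grid lines, such a filter attenuates the grid frequency while passing the rest of the image spectrum nearly unchanged.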
Procedia PDF Downloads 283
2595 Fermentation of Tolypocladium inflatum to Produce Cyclosporin in Dairy Waste Culture Medium
Authors: Fereshteh Falah, Alireza Vasiee, Farideh Tabatabaei-Yazdi
Abstract:
In this research, we investigated the use of dairy sludge as a culture medium in the fermentation process and in cyclosporin production. This bioactive compound is a metabolite produced by Tolypocladium inflatum. Results showed that about 200 ppm of cyclosporin can be produced in this fermentation. In order to function properly and specifically, cyclosporin A (CyA) must be free of any impurities, so purification is needed. In this downstream processing, we used chromatographic extraction and evaluated the pharmacological activities of CyA. Results showed that the obtained metabolite has very high activity against Aspergillus niger (25 mm clear zone). The cyclosporin was isolated for use as an antibiotic. The current research shows that this drug is vital and commercially very important.
Keywords: fermentation, cyclosporin A, Tolypocladium inflatum, TLC
Procedia PDF Downloads 129
2594 The Composition and Activity of Germinated Broccoli Seeds and Their Extract
Authors: Boris Nemzer, Tania Reyes-Izquierdo, Zbigniew Pietrzkowski
Abstract:
Glucosinolates are a family of glucosides found in brassica vegetables. Upon damage to the plant, glucosinolates are broken down by an internal enzyme, myrosinase (thioglucosidase; EC 3.2.3.1), into isothiocyanates such as sulforaphane. Sulforaphane is formed when myrosinase cleaves the sugar off glucoraphanin and the aglycone rearranges. Sulforaphane nitrile is formed in the same reaction when epithiospecifier protein (ESP) is active. Most common food processing procedures break the plant tissue and mix glucoraphanin and myrosinase together, and the sulforaphane formed is then further degraded. The purpose of this study is to determine the glucoraphanin/sulforaphane content and the myrosinase activity of broccoli seeds germinated for different times, and to identify technological processing conditions that keep the enzyme active to form sulforaphane. Broccoli seeds were germinated in house. Myrosinase activity was determined as glucose content using a glucose assay kit and measured with a UV-Vis spectrophotometer. Glucosinolates were measured by HPLC/DAD. Sulforaphane was measured using HPLC-DAD and GC/MS. The 6-hour germinated sprouts have a myrosinase activity of 32.2 mg glucose/g, which is comparable to that of 12- and 24-hour germinated seeds and higher than that of dry seeds. The glucoraphanin content of 6-hour germinated sprouts is 13935 µg/g, which is comparable to that of 24-hour germinated seeds and lower than that of dry seeds. GC/MS results show that the amount of sulforaphane is higher than the amount of sulforaphane nitrile in dry seeds and in 6-hour and 24-hour germinated seeds. The ratio of sulforaphane to sulforaphane nitrile is high in 6-hour germinated seeds, which indicates inactivated ESP during the reaction. Based on these results, short-time germinated seeds can be used as a source of glucoraphanin and myrosinase to form a potentially higher sulforaphane content. 
Broccoli contains the glucosinolate glucoraphanin (4-methylsulfinylbutyl glucosinolate), which is an important metabolite with health-promoting effects. In a pilot clinical study, we observed the effects of a glucosinolates/glucoraphanin-rich extract from short-time germinated broccoli seeds on blood adenosine triphosphate (ATP), reactive oxygen species (ROS) and lactate levels. A single dose of 50 mg of broccoli sprout extract increased blood levels of ATP by up to 61% (p=0.0092) during the first 2 hours after ingestion. Interestingly, this effect was not associated with an increase in blood ROS or lactate. When compared to the placebo group, levels of lactate were reduced by 10% (p=0.006). These results indicate that germinated broccoli seed extract may positively affect the generation of ATP in humans. Due to the preliminary nature of this work and its promising results, larger clinical trials are justified.
Keywords: broccoli glucosinolates, glucoraphanin, germinated seeds, myrosinase, adenosine triphosphate
Procedia PDF Downloads 291
2593 Structured-Ness and Contextual Retrieval Underlie Language Comprehension
Authors: Yao-Ying Lai, Maria Pinango, Ashwini Deo
Abstract:
While grammatical devices are essential to language processing, how comprehension utilizes cognitive mechanisms is less emphasized. This study addresses the issue by probing the complement coercion phenomenon: an entity-denoting complement following verbs like begin and finish receives an eventive interpretation. For example, (1) “The queen began the book” receives an agentive reading like (2) “The queen began [reading/writing/etc.…] the book.” Such sentences engender additional processing cost in real-time comprehension. The traditional account attributes this cost to an operation that coerces the entity-denoting complement to an event, assuming that these verbs require eventive complements. However, on closer examination, examples like “Chapter 1 began the book” undermine this assumption. An alternative, the Structured Individual (SI) hypothesis, proposes that the complement following aspectual verbs (AspV; e.g. begin, finish) is conceptualized as a structured individual, construed as an axis along various dimensions (e.g. spatial, eventive, temporal, informational). The composition of an animate subject and an AspV, as in (1), engenders an ambiguity between an agentive reading along the eventive dimension, as in (2), and a constitutive reading along the informational/spatial dimension, as in (3) “[The story of the queen] began the book,” in which the subject is interpreted as a subpart of the complement denotation. Comprehenders need to resolve the ambiguity by searching contextual information, resulting in additional cost. To evaluate the SI hypothesis, a questionnaire was employed. Method: Target AspV sentences such as “Shakespeare began the volume” were preceded by one of the following types of context sentence: (A) Agentive-biasing, in which an event was mentioned (…writers often read…), (C) Constitutive-biasing, in which a constitutive meaning was hinted at (Larry owns collections of Renaissance literature.), and (N) Neutral context, which allowed both interpretations. 
Thirty-nine native speakers of English were asked to (i) rate each context-target sentence pair on a 1-5 scale (5 = fully understandable), and (ii) choose possible interpretations for the target sentence given the context. The SI hypothesis predicts that comprehension is harder in the Neutral condition, as compared to the biasing conditions, because no contextual information is provided to resolve the ambiguity. Also, comprehenders should obtain the specific interpretation corresponding to the context type. Results: The (A) Agentive-biasing and (C) Constitutive-biasing conditions were rated higher than the (N) Neutral condition (p < .001), while all conditions were within the acceptable range (> 3.5 on the 1-5 scale). This suggests that when relevant contextual information is lacking, semantic ambiguity decreases comprehensibility. The interpretation task shows that the participants selected the biased agentive/constitutive reading for conditions (A) and (C), respectively. For the Neutral condition, the agentive and constitutive readings were chosen equally often. Conclusion: These findings support the SI hypothesis: the meaning of AspV sentences is conceptualized as a parthood relation involving structured individuals. We argue that semantic representation makes reference to spatial structured-ness (an abstracted axis). To obtain an appropriate interpretation, comprehenders utilize contextual information to enrich the conceptual representation of the sentence in question. This study connects semantic structure to human conceptual structure, and provides a processing model that incorporates contextual retrieval.
Keywords: ambiguity resolution, contextual retrieval, spatial structured-ness, structured individual
Procedia PDF Downloads 333
2592 Comparison of Processing Conditions for Plasticized PVC and PVB
Authors: Michael Tupý, Jaroslav Císař, Pavel Mokrejš, Dagmar Měřínská, Alice Tesaříková-Svobodová
Abstract:
A worldwide problem is that recycled PVB is widely stored in landfills. However, PVB has chemical properties very similar to those of PVC, and both are used in plasticized form. Thus, the thermal properties of plasticized PVC obtained from primary production and of PVB obtained by recycling windshields are compared. This is carried out in order to find the conditions under which they degrade and to decide whether PVB/PVC blends can be processed together. The tested PVC contained 38 % of the plasticizer diisononyl phthalate (DINP), and the PVB was plasticized with 28 % of triethylene glycol bis(2-ethylhexanoate) (3GO). The thermal and thermo-oxidative decomposition of both vinyl polymers are compared by DSC and OOT analysis. A tensile strength analysis is added.
Keywords: polyvinyl chloride, polyvinyl butyral, recycling, reprocessing, thermal analysis, decomposition
Procedia PDF Downloads 519
2591 Large-Scale Simulations of Turbulence Using Discontinuous Spectral Element Method
Authors: A. Peyvan, D. Li, J. Komperda, F. Mashayek
Abstract:
Turbulence can be observed in a variety of fluid motions in nature and industrial applications. Recent investment in high-speed aircraft and propulsion systems has revitalized fundamental research on turbulent flows. In these systems, capturing chaotic fluid structures with different length and time scales is accomplished through the Direct Numerical Simulation (DNS) approach, since it accurately simulates flows down to the smallest dissipative scales, i.e., the Kolmogorov scales. The discontinuous spectral element method (DSEM) is a high-order technique that uses spectral functions for approximating the solution. The DSEM code has been developed by our research group over the course of more than two decades. Recently, the code has been improved to run large cases with on the order of billions of solution points. Running big simulations requires a considerable amount of RAM. Therefore, the DSEM code must be highly parallelized and able to start on multiple computational nodes of an HPC cluster with distributed memory. However, some pre-processing procedures, such as determining global element information, creating a global face list, and assigning global partitioning and element connection information of the domain for communication, must be done sequentially with a single processing core. A separate code has been written to perform the pre-processing procedures on a local machine. It extracts from the mesh file the minimum amount of information that the DSEM code requires to start in parallel and stores it in pre-files. It packs integer-type information in a stream binary format in pre-files that are portable between machines. The files are generated to ensure fast read performance on different file systems, such as Lustre and the General Parallel File System (GPFS). A new subroutine has been added to the DSEM code to read the startup files using parallel MPI I/O, for Lustre, in such a way that each MPI rank acquires its information from the file in parallel. 
In the case of GPFS, on each computational node a single MPI rank reads data from the file, which is generated specifically for that computational node, and sends it to the other ranks on the node using point-to-point non-blocking MPI communication. This way, communication takes place locally on each node and traffic does not cross the switches of the cluster. The read subroutine has been tested on Argonne National Laboratory's Mira (GPFS), the National Center for Supercomputing Applications' Blue Waters (Lustre), the San Diego Supercomputer Center's Comet (Lustre), and UIC's Extreme (Lustre). The tests showed that one file per node is suited to GPFS and that parallel MPI I/O is the best choice for the Lustre file system. The DSEM code relies on heavily optimized linear algebra operations, such as matrix-matrix and matrix-vector products, for the calculation of the solution in every time step. For these, the code can make use of its own matrix math library, BLAS, Intel MKL, or ATLAS. This fact and the discontinuous nature of the method make the DSEM code run efficiently in parallel. The results of weak scaling tests performed on Blue Waters showed a scalable and efficient performance of the code in parallel computing.
Keywords: computational fluid dynamics, direct numerical simulation, spectral element, turbulent flow
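The abstract does not specify the pre-file layout, so the Python sketch below is only an illustration of the general idea of a portable stream-binary integer file: explicit little-endian 32-bit integers preceded by their counts, so the layout does not depend on the endianness of the machine that ran the pre-processor, and any rank can compute byte offsets and seek directly to its portion. The file name, record contents, and helper functions are hypothetical, not the DSEM format.

```python
import os
import struct
import tempfile

def write_prefile(path, element_ids, face_list):
    """Write counts, then the two integer arrays, as little-endian
    32-bit integers ('<i'): a fixed, machine-portable byte layout."""
    with open(path, 'wb') as f:
        f.write(struct.pack('<2i', len(element_ids), len(face_list)))
        f.write(struct.pack(f'<{len(element_ids)}i', *element_ids))
        f.write(struct.pack(f'<{len(face_list)}i', *face_list))

def read_prefile(path):
    """Read the header, then each array; a parallel reader could
    instead seek() straight to its own 4-byte-aligned offset."""
    with open(path, 'rb') as f:
        n_elem, n_face = struct.unpack('<2i', f.read(8))
        elems = struct.unpack(f'<{n_elem}i', f.read(4 * n_elem))
        faces = struct.unpack(f'<{n_face}i', f.read(4 * n_face))
    return list(elems), list(faces)

# Round-trip: what a sequential pre-processor writes, a parallel
# solver later reads back.
path = os.path.join(tempfile.gettempdir(), 'mesh.pre')
write_prefile(path, [10, 11, 12], [3, 1, 2, 0])
elems, faces = read_prefile(path)
```

The same fixed-offset property is what makes one-file-per-node GPFS reads and collective MPI I/O on Lustre straightforward: every reader can compute where its integers live without parsing the whole file.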
Procedia PDF Downloads 135
2590 Implementation of Iterative Algorithm for Earthquake Location
Authors: Hussain K. Chaiel
Abstract:
Developments in the field of digital signal processing (DSP) and in microelectronics technology reduce the complexity of iterative algorithms that need a large number of arithmetic operations. Virtex Field Programmable Gate Arrays (FPGAs) are programmable silicon devices which offer an important solution for addressing the needs of the high-performance DSP designer. In this work, Virtex-7 FPGA technology is used to implement an iterative algorithm to estimate the earthquake location. Simulation results show that an implementation based on the RAMB36E1 block RAM and DSP48E1 slices of the Virtex-7 reduces the number of clock cycles required. This enables the algorithm to be used for earthquake prediction.
Keywords: DSP, earthquake, FPGA, iterative algorithm
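The abstract does not state which iterative algorithm was implemented; a common choice for earthquake location is Geiger-style Gauss-Newton iteration on arrival-time residuals. The Python sketch below (synthetic data, uniform velocity, all station positions and values illustrative assumptions) shows the kind of repeated matrix-vector arithmetic such an algorithm requires per iteration, which is exactly what block RAM and DSP slices would pipeline in hardware.

```python
import numpy as np

def locate(stations, t_obs, v, x0=(0.0, 0.0, 0.0), iters=20):
    """Geiger-style iterative epicentre estimate (x, y, origin time t0)
    from P-wave arrival times, assuming a uniform velocity v (km/s).
    Each iteration solves a linearized least-squares update."""
    x, y, t0 = x0
    for _ in range(iters):
        dx, dy = stations[:, 0] - x, stations[:, 1] - y
        d = np.hypot(dx, dy)                  # epicentral distances
        r = t_obs - (t0 + d / v)              # travel-time residuals
        # Jacobian of predicted arrival times w.r.t. (x, y, t0)
        J = np.column_stack([-dx / (v * d), -dy / (v * d),
                             np.ones_like(d)])
        step, *_ = np.linalg.lstsq(J, r, rcond=None)
        x, y, t0 = x + step[0], y + step[1], t0 + step[2]
    return x, y, t0

# Synthetic check: 4 stations, true epicentre (30, 40) km, t0 = 2 s
stations = np.array([[0., 0.], [100., 0.], [0., 100.], [100., 100.]])
v = 6.0
true_xy = np.array([30., 40.])
t_obs = 2.0 + np.hypot(*(stations - true_xy).T) / v
x, y, t0 = locate(stations, t_obs, v, x0=(50., 50., 0.))
```

Each pass costs one Jacobian build and one small least-squares solve, so the iteration count directly sets the clock-cycle budget a hardware implementation must meet.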
Procedia PDF Downloads 389
2589 Oleic Acid Enhances Hippocampal Synaptic Efficacy
Authors: Rema Vazhappilly, Tapas Das
Abstract:
Oleic acid is a cis unsaturated fatty acid and is known to be a partially essential fatty acid due to its limited endogenous synthesis during pregnancy and lactation. Previous studies have demonstrated the role of oleic acid in neuronal differentiation and brain phospholipid synthesis. This evidence indicates a major role for oleic acid in learning and memory. Interestingly, oleic acid has been shown to enhance hippocampal long-term potentiation (LTP), the physiological correlate of long-term synaptic plasticity. However, the effect of oleic acid on short-term synaptic plasticity has not been investigated. Short-term potentiation (STP) is the physiological correlate of short-term synaptic plasticity, which is the key molecular mechanism underlying short-term memory and neuronal information processing. STP in the hippocampal CA1 region is known to require the activation of N-methyl-D-aspartate receptors (NMDARs). NMDAR-dependent hippocampal STP as a potential mechanism for short-term memory has been a subject of intense interest for the past few years. Therefore, in the present study, the effect of oleic acid on NMDAR-dependent hippocampal STP was determined in mouse hippocampal slices (in vitro) using a multi-electrode array system. STP was induced by weak tetanic stimulation (one train of 100 Hz stimulation for 0.1 s) of the Schaffer collaterals of the CA1 region of the hippocampus in slices treated with different concentrations of oleic acid in the presence or absence of the NMDAR antagonist D-AP5 (30 µM). Oleic acid at 20 µM (mean increase in fEPSP amplitude = ~135% vs. control = 100%; P<0.001) and 30 µM (mean increase in fEPSP amplitude = ~280% vs. control = 100%; P<0.001) significantly enhanced the STP following weak tetanic stimulation. A lower oleic acid concentration of 10 µM did not modify the hippocampal STP induced by weak tetanic stimulation. 
The hippocampal STP induced by weak tetanic stimulation was completely blocked by the NMDA receptor antagonist D-AP5 (30 µM) in both oleic acid-treated and control hippocampal slices. This led to the conclusion that the hippocampal STP elicited by weak tetanic stimulation, and enhanced by oleic acid, is NMDAR dependent. Together, these findings suggest that oleic acid may enhance short-term memory and neuronal information processing through modulation of NMDAR-dependent hippocampal short-term synaptic plasticity. In conclusion, this study suggests a possible role for oleic acid in preventing short-term memory loss and impaired neuronal function throughout development.
Keywords: oleic acid, short-term potentiation, memory, field excitatory post synaptic potentials, NMDA receptor
Procedia PDF Downloads 336
2588 Agenesis of the Corpus Callosum: The Role of Neuropsychological Assessment with Implications to Psychosocial Rehabilitation
Authors: Ron Dick, P. S. D. V. Prasadarao, Glenn Coltman
Abstract:
Agenesis of the corpus callosum (ACC) is a failure of the corpus callosum, the large bundle of fibers that connects the two cerebral hemispheres, to develop. It can occur as a partial or complete absence of the corpus callosum. In the general population, its estimated prevalence rate is 1 in 4000, and a wide range of genetic, infectious, vascular, and toxic causes have been attributed to this heterogeneous condition. The diagnosis of ACC is often achieved by neuroimaging procedures. Though persons with ACC can perform normally on intelligence tests, they generally present with a range of neuropsychological and social deficits. The deficit profile is characterized by poor coordination of motor movements, slow reaction time, slow processing speed, and poor memory. Socially, they present with deficits in communication, language processing, theory of mind, and interpersonal relationships. The present paper illustrates the role of neuropsychological assessment, with implications for psychosocial management, in a case of agenesis of the corpus callosum. Method: A 27-year-old left-handed Caucasian male with a history of ACC was self-referred for a neuropsychological assessment to assist him in his employment options. His parents had noted significant difficulties with coordination and balance at the early age of 2-3 years, and he was diagnosed with dyspraxia at the age of 14 years. History also indicated visual impairment, hypotonia, poor muscle coordination, and delayed development of motor milestones. An MRI scan indicated agenesis of the corpus callosum with ventricular morphology, widely spaced parallel lateral ventricles, and mild dilatation of the posterior horns; it also showed colpocephaly, a disproportionate enlargement of the occipital horns of the lateral ventricles, which might be affecting his motor abilities and visual function. The MRI scan ruled out other structural abnormalities or neonatal brain injury. 
At the time of assessment, the subject presented with such problems as poor coordination, slowed processing speed, poor organizational skills and time management, and difficulty with social cues and facial expressions. A comprehensive neuropsychological assessment was planned and conducted to identify the current neuropsychological profile and to facilitate the formulation of a psychosocial and occupational rehabilitation programme. Results: General intellectual functioning was within the average range, and his performance on memory-related tasks was adequate. Significant visuospatial and visuoconstructional deficits were evident across tests; constructional difficulties were seen in tasks such as copying a complex figure, building a tower, and manipulating blocks. Poor visual scanning ability and visual motor speed were evident. Socially, the subject reported heightened social anxiety, difficulty in responding to cues in the social environment, and difficulty in developing intimate relationships. Conclusion: Persons with ACC are known to present with specific cognitive deficits and problems in social situations. Findings from the current neuropsychological assessment indicated significant visuospatial difficulties, poor visual scanning, and problems in social interactions. His general intellectual functioning was within the average range. Based on the findings from the comprehensive neuropsychological assessment, a structured psychosocial rehabilitation programme was developed and recommended.
Keywords: agenesis, callosum, corpus, neuropsychology, psychosocial, rehabilitation
Procedia PDF Downloads 276
2587 Regulatory and Economic Challenges of AI Integration in Cyber Insurance
Authors: Shreyas Kumar, Mili Shangari
Abstract:
Integrating artificial intelligence (AI) in the cyber insurance sector represents a significant advancement, offering the potential to revolutionize risk assessment, fraud detection, and claims processing. However, this integration introduces a range of regulatory and economic challenges that must be addressed to ensure responsible and effective deployment of AI technologies. This paper examines the multifaceted regulatory landscape governing AI in cyber insurance and explores the economic implications of compliance, innovation, and market dynamics. AI's capabilities in processing vast amounts of data and identifying patterns make it an invaluable tool for insurers in managing cyber risks. Yet, the application of AI in this domain is subject to stringent regulatory scrutiny aimed at safeguarding data privacy, ensuring algorithmic transparency, and preventing biases. Regulatory bodies, such as the European Union with its General Data Protection Regulation (GDPR), mandate strict compliance requirements that can significantly impact the deployment of AI systems. These regulations necessitate robust data protection measures, ethical AI practices, and clear accountability frameworks, all of which entail substantial compliance costs for insurers. The economic implications of these regulatory requirements are profound. Insurers must invest heavily in upgrading their IT infrastructure, implementing robust data governance frameworks, and training personnel to handle AI systems ethically and effectively. These investments, while essential for regulatory compliance, can strain financial resources, particularly for smaller insurers, potentially leading to market consolidation. Furthermore, the cost of regulatory compliance can translate into higher premiums for policyholders, affecting the overall affordability and accessibility of cyber insurance. Despite these challenges, the potential economic benefits of AI integration in cyber insurance are significant. 
AI-enhanced risk assessment models can provide more accurate pricing, reduce the incidence of fraudulent claims, and expedite claims processing, leading to overall cost savings and increased efficiency. These efficiencies can improve the competitiveness of insurers and drive innovation in product offerings. However, balancing these benefits with regulatory compliance is crucial to avoid legal penalties and reputational damage. The paper also explores the potential risks associated with AI integration, such as algorithmic biases that could lead to unfair discrimination in policy underwriting and claims adjudication. Regulatory frameworks need to evolve to address these issues, promoting fairness and transparency in AI applications. Policymakers play a critical role in creating a balanced regulatory environment that fosters innovation while protecting consumer rights and ensuring market stability. In conclusion, the integration of AI in cyber insurance presents both regulatory and economic challenges that require a coordinated approach involving regulators, insurers, and other stakeholders. By navigating these challenges effectively, the industry can harness the transformative potential of AI, driving advancements in risk management and enhancing the resilience of the cyber insurance market. This paper provides insights and recommendations for policymakers and industry leaders to achieve a balanced and sustainable integration of AI technologies in cyber insurance.
Keywords: artificial intelligence (AI), cyber insurance, regulatory compliance, economic impact, risk assessment, fraud detection, cyber liability insurance, risk management, ransomware
Procedia PDF Downloads 34
2586 Incidence of Fungal Infections and Mycotoxicosis in Pork Meat and Pork By-Products in Egyptian Markets
Authors: Ashraf Samir Hakim, Randa Mohamed Alarousy
Abstract:
The consumption of food contaminated with molds (microscopic filamentous fungi) and their toxic metabolites results in the development of food-borne mycotoxicosis. The spores of molds are spread ubiquitously in the environment and can be detected everywhere. Ochratoxin A is a potentially carcinogenic fungal toxin found in a variety of food commodities; it is not only the most abundant, and hence the most commonly detected, member of its family, but also the most toxic one. Very little research concerning foods of porcine origin in Egypt has been carried out, in spite of the presence of a considerable swine population and of consumers. In this study, the quality of various ready-to-eat local and imported pork meat and meat byproducts sold in Egyptian markets, as well as edible organs such as liver and kidney, was assessed for the presence of various molds and their toxins in the raw material. Mycological analysis was conducted on 110 samples, which included pig livers (n=10) and kidneys (n=10) from the Basateen slaughterhouse, and local (n=70) and imported (n=20) processed pork meat byproducts. The isolates were identified using traditional mycological and biochemical tests, while ochratoxin A levels were quantitatively analyzed using high-performance liquid chromatography. Results of conventional mycological tests for detecting the presence of fungal growth (yeasts or molds) were negative, while the mycotoxin concentrations in local pork and pork byproducts were greatly above the permissible limits, the "tolerable weekly intake" (TWI) of ochratoxin A established by EFSA in 2006, whereas the imported samples showed only a slight increase. Since ochratoxin A is stable and generally resistant to heat and processing, control of ochratoxin A contamination lies in the control of the growth of the toxin-producing fungi. 
Effective prevention of ochratoxin A contamination therefore depends on good farming and agricultural practices. Good Agricultural Practices (GAP), including methods to reduce fungal infection and growth during harvest, storage, transport and processing, provide the primary line of defense against contamination with ochratoxin A. To the best of our knowledge, this is the first report of a mycological assessment, especially of mycotoxins, in pork byproducts in Egypt.
Keywords: Egyptian markets, mycotoxicosis, ochratoxin A, pork meat, pork by-products
Procedia PDF Downloads 466
2585 Denoising of Magnetotelluric Signals by Filtering
Authors: Rodrigo Montufar-Chaveznava, Fernando Brambila-Paz, Ivette Caldelas
Abstract:
In this paper, we present advances in the denoising of magnetotelluric signals using several filters. In particular, we use the most common spatial-domain filters, such as the median and mean filters, but we also use the Fourier and wavelet transforms for frequency-domain filtering. We employ three datasets obtained at different sampling rates (128, 4096 and 8192 bps) and evaluate the mean square error, signal-to-noise ratio, and peak signal-to-noise ratio to compare the kernels and determine the most suitable one for each case. The magnetotelluric signals correspond to earth exploration surveys carried out when searching for water. The objective is to find a denoising strategy different from the one included in the commercial equipment employed for this task.
Keywords: denoising, filtering, magnetotelluric signals, wavelet transform
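As a minimal illustration of the spatial-domain kernels and error metrics named above, the Python sketch below compares a median and a mean kernel on a synthetic trace with impulsive noise (the trace, kernel size, and noise model are illustrative assumptions, not the paper's field data), scoring each against the clean reference with MSE and SNR in their standard definitions.

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter1d

def mse(ref, est):
    """Mean square error between clean reference and estimate."""
    return float(np.mean((ref - est) ** 2))

def snr_db(ref, est):
    """Signal-to-noise ratio of the estimate, in dB (higher is better)."""
    return 10.0 * np.log10(np.sum(ref ** 2) / np.sum((ref - est) ** 2))

# Synthetic magnetotelluric-like trace: slow oscillation + sparse spikes
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 4096)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean.copy()
spikes = rng.choice(4096, size=80, replace=False)
noisy[spikes] += rng.normal(0.0, 3.0, size=80)

med = median_filter(noisy, size=5)      # spatial-domain median kernel
avg = uniform_filter1d(noisy, size=5)   # spatial-domain mean kernel
# The median kernel rejects isolated outliers outright, while the mean
# kernel only spreads each spike over its window.
```

On impulsive noise like this, the median kernel typically wins both metrics; on broadband Gaussian noise the ranking can reverse, which is why the paper evaluates the kernels per dataset and sampling rate.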
Procedia PDF Downloads 372