Search results for: processing map
2612 Studying the Effect of Nanoclays on the Mechanical Properties of Polypropylene/Polyamide Nanocomposites
Authors: Benalia Kouini, Aicha Serier
Abstract:
Nanocomposites based on polypropylene/polyamide 66 (PP/PA66) nanoblends containing organophilic montmorillonite (OMMT) and maleic anhydride grafted polypropylene (PP-g-MAH) were prepared by melt compounding followed by injection molding. Two different types of nanoclay were used in this work: DELLITE LVF, the untreated nanoclay, and DELLITE 67G, the treated one. The morphology of the nanocomposites was studied using X-ray diffraction (XRD). The results indicate that the incorporation of the treated nanoclay has a significant effect on the impact strength of the PP/PA66 nanocomposites. Furthermore, the XRD results revealed intercalation and exfoliation of the nanoclays in the nanocomposites. Keywords: nanoclay, nanocomposites, polypropylene, polyamide, melt processing, mechanical properties
Procedia PDF Downloads 354
2611 Simultaneous Measurement of Wave Pressure and Wind Speed with the Specific Instrument and the Unit of Measurement Description
Authors: Branimir Jurun, Elza Jurun
Abstract:
The focus of this paper is the description of an instrument called 'Quattuor 45' and the definition of its wave pressure measurement. Special attention is given to the measurement of wave pressure generated by increasing wind speed, obtained with the 'Quattuor 45' in the investigated area. The study begins with the theoretical background and the numerous up-to-date investigations of waves approaching the coast. A detailed schematic view of the instrument is complemented with pictures in ground plan and side view. Horizontal stability of the instrument is achieved by a mooring that relies on two concrete blocks. Vertical wave-peak monitoring is ensured by one float above the instrument. The combination of horizontal stability and vertical wave-peak monitoring makes it possible to create a representative database of wave pressure measurements. The instrument 'Quattuor 45' is named after the way the database is acquired: the electronic part of the instrument consists of an 'Arduino' main chip with its memory, four load cells with the appropriate modules, and an anemometer wind speed sensor. The 'Arduino' chip is programmed to store two readings from each load cell and two readings from the anemometer to an SD card every second. The next part of the research is dedicated to data processing. All measured results are stored automatically in the database, after which detailed processing is carried out in MS Excel. The wave pressure measurement is expressed in the unit kN/m². The paper also suggests a graphical presentation of the results as a multi-line graph, with the wave pressure on the left vertical axis, the wind speed on the right vertical axis, and the time of measurement on the horizontal axis. The paper proposes an algorithm for wind speed measurement, showing the results for two winds characteristic of the Adriatic Sea, called 'Bura' and 'Jugo'.
The first of them, 'Bura', is a northern wind that reaches high speeds, causing low and extremely steep waves whose pressure is relatively weak. On the other hand, the southern wind 'Jugo' has a lower speed than the northern wind, but due to its constant duration and sustained speed it causes extremely long and high waves that produce extremely high wave pressure. Keywords: instrument, measuring unit, wave pressure measurement, wind speed measurement
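As described in the abstract, the instrument sums the force sensed by four load cells over a known plate area to express wave pressure in kN/m². A minimal sketch of that conversion (not the authors' code; the sensing area and sample readings below are assumed values):

```python
# Illustrative sketch: converting load-cell force readings to a wave
# pressure in kN/m^2. Plate area and readings are assumed, not from the paper.

def wave_pressure_kn_m2(cell_forces_n, plate_area_m2):
    """Total force from the load cells (N) divided by the exposed
    plate area (m^2), converted from Pa to kN/m^2."""
    total_force_n = sum(cell_forces_n)
    pressure_pa = total_force_n / plate_area_m2   # 1 Pa = 1 N/m^2
    return pressure_pa / 1000.0                   # 1 kN/m^2 = 1000 Pa

# One second of data: two readings per load cell, averaged per cell first.
readings = [(102.0, 98.0), (110.0, 90.0), (95.0, 105.0), (101.0, 99.0)]
per_cell = [sum(pair) / 2 for pair in readings]   # mean of the two samples
print(wave_pressure_kn_m2(per_cell, plate_area_m2=0.2))  # → 2.0
```

With four cells averaging 100 N each over an assumed 0.2 m² plate, this yields 400 N / 0.2 m² = 2000 Pa = 2.0 kN/m².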
Procedia PDF Downloads 198
2610 Underwater Image Enhancement and Reconstruction Using CNN and the MultiUNet Model
Authors: Snehal G. Teli, R. J. Shelke
Abstract:
CNN and MultiUNet models form the framework of the proposed method for enhancing and reconstructing underwater images. The MultiUNet performs both multiscale feature merging and regeneration, while the CNN extracts relevant features. Extensive tests on benchmark datasets show that the proposed strategy performs better than the latest methods. As a result of this work, underwater images can be represented and interpreted with greater clarity in a number of underwater applications. This strategy will advance underwater exploration and marine research by enhancing real-time underwater image processing systems, underwater robotic vision, and underwater surveillance. Keywords: convolutional neural network, image enhancement, machine learning, MultiUNet, underwater images
Procedia PDF Downloads 75
2609 Dyeing with Natural Dye from Pterocarpus indicus Extract Using Eco-Friendly Mordants
Authors: Ploysai Ohama, Nuttawadee Hanchengchai, Thiva Saksri
Abstract:
Natural dye extracted from Pterocarpus indicus was applied to cotton fabric and silk yarn by dyeing with different eco-friendly mordants. Analytical studies such as UV-VIS spectrophotometry and gravimetric analysis were performed on the extracts. The color of each dyed material was characterized in terms of the CIELAB (L*, a*, and b*) and K/S values. Cotton fabric dyed without mordants had a greenish-brown shade, while fabrics post-mordanted with selected eco-friendly mordants such as alum, lemon juice, and limewater showed a variety of brown and darker shades. Keywords: natural dyes, plant materials, dyeing, mordant
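The K/S value mentioned above is conventionally obtained from the Kubelka-Munk equation, K/S = (1 − R)² / 2R, where R is the fractional reflectance of the dyed sample at the wavelength of maximum absorption. A minimal sketch (standard formula, not code from the paper):

```python
# Kubelka-Munk K/S from reflectance R (fraction in (0, 1]); a higher K/S
# indicates greater dye uptake / a deeper shade.
def k_over_s(reflectance):
    return (1.0 - reflectance) ** 2 / (2.0 * reflectance)

print(k_over_s(0.5))                    # → 0.25
print(k_over_s(0.1) > k_over_s(0.5))   # darker sample gives higher K/S → True
```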
Procedia PDF Downloads 415
2608 A New Scheme for Chain Code Normalization in Arabic and Farsi Scripts
Authors: Reza Shakoori
Abstract:
This paper presents a structural correction of Arabic and Persian strokes by manipulating their chain codes, in order to improve the rate and performance of Persian and Arabic handwritten word recognition systems. It collects pure and effective features to represent a character with one consolidated feature vector, and reduces variations in order to decrease the number of training samples needed and increase the chance of successful classification. Our results also show how the proposed approaches can simplify classification, and consequently recognition, by reducing variations and possible noise in the chain code while keeping the orientation of characters and their backbone structures. Keywords: Arabic, chain code normalization, OCR systems, image processing
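For readers unfamiliar with chain codes: a stroke is encoded as a sequence of 8-neighbour direction numbers, and a standard normalization (not necessarily the scheme proposed in this paper) takes first differences modulo 8 to remove rotation-induced variation:

```python
# Standard rotation-invariant normalization of an 8-directional Freeman
# chain code: first differences mod 8.
def first_difference(chain, directions=8):
    return [(chain[i] - chain[i - 1]) % directions
            for i in range(1, len(chain))]

code = [0, 1, 1, 2, 3]                   # a stroke as 8-neighbour directions
rotated = [(c + 2) % 8 for c in code]    # the same stroke rotated by 90 degrees
print(first_difference(code))                               # → [1, 0, 1, 1]
print(first_difference(code) == first_difference(rotated))  # → True
```

Both the original and the rotated stroke map to the same normalized sequence, which is exactly the kind of variation reduction the abstract describes.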
Procedia PDF Downloads 404
2607 Computational Tool for Surface Electromyography Analysis; an Easy Way for Non-Engineers
Authors: Fabiano Araujo Soares, Sauro Emerick Salomoni, Joao Paulo Lima da Silva, Igor Luiz Moura, Adson Ferreira da Rocha
Abstract:
This paper presents a tool developed on the Matlab platform. It was developed to simplify the analysis of surface electromyography signals (S-EMG) in a way accessible to users who are not familiar with signal processing procedures. The tool receives data through commands in window fields and generates results as graphics and Excel tables. The underlying math of each S-EMG estimator is presented, and the setup window and result graphics are shown. The tool was presented to four non-engineer users, and all of them managed to use it appropriately after a 5-minute instruction period. Keywords: S-EMG estimators, electromyography, surface electromyography, ARV, RMS, MDF, MNF, CV
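Two of the amplitude estimators named in the keywords have simple standard definitions: the average rectified value (ARV) is the mean absolute amplitude, and RMS is the root mean square. A sketch in Python (the paper's tool itself is Matlab):

```python
# Standard S-EMG amplitude estimators applied to a raw signal segment.
import math

def arv(signal):
    """Average rectified value: mean of the absolute amplitudes."""
    return sum(abs(x) for x in signal) / len(signal)

def rms(signal):
    """Root mean square of the amplitudes."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

segment = [1.0, -1.0, 2.0, -2.0]
print(arv(segment))  # → 1.5
print(rms(segment))  # → ≈1.581 (sqrt of 2.5)
```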
Procedia PDF Downloads 559
2606 Screening of Antagonistic/Synergistic Effect between Lactic Acid Bacteria (LAB) and Yeast Strains Isolated from Kefir
Authors: Mihriban Korukluoglu, Goksen Arik, Cagla Erdogan, Selen Kocakoglu
Abstract:
Kefir is a traditional fermented refreshing beverage known for its valuable and beneficial properties for human health. Mainly yeast species, lactic acid bacteria (LAB) strains, and fewer acetic acid bacteria strains live together in a natural matrix named the "kefir grain", which is formed from various proteins and polysaccharides. Because different microbial species live together in the slimy kefir grain, it has been thought that synergistic effects could take place between microorganisms belonging to different genera and species. In this research, yeast and LAB were isolated from kefir samples obtained from the Uludag University Food Engineering Department. The cell morphology of the isolates was screened by microscopic examination. Gram reactions of the bacterial isolates were determined by the Gram staining method, and catalase activity was examined as well. After observing the microscopic/morphological, physical, and enzymatic properties of all isolates, they were divided into LAB and/or yeast groups according to their responses to the applied examinations. As part of this research, the antagonistic/synergistic efficacy of the identified five LAB and five yeast strains toward each other was determined individually by the disk diffusion method. The antagonistic or synergistic effect is one of the most important properties in a co-culture system where different microorganisms live together. The synergistic effect should be promoted, whereas the antagonistic effect should be prevented, to provide an effective culture for kefir fermentation. The aim of this study was to determine the microbial interactions between the identified yeast and LAB strains and whether their effect is antagonistic or synergistic. Thus, if there is a strain which inhibits or retards the growth of other strains found in the kefir microflora, this indicates the presence of an antagonistic effect in the medium.
Such negative influence should be prevented, whereas microorganisms which have a synergistic effect on each other should be promoted by combining them in the kefir grain. Standardisation is the most desired property for industrial production, so each microorganism found in the microbial flora of a kefir grain should be identified individually. The members of the microbial community found in the glue-like kefir grain may then be redesigned as a starter culture with regard to the efficacy of each microorganism on another in kefir processing. The main aim of this research was to shed light on more effective production of kefir grain and to contribute to the standardisation of kefir processing in the food industry. Keywords: antagonistic effect, kefir, lactic acid bacteria (LAB), synergistic, yeast
Procedia PDF Downloads 280
2605 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA
Authors: Marek Dosbaba
Abstract:
Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using an SEM to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping and save an X-ray spectrum for each pixel or segment, respectively. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy dispersive spectroscopy. Re-evaluation of the existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with support for remote control of the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process, and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models.
The traditional tabular particle-by-particle or grain-by-grain export is preserved and can be customized with scripts to include user-defined particle/grain properties. Keywords: TESCAN, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data
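As an illustration of what downstream scripting against such a tabular export might look like, here is a purely hypothetical sketch: the column names and CSV layout are invented for the example and do not reflect the actual TIMA export schema, which readers should take from the vendor documentation.

```python
# Hypothetical post-processing of a particle-by-particle AM export:
# aggregating per-particle masses into modal mineralogy (mass %).
# Column names and values are invented for illustration only.
import csv, io

export = io.StringIO(
    "particle_id,mineral,mass_ug\n"
    "1,quartz,4.0\n"
    "2,pyrite,1.0\n"
    "3,quartz,3.0\n"
)

modal = {}  # mineral -> total mass
for row in csv.DictReader(export):
    modal[row["mineral"]] = modal.get(row["mineral"], 0.0) + float(row["mass_ug"])

total = sum(modal.values())
modal_pct = {m: 100.0 * v / total for m, v in modal.items()}
print(modal_pct["quartz"])  # → 87.5
```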
Procedia PDF Downloads 109
2604 Leveraging Unannotated Data to Improve Question Answering for French Contract Analysis
Authors: Touila Ahmed, Elie Louis, Hamza Gharbi
Abstract:
State-of-the-art question answering models have recently shown impressive performance, especially in a zero-shot setting. This approach is particularly useful when confronted with a highly diverse domain such as the legal field, in which it is increasingly difficult to have a dataset covering every notion and concept. In this work, we propose a flexible generative question answering approach to contract analysis, as well as a weakly supervised procedure to leverage unannotated data and boost our models' performance in general, and their zero-shot performance in particular. Keywords: question answering, contract analysis, zero-shot, natural language processing, generative models, self-supervision
Procedia PDF Downloads 194
2603 A Method to Evaluate and Compare Web Information Extractors
Authors: Patricia Jiménez, Rafael Corchuelo, Hassan A. Sleiman
Abstract:
Web mining is gaining importance at an increasing pace. Currently, there are many complementary research topics under this umbrella. Their common theme is that they all focus on applying knowledge discovery techniques to data gathered from the Web. Sometimes these data are relatively easy to gather, chiefly when they come from server logs. Unfortunately, there are cases in which the data to be mined are the data displayed on a web document. In such cases, it is necessary to apply a pre-processing step to first extract the information of interest from the web documents. Such pre-processing steps are performed using so-called information extractors, which are software components typically configured by means of rules that are tailored to extracting the information of interest from a web page and structuring it according to a pre-defined schema. Paramount to getting good mining results is that the technique used to extract the source information is exact, which requires evaluating and comparing the different proposals in the literature from an empirical point of view. According to Google Scholar, about 4,200 papers on information extraction have been published during the last decade. Unfortunately, they were not evaluated within a homogeneous framework, which makes it difficult to compare them empirically. In this paper, we report on an original information extraction evaluation method. Our contribution is three-fold: a) this is the first attempt to provide an evaluation method for proposals that work on semi-structured documents; the little existing work on this topic focuses on proposals that work on free text, which has little to do with extracting information from semi-structured documents.
b) It provides a method that relies on statistically sound tests to support the conclusions drawn; previous work does not provide clear guidelines or recommend statistically sound tests, but rather surveys that collect many features to take into account as well as related work. c) We provide a novel method to compute the performance measures for unsupervised proposals; otherwise, these would require the intervention of a user to compute them using the annotations on the evaluation sets and the information extracted. Our contributions will definitely help researchers in this area make sure that they have advanced the state of the art not only conceptually, but also from an empirical point of view; they will also help practitioners make informed decisions on which proposal is the most adequate for a particular problem. This conference is a good forum to discuss our ideas so that we can spread them, help improve the evaluation of information extraction proposals, and gather valuable feedback from other researchers. Keywords: web information extractors, information extraction evaluation method, Google Scholar, web
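The performance measures at the heart of such an evaluation are typically precision, recall, and F1 computed between the extracted records and a gold annotation. A minimal sketch of that computation (the example strings are invented; this is the standard definition, not the paper's specific method):

```python
# Precision/recall/F1 of an extractor's output against a gold set.
def prf1(extracted, gold):
    extracted, gold = set(extracted), set(gold)
    tp = len(extracted & gold)
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

p, r, f = prf1({"$9.99", "Acme", "in stock"}, {"$9.99", "Acme", "4 lb"})
print(round(p, 3), round(r, 3))  # → 0.667 0.667
```

Part c) of the contribution addresses precisely the case where the gold annotations are not directly usable for unsupervised proposals, so this computation cannot be applied as-is.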
Procedia PDF Downloads 248
2602 Linguistic Analysis of Argumentation Structures in Georgian Political Speeches
Authors: Mariam Matiashvili
Abstract:
Argumentation is an integral part of our daily communication, formal or informal. Argumentative reasoning, techniques, and language tools are used both in personal conversations and in the business environment. Verbalization of opinions requires the use of particular syntactic-pragmatic structures - arguments that add credibility to the statement. The study of argumentative structures allows us to identify the linguistic features that make a text argumentative. Knowing what elements make up an argumentative text in a particular language helps the users of that language improve their skills. Also, natural language processing (NLP) has become especially relevant recently. In this context, one of the main emphases is on the computational processing of argumentative texts, which will enable the automatic recognition and analysis of large volumes of textual data. The research deals with the linguistic analysis of the argumentative structures of Georgian political speeches - particularly the linguistic structure, characteristics, and functions of the parts of the argumentative text: claims, support, and attack statements. The research aims to describe the linguistic cues that give a sentence a judgmental/controversial character and help identify the reasoning parts of an argumentative text. The empirical data come from the Georgian Political Corpus, particularly TV debates. Consequently, the texts are of a dialogical nature, representing a discussion between two or more people (most often between a journalist and a politician).
The research uses the following approaches to identify and analyze the argumentative structures: Lexical Classification & Analysis - identifying lexical items that are relevant in the process of creating argumentative texts and building a lexicon of argumentation (groups of words gathered from a semantic point of view); Grammatical Analysis and Classification - grammatical analysis of the words and phrases identified on the basis of the arguing lexicon; Argumentation Schemes - describing and identifying the argumentation schemes most likely used in Georgian political speeches. As a final step, we analyzed the relations between the above-mentioned components. For example, if an identified argument scheme is "Argument from Analogy", the identified lexical items semantically express analogy too, and in Georgian they are most likely adverbs. As a result, we created a lexicon of the words that play a significant role in creating Georgian argumentative structures. The linguistic analysis has shown that verbs play a crucial role in creating argumentative structures. Keywords: Georgian, argumentation schemes, argumentation structures, argumentation lexicon
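The lexicon-based step above can be pictured as matching sentences against cue words grouped by semantic class. A toy sketch (the cue words are invented English stand-ins, not the actual Georgian lexicon):

```python
# Toy lexicon matcher: flag which semantic classes of argumentative cues
# appear in a sentence. Lexicon entries are invented for illustration.
LEXICON = {
    "analogy": {"likewise", "similarly"},
    "causal": {"because", "therefore"},
}

def cue_classes(sentence):
    tokens = set(sentence.lower().replace(",", "").split())
    return sorted(cls for cls, cues in LEXICON.items() if tokens & cues)

print(cue_classes("Similarly, the budget failed because of inflation."))
# → ['analogy', 'causal']
```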
Procedia PDF Downloads 71
2601 An Enhanced Support Vector Machine Based Approach for Sentiment Classification of Arabic Tweets of Different Dialects
Authors: Gehad S. Kaseb, Mona F. Ahmed
Abstract:
Arabic Sentiment Analysis (SA) is one of the most common research fields, with many open areas. Few studies apply SA to Arabic dialects. This paper proposes different pre-processing steps and a modified methodology to improve the accuracy obtained with standard Support Vector Machine (SVM) classification. The paper works on two datasets, the Arabic Sentiment Tweets Dataset (ASTD) and the Extended Arabic Tweets Sentiment Dataset (Extended-AATSD), which are publicly available for academic use. The results show that the classification accuracy approaches 86%. Keywords: Arabic, classification, sentiment analysis, tweets
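The abstract does not enumerate its pre-processing steps, so as an illustration, here is a sketch of normalizations commonly applied to Arabic tweets before classification (unifying alef variants, stripping diacritics and the tatweel character); the paper's actual steps may differ:

```python
# Common Arabic text normalizations used before feature extraction.
import re

DIACRITICS = re.compile(r"[\u064B-\u0652]")     # tanween, harakat, sukun

def normalize(text):
    text = DIACRITICS.sub("", text)             # strip diacritics
    text = text.replace("\u0640", "")           # remove tatweel (kashida)
    for alef in "\u0622\u0623\u0625":           # آ أ إ
        text = text.replace(alef, "\u0627")     # unify to bare alef ا
    return text

print(normalize("\u0623\u064E\u0647\u0652\u0644\u0627\u064B"))  # أَهْلاً → اهلا
```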
Procedia PDF Downloads 149
2600 Spark Plasma Sintering/Synthesis of Alumina-Graphene Composites
Authors: Nikoloz Jalabadze, Roin Chedia, Lili Nadaraia, Levan Khundadze
Abstract:
Nanocrystalline materials in powder form can be manufactured by a number of different methods; however, manufacturing composite material products in the same nanocrystalline state is still a problem, because the compaction and synthesis of nanocrystalline powders are accompanied by intensive particle growth, a process which promotes the formation of pieces in an ordinary crystalline state rather than in the desirable nanocrystalline state. To date, spark plasma sintering (SPS) has been considered the most promising and energy-efficient method for producing dense bodies of composite materials. An advantage of the SPS method in comparison with other methods is mainly the low temperature and short duration of the sintering procedure, which finally gives an opportunity to obtain dense material with a nanocrystalline structure. Graphene has recently garnered significant interest as a reinforcing phase in composite materials because of its excellent electrical, thermal, and mechanical properties. Graphene nanoplatelets (GNPs) in particular have attracted much interest as reinforcements for ceramic matrix composites (mostly in Al2O3, Si3N4, TiO2, ZrB2, etc.). SPS has been shown to fully and effectively densify a variety of ceramic systems, including Al2O3, often with improvements in mechanical and functional behavior. Alumina consolidated by SPS has been shown to have superior hardness, fracture toughness, plasticity, and optical translucency compared to conventionally processed alumina. Knowledge of how GNPs influence sintering behavior is important for effective processing and manufacturing. In this study, the effects of GNPs on the SPS processing of Al2O3 are investigated by systematically varying sintering temperature, holding time, and pressure. Our experiments showed that the SPS process is also appropriate for the synthesis of nanocrystalline powders of alumina-graphene composites.
Depending on the size of the molds, it is possible to obtain different amounts of nanopowder. The structure and the physical-chemical, mechanical, and performance properties of the elaborated composite materials were investigated. The results of this study provide a fundamental understanding of the effects of GNPs on sintering behavior, thereby providing a foundation for future optimization of the processing of these promising nanocomposite systems. Keywords: alumina oxide, ceramic matrix composites, graphene nanoplatelets, spark plasma sintering
Procedia PDF Downloads 376
2599 A Fast, Reliable Technique for Face Recognition Based on Hidden Markov Model
Authors: Sameh Abaza, Mohamed Ibrahim, Tarek Mahmoud
Abstract:
Due to developments in digital image processing and its wide use in many applications, such as medical and security applications, there is a strong demand for techniques that are more accurate, reliable, fast, and robust. In the field of security in particular, speed is of the essence. In this paper, a pattern recognition technique based on the use of a Hidden Markov Model (HMM), K-means, and the Sobel operator is developed. The proposed technique is shown to be fast with respect to some other techniques investigated for comparison. Moreover, it is capable of recognizing the normal face (center part) as well as the face boundary. Keywords: HMM, K-means, Sobel, accuracy, face recognition
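Of the three components named above, the Sobel operator is the simplest to illustrate: it convolves a 3x3 neighbourhood with two fixed kernels to estimate the image gradient, responding strongly at edges such as a face boundary. A minimal sketch (not the paper's implementation; the tiny image is made up):

```python
# Sobel gradient magnitude at one interior pixel of a grayscale image.
GX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal-gradient kernel
GY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical-gradient kernel

def sobel_at(img, r, c):
    gx = sum(GX[i][j] * img[r - 1 + i][c - 1 + j]
             for i in range(3) for j in range(3))
    gy = sum(GY[i][j] * img[r - 1 + i][c - 1 + j]
             for i in range(3) for j in range(3))
    return (gx ** 2 + gy ** 2) ** 0.5      # gradient magnitude

image = [[0, 0, 255, 255]] * 4             # dark left half, bright right half
print(sobel_at(image, 1, 1))               # strong response on the edge → 1020.0
print(sobel_at([[7] * 3] * 3, 1, 1))       # flat region, no edge → 0.0
```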
Procedia PDF Downloads 331
2598 Processing Mild versus Strong Violations in Music: A Pilot Study Using Event-Related Potentials
Authors: Marie-Eve Joret, Marijn Van Vliet, Flavio Camarrone, Marc M. Van Hulle
Abstract:
Event-related potentials (ERPs) provide evidence that the human brain can process and understand music at a pre-attentive level. Music-specific ERPs include the Early Right Anterior Negativity (ERAN) and a late negativity (N5). This study aims to further investigate this issue using two types of syntactic manipulation in music: mild violations, containing no out-of-key tones, and strong violations, containing out-of-key tones. We will examine whether both manipulations elicit the same ERPs. Keywords: ERAN, ERPs, music, N5 component, P3 component
Procedia PDF Downloads 275
2597 Stems of Prunus avium: An Unexplored By-product with Great Bioactive Potential
Authors: Luís R. Silva, Fábio Jesus, Catarina Bento, Ana C. Gonçalves
Abstract:
Over the last few years, traditional medicine has gained ground at the nutritional and pharmacological level. Natural products and their derivatives are of great importance for several drugs used in modern therapeutics. Plant-based systems continue to play an essential role in primary healthcare. Additionally, the utilization of plant parts such as leaves, stems, and flowers in nutraceutical and pharmaceutical products can add high value in the natural products market, not only through the nutritional value due to the significant levels of phytochemicals, but also through the high benefit for producers and manufacturers. Stems of Prunus avium L. are a byproduct resulting from the processing of cherry and have been consumed over the years as infusions and decoctions due to their bioactive properties, being used as sedative, diuretic, and draining agents, and for the relief of renal stones, edema, and hypertension. In this work, we prepared hydroethanolic and infusion extracts from stems of P. avium collected in the Fundão region (Portugal), and evaluated the phenolic profile by LC-DAD, the antioxidant capacity, the α-glucosidase inhibitory activity, and the protection of human erythrocytes against oxidative damage. The LC-DAD analysis allowed the identification of 19 phenolic compounds, catechin and 3-O-caffeoylquinic acid being the main ones. In general, the hydroethanolic extract proved to be more active than the infusion. This extract had the best antioxidant activity against DPPH• (IC50 = 22.37 ± 0.28 µg/mL) and the superoxide radical (IC50 = 13.93 ± 0.30 µg/mL). Furthermore, it was the most active concerning inhibition of hemoglobin oxidation (IC50 = 13.73 ± 0.67 µg/mL), hemolysis (IC50 = 1.49 ± 0.18 µg/mL), and lipid peroxidation (IC50 = 26.20 ± 0.38 µg/mL) in human erythrocytes. On the other hand, the infusion revealed to be more efficient towards α-glucosidase inhibition (IC50 = 3.18 ± 0.23 µg/mL) and against the nitric oxide radical (IC50 = 99.99 ± 1.89 µg/mL).
The sweet cherry sector is very important in the Fundão region (Portugal), and the opportunity to turn the considerable waste produced during cherry processing into added-value products, such as food supplements, cannot be ignored. Our results demonstrate that P. avium stems possess remarkable antioxidant and free-radical-scavenging properties. It is therefore suggested that P. avium stems can be used as a natural antioxidant with high potential to prevent or slow the progress of human diseases mediated by oxidative stress. Keywords: stems, Prunus avium, phenolic compounds, biological potential
Procedia PDF Downloads 297
2596 A Metaheuristic for the Layout and Scheduling Problem in a Job Shop Environment
Authors: Hernández Eva Selene, Reyna Mary Carmen, Rivera Héctor, Barragán Irving
Abstract:
We propose an approach that jointly addresses the layout of a facility and the scheduling of a sequence of jobs. In real production these two problems are interrelated; however, they are treated separately in the literature. Our approach is an extension of the job shop problem with transportation delay, where the location of the machines is selected among possible sites. The model minimizes the makespan using the shortest processing time rule with two algorithms: the first considers all permutations for the location of the machines, and the second uses a heuristic to select only some specific permutations, which reduces computational time. Some instances are solved and compared with the literature. Keywords: layout problem, job shop scheduling problem, concurrent scheduling and layout problem, metaheuristic
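The shortest processing time (SPT) rule referred to above simply dispatches jobs in increasing order of processing time. A minimal single-machine sketch (far simpler than the paper's job shop model with transportation delays; job names and times are made up):

```python
# SPT dispatching on a single machine: order jobs by processing time and
# accumulate completion times.
def spt_schedule(jobs):
    """jobs: dict of name -> processing time. Returns (order, completions)."""
    order = sorted(jobs, key=jobs.get)
    t, completion = 0, {}
    for name in order:
        t += jobs[name]
        completion[name] = t
    return order, completion

order, completion = spt_schedule({"A": 5, "B": 2, "C": 3})
print(order)            # → ['B', 'C', 'A']
print(completion["A"])  # last job finishes at the makespan → 10
```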
Procedia PDF Downloads 606
2595 Modeling False Statements in Texts
Authors: Francielle A. Vargas, Thiago A. S. Pardo
Abstract:
According to the standard philosophical definition, lying is saying something that you believe to be false with the intent to deceive. For deception detection, the FBI trains its agents in a technique named statement analysis, which attempts to detect deception based on parts of speech (i.e., linguistic style). This method is employed in interrogations, where the suspects are first asked to make a written statement. In this poster, we model false statements using linguistic style. In order to achieve this, we methodically analyze linguistic features in a corpus of fake news in the Portuguese language. The results show that false statements present substantial lexical, syntactic, and semantic variations, as well as punctuation and emotion distinctions. Keywords: deception detection, linguistic style, computational linguistics, natural language processing
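Style-based analysis of this kind starts from simple surface features. A toy sketch of extracting a few such counts (the feature set and example sentence are invented for illustration and are not the authors' features):

```python
# Toy surface-stylometry feature extractor of the kind used in
# statement analysis. Feature choices are illustrative only.
def style_features(text):
    tokens = text.lower().split()
    return {
        "n_tokens": len(tokens),
        "n_exclaim": text.count("!"),
        "first_person": sum(t.strip(".,!?") in {"i", "we", "my", "our"}
                            for t in tokens),
    }

print(style_features("We never saw the money! I swear."))
# → {'n_tokens': 7, 'n_exclaim': 1, 'first_person': 2}
```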
Procedia PDF Downloads 218
2594 Investigative Study of Consumer Perceptions to the Quality and Safety Attributes of 'Fresh' versus 'Frozen' Cassava (Manihot esculenta Crantz): A Case for Agro-Processing in Trinidad and Tobago, West Indies
Authors: Nadia Miranda Lorick, Neela Badrie, Marsha Singh
Abstract:
Cassava (Manihot esculenta Crantz), also known as 'yucca' or 'manioc', has been acknowledged as a millennium crop utilized for food security purposes. The crop provides a considerable amount of energy. The aim of the study was to assess consumer groups of both 'fresh' and 'frozen' cassava in terms of their perceptions toward the quality and safety attributes of frozen cassava. A face-to-face questionnaire was administered to 200 consumers of cassava between April and May 2016; the criteria for inclusion in the survey were that respondents must be 15 years or over and consumers of cassava. The questionnaire included four sections: consumer demographics, consumer perceptions of the quality attributes of 'frozen' cassava, consumer knowledge, awareness, and attitudes toward the food safety of 'frozen' cassava, and consumer suggestions for the improvement of frozen cassava as a value-added product. The data were analyzed descriptively and by chi-square using SPSS, and qualitative information was also captured. Only 17% of respondents purchased frozen cassava, and this was significantly (P<0.05) associated with income. Some (15%) fresh cassava purchasers had never heard of frozen cassava products, and 7.5% perceived that these products were unhealthy for consumption. More than half (51.3%) of the consumers (all from the 'fresh' cassava group) believed that there were no toxins in cassava. The 'frozen' cassava products were valued for convenience, but purchasers were least satisfied with 'value for money' (50%), 'product safety' (50%), and 'colour' (52.9%). Cassava purchasers demonstrated the highest dissatisfaction with the quality attribute 'value for money' (6.6% and 11.8%, respectively).
The most predominant area outlined by respondents for the improvement of frozen cassava was promotion/advertising/education (23%). The 'frozen' cassava purchasers were least satisfied, and thus most concerned, that clean knives and clean surfaces might not be used in agro-processing. Fresh cassava purchasers were comparatively more knowledgeable about the potential existence of naturally occurring toxins in cassava, although only 1% of respondents were able to specifically identify the toxin as cyanide. Dangerous preservatives (31%), poor hygiene (30%), and chemicals from the packaging (11%) were identified as possible sources of contamination of 'frozen' cassava. Purchasers of frozen cassava indicated that the information on the packaging label was unclear (P<0.01) when compared to 'fresh' cassava consumers. Keywords: consumer satisfaction, convenience, cyanide toxin, product safety, price, label
Procedia PDF Downloads 402
2593 A Sub-Scalar Approach to the MIPS Architecture
Authors: Kumar Sambhav Pandey, Anamika Singh
Abstract:
Continuing research in the field of computer architecture aims primarily at accelerating computational speed and achieving enhanced performance. In the current era, the sub-scalar concept, unlike the superscalar one, has not gained enough attention for improving computational performance. In this paper, we present a sub-scalar approach that exploits the parallelism present within the data during processing. The main idea is to split the data into individual smaller entities, which are then processed with a defined, known set of instructions. This sub-scalar approach to the MIPS architecture can bring significant improvement in computational speedup. MIPS-I is the basic design taken into consideration for the development of the sub-scalar MIPS64, increasing instruction-level parallelism (ILP) and resource utilization.
Keywords: dataword, MIPS, processor, sub-scalar
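The splitting idea can be illustrated in software (a simplified sketch, not the hardware implementation; the 64-bit dataword and 8-bit entities are illustrative choices):

```python
def split_dataword(word64):
    """Split a 64-bit dataword into eight independent 8-bit entities."""
    return [(word64 >> (8 * i)) & 0xFF for i in range(8)]

def merge_dataword(lanes):
    """Reassemble eight 8-bit entities into a 64-bit dataword."""
    word = 0
    for i, lane in enumerate(lanes):
        word |= (lane & 0xFF) << (8 * i)
    return word

def lanewise_add(a, b):
    """Add two datawords entity by entity, with per-entity wraparound
    (no carry propagates between entities, unlike a full 64-bit add)."""
    lanes = [(x + y) & 0xFF for x, y in zip(split_dataword(a), split_dataword(b))]
    return merge_dataword(lanes)

a, b = 0x01FF_0002_0003_0004, 0x0101_0101_0101_0101
print(hex(lanewise_add(a, b)))   # the 0xFF entity wraps to 0x00 without carrying
```

Each small entity is handled by the same known operation, so the eight additions are independent and could proceed in parallel, which is the parallelism the sub-scalar approach exposes.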
Procedia PDF Downloads 545
2592 An Efficient Clustering Technique for Copy-Paste Attack Detection
Authors: N. Chaitawittanun, M. Munlin
Abstract:
Due to the rapid advancement of powerful image processing software, digital images are easy for ordinary people to manipulate and modify. Many digital images are edited for a specific purpose and are difficult to distinguish from their originals. We propose a clustering method to detect copy-move image forgery in JPEG, BMP, TIFF, and PNG files. The process starts with reducing the color depth of the photos. Then, a clustering technique groups the measured data using the Hausdorff distance. The results show that the proposed method is capable of inspecting an image file and correctly identifying the forgery.
Keywords: image detection, forgery image, copy-paste, attack detection
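A much-simplified sketch of the pipeline described above: color reduction followed by grouping of similar regions. Exact matching of quantized blocks stands in for the paper's Hausdorff-distance clustering, and the block size and quantization levels are illustrative:

```python
import numpy as np

def find_copy_move(img, block=8, levels=16):
    """Flag pairs of blocks that become identical after color reduction --
    candidate copy-paste regions in a grayscale image."""
    # Step 1: reduce the color depth (quantization), as in the abstract.
    q = (img // (256 // levels)).astype(np.uint8)
    h, w = q.shape
    seen, matches = {}, []
    # Step 2: group fixed-size blocks; identical blocks at different
    # positions are candidate source/copy pairs.
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            key = q[y:y + block, x:x + block].tobytes()
            if key in seen:
                matches.append((seen[key], (y, x)))
            else:
                seen[key] = (y, x)
    return matches

# Synthetic test image: paste one region onto another location.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64), dtype=np.uint8)
img[32:40, 32:40] = img[0:8, 0:8]          # simulated copy-move forgery
print(find_copy_move(img))                 # -> [((0, 0), (32, 32))]
```

A real detector would compare blocks at every pixel offset and tolerate recompression noise, which is where a distance measure such as the Hausdorff distance, rather than exact equality, becomes necessary.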
Procedia PDF Downloads 338
2591 Path-Spin to Spin-Spin Hybrid Quantum Entanglement: A Conversion Protocol
Authors: Indranil Bayal, Pradipta Panchadhyayee
Abstract:
Path-spin hybrid entanglement, generated and confined in a single spin-1/2 particle, is converted to spin-spin hybrid interparticle entanglement, which finds important applications in quantum information processing. The protocol uses a beam splitter, a spin flipper, spin measurement, a classical channel, unitary transformations, etc., and requires no collective operation on the pair of particles whose spin variables share complete entanglement after the accomplishment of the protocol. The specialty of the protocol lies in the fact that the path-spin entanglement is transferred between the spin degrees of freedom of two separate particles initially possessed by a single party.
Keywords: entanglement, path-spin entanglement, spin-spin entanglement, CNOT operation
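The CNOT operation named in the keywords is the textbook way to create the kind of spin-spin entanglement the protocol targets. A minimal numerical check (an illustration of the end state, not a simulation of the full path-spin protocol):

```python
import numpy as np

# Single-qubit basis state and Hadamard gate
ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT on two qubits (control = first, target = second)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Prepare (|0> + |1>)/sqrt(2) on the control and |0> on the target ...
state = np.kron(H @ ket0, ket0)
# ... then entangle: CNOT maps this product state to a Bell state
bell = CNOT @ state
print(np.round(bell.real, 3))   # amplitudes of |00>, |01>, |10>, |11>

# Entanglement check: the reduced density matrix of one spin is maximally mixed
rho = np.outer(bell, bell.conj()).reshape(2, 2, 2, 2)
rho_A = np.trace(rho, axis1=1, axis2=3)   # partial trace over the second spin
print(np.round(rho_A.real, 3))
```

The reduced density matrix of either spin comes out maximally mixed, the signature of the complete entanglement between the two spin variables that the protocol achieves.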
Procedia PDF Downloads 198
2590 Detection Characteristics of the Random and Deterministic Signals in Antenna Arrays
Authors: Olesya Bolkhovskaya, Alexey Davydov, Alexander Maltsev
Abstract:
In this paper, an approach to incoherent signal detection in a multi-element antenna array is investigated and modeled. Two types of useful signals with unknown wavefronts were considered: the first is deterministic (a Barker code), and the second is random (Gaussian distributed). The derivation of the sufficient statistics took the linearity of the antenna array into account. The performance characteristics and detection curves are modeled and compared for different useful-signal parameters and for different numbers of antenna array elements. Under some additional conditions, the results of this research can be applied to digital communication systems.
Keywords: antenna array, detection curves, performance characteristics, quadrature processing, signal detection
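A stripped-down, single-channel illustration of detecting the deterministic signal (a Barker-13 code) in Gaussian noise with a matched filter; the paper's array geometry and quadrature processing are not reproduced, and the threshold and amplitude are illustrative:

```python
import numpy as np

BARKER13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

def matched_filter_detect(received, template, threshold):
    """Correlate the received samples with the known waveform and compare
    the peak of the statistic against a detection threshold."""
    stat = np.correlate(received, template, mode="valid")
    peak = np.abs(stat).max()
    return bool(peak > threshold), int(np.abs(stat).argmax())

rng = np.random.default_rng(1)
signal = rng.normal(0, 1, 200)          # Gaussian noise background
signal[50:63] += 3 * BARKER13           # embed the useful signal at sample 50

detected, pos = matched_filter_detect(signal, BARKER13, threshold=25.0)
print(detected, pos)
```

The matched-filter peak at the true delay is roughly the code energy (3 x 13 = 39 here), well above the noise-only correlation values; Barker codes are chosen precisely because their autocorrelation sidelobes stay at magnitude one.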
Procedia PDF Downloads 405
2589 Building Data Infrastructure for Public Use and Informed Decision Making in Developing Countries-Nigeria
Authors: Busayo Fashoto, Abdulhakeem Shaibu, Justice Agbadu, Samuel Aiyeoribe
Abstract:
Data has gone from just rows and columns to being an infrastructure in itself. Traditionally, data infrastructure has been managed by individuals in different industries and saved on personal work tools, one of which is the laptop. This hinders data sharing and Sustainable Development Goal (SDG) 9 on infrastructure sustainability across all countries and regions. At the same time, there has been constant demand for data across different agencies and ministries from investors and decision-makers. The rapid development and adoption of open-source technologies that promote the collection and processing of data in new ways and in ever-increasing volumes are creating new data infrastructure in sectors such as lands and health, among others. This paper examines the process of developing a data infrastructure and, by extension, a data portal to provide baseline data for sustainable development and decision making in Nigeria. The paper applies the FAIR principles (Findable, Accessible, Interoperable, and Reusable) of data management, using open-source technology tools to develop data portals for public use. eHealth Africa, an organization that uses technology to drive public health interventions in Nigeria, developed a data portal, a typical data infrastructure that serves as a repository for various datasets on administrative boundaries, points of interest, settlements, social infrastructure, amenities, and others. This portal makes it possible for users to access datasets of interest at any point in time at no cost. The skeletal infrastructure of this data portal is built on open-source technologies such as a Postgres database, GeoServer, GeoNetwork, and CKAN. These tools make the infrastructure sustainable, thus promoting the achievement of SDG 9 (Industry, Innovation, and Infrastructure). As of 6th August 2021, 8192 user accounts had been created, 2262 datasets had been downloaded, and 817 maps had been created from the platform.
This paper shows how the rapid development and adoption of technologies facilitate data collection, processing, and publishing in new ways and in ever-increasing volumes. In addition, the paper is explicit about new data infrastructure in sectors such as health, social amenities, and agriculture. Furthermore, it highlights the importance of cross-sectoral data infrastructures for planning and decision making, which in turn can form a central data repository for sustainable development across developing countries.
Keywords: data portal, data infrastructure, open source, sustainability
Procedia PDF Downloads 97
2588 Grid Pattern Recognition and Suppression in Computed Radiographic Images
Authors: Igor Belykh
Abstract:
Anti-scatter grids used in radiographic imaging for contrast enhancement leave specific artifacts. Those artifacts may be directly visible or may cause a Moiré effect when a digital image is resized on a diagnostic monitor. In this paper, we propose an automated algorithm for grid artifact detection and suppression, which remains an open problem. Grid artifact detection is based on a statistical approach in the spatial domain. Grid artifact suppression is based on the design and application of a Kaiser band-stop filter transfer function that avoids ringing artifacts. Experimental results are discussed and concluded with a description of the advantages over existing approaches.
Keywords: grid, computed radiography, pattern recognition, image processing, filtering
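A one-dimensional sketch of the suppression step: a band-stop FIR filter built from windowed-sinc prototypes with a Kaiser window. The paper's grid-frequency estimation and 2-D application are not reproduced, and the cutoff frequencies below are illustrative:

```python
import numpy as np

def kaiser_bandstop(numtaps, f1, f2, beta=8.6):
    """Band-stop FIR via windowed-sinc design with a Kaiser window.
    f1, f2: normalized band edges (Nyquist = 0.5), with f1 < f2."""
    assert numtaps % 2 == 1, "odd length keeps the impulse centered"
    n = np.arange(numtaps) - (numtaps - 1) / 2
    win = np.kaiser(numtaps, beta)

    def lowpass(fc):
        h = 2 * fc * np.sinc(2 * fc * n) * win
        return h / h.sum()                      # normalize to unit DC gain

    delta = np.zeros(numtaps)
    delta[(numtaps - 1) // 2] = 1.0
    # band-stop = low-pass below f1 + high-pass (delta - low-pass) above f2
    return lowpass(f1) + (delta - lowpass(f2))

# Suppress an (illustrative) grid frequency near 0.25 cycles/sample
h = kaiser_bandstop(101, 0.2, 0.3)
freqs = np.fft.rfftfreq(4096)
H = np.abs(np.fft.rfft(h, 4096))
stop = H[np.argmin(np.abs(freqs - 0.25))]       # gain inside the stop band
passband = H[np.argmin(np.abs(freqs - 0.05))]   # gain in the pass band
print(f"gain at 0.25: {stop:.4f}, gain at 0.05: {passband:.4f}")
```

The Kaiser window's beta parameter trades transition width against stop-band attenuation, which is what lets such a design notch out the grid frequency without the ringing a sharp ideal band-stop would introduce.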
Procedia PDF Downloads 283
2587 Fermentation of Tolypocladium inflatum to Produce Cyclosporin in Dairy Waste Culture Medium
Authors: Fereshteh Falah, Alireza Vasiee, Farideh Tabatabaei-Yazdi
Abstract:
In this research, we investigated the use of dairy sludge as a culture medium in the fermentation process for cyclosporin production. This bioactive compound is a metabolite produced by Tolypocladium inflatum. Results showed that about 200 ppm of cyclosporin can be produced in this fermentation. To function properly and specifically, cyclosporin A (CyA) must be free of any impurities, so purification is required. In this downstream processing, we used chromatographic extraction and evaluated the pharmacological activities of CyA. The results showed that the obtained metabolite has very high activity against Aspergillus niger (a 25 mm clear zone). The cyclosporin was isolated for use as an antibiotic. The current research shows that this drug is vital and commercially very important.
Keywords: fermentation, cyclosporin A, Tolypocladium inflatum, TLC
Procedia PDF Downloads 127
2586 The Composition and Activity of Germinated Broccoli Seeds and Their Extract
Authors: Boris Nemzer, Tania Reyes-Izquierdo, Zbigniew Pietrzkowski
Abstract:
Glucosinolates are a family of glucosides found in brassica vegetables. Upon damage to the plant, glucosinolates are broken down by an internal enzyme, myrosinase (thioglucosidase; EC 3.2.3.1), into isothiocyanates such as sulforaphane. Sulforaphane is formed when myrosinase cleaves the sugar from glucoraphanin, followed by rearrangement. Sulforaphane nitrile is formed in the same reaction in the presence of active epithiospecifier protein (ESP). Most common food processing procedures break the plant tissue and mix glucoraphanin and myrosinase together, and the sulforaphane formed is then further degraded. The purpose of this study is to understand the glucoraphanin/sulforaphane content and the myrosinase activity of broccoli seeds germinated for different times, and to identify technological processing conditions that preserve the activity of the sulforaphane-forming enzyme. Broccoli seeds were germinated in house. Myrosinase activity was determined from the glucose content, measured with a glucose assay kit on a UV-Vis spectrophotometer. Glucosinolates were measured by HPLC/DAD. Sulforaphane was measured using HPLC/DAD and GC/MS. The 6-hour germinated sprouts have a myrosinase activity of 32.2 mg glucose/g, which is comparable with 12- and 24-hour germinated seeds and higher than dry seeds. The glucoraphanin content in 6-hour germinated sprouts is 13935 µg/g, which is comparable to 24-hour germinated seeds and lower than dry seeds. GC/MS results show that the amount of sulforaphane is higher than the amount of sulforaphane nitrile in seeds and in 6-hour and 24-hour germinated seeds. The ratio of sulforaphane to sulforaphane nitrile is high in 6-hour germinated seeds, which indicates that ESP was inactive in the reaction. After evaluating the results, short-time germinated seeds can be used as a source of glucoraphanin and myrosinase to form a potentially higher sulforaphane content.
Broccoli contains the glucosinolate glucoraphanin (4-methylsulfinylbutyl glucosinolate), an important metabolite with health-promoting effects. In a pilot clinical study, we observed the effects of a glucosinolate/glucoraphanin-rich extract from short-time germinated broccoli seeds on blood adenosine triphosphate (ATP), reactive oxygen species (ROS) and lactate levels. A single 50 mg dose of broccoli sprout extract increased blood levels of ATP by up to 61% (p=0.0092) during the first 2 hours after ingestion. Interestingly, this effect was not associated with an increase in blood ROS or lactate. When compared to the placebo group, lactate levels were reduced by 10% (p=0.006). These results indicate that germinated broccoli seed extract may positively affect the generation of ATP in humans. Given the preliminary nature of this work and the promising results, larger clinical trials are justified.
Keywords: broccoli glucosinolates, glucoraphanin, germinated seeds, myrosinase, adenosine triphosphate
Procedia PDF Downloads 290
2585 Structured-Ness and Contextual Retrieval Underlie Language Comprehension
Authors: Yao-Ying Lai, Maria Pinango, Ashwini Deo
Abstract:
While grammatical devices are essential to language processing, how comprehension utilizes cognitive mechanisms is less emphasized. This study addresses this issue by probing the complement coercion phenomenon: an entity-denoting complement following verbs like begin and finish receives an eventive interpretation. For example, (1) “The queen began the book” receives an agentive reading like (2) “The queen began [reading/writing/etc.] the book.” Such sentences engender additional processing cost in real-time comprehension. The traditional account attributes this cost to an operation that coerces the entity-denoting complement into an event, assuming that these verbs require eventive complements. However, on closer examination, examples like “Chapter 1 began the book” undermine this assumption. An alternative, the Structured Individual (SI) hypothesis, proposes that the complement following aspectual verbs (AspV; e.g., begin, finish) is conceptualized as a structured individual, construed as an axis along various dimensions (e.g., spatial, eventive, temporal, informational). The composition of an animate subject and an AspV, as in (1), engenders an ambiguity between an agentive reading along the eventive dimension, like (2), and a constitutive reading along the informational/spatial dimension, like (3) “[The story of the queen] began the book,” in which the subject is interpreted as a subpart of the complement denotation. Comprehenders need to resolve the ambiguity by searching contextual information, resulting in additional cost. To evaluate the SI hypothesis, a questionnaire was employed. Method: Target AspV sentences such as “Shakespeare began the volume.” were preceded by one of the following types of context sentence: (A) agentive-biasing, in which an event was mentioned (…writers often read…); (C) constitutive-biasing, in which a constitutive meaning was hinted at (Larry owns collections of Renaissance literature.); (N) neutral context, which allowed both interpretations.
Thirty-nine native speakers of English were asked to (i) rate each context-target sentence pair on a 1-5 scale (5 = fully understandable), and (ii) choose the possible interpretations of the target sentence given the context. The SI hypothesis predicts that comprehension is harder in the neutral condition than in the biasing conditions, because no contextual information is provided to resolve the ambiguity. Also, comprehenders should obtain the specific interpretation corresponding to the context type. Results: The (A) agentive-biasing and (C) constitutive-biasing conditions were rated higher than the (N) neutral condition (p < .001), while all conditions were within the acceptable range (> 3.5 on the 1-5 scale). This suggests that when relevant contextual information is lacking, semantic ambiguity decreases comprehensibility. The interpretation task shows that participants selected the biased agentive and constitutive readings in conditions (A) and (C), respectively. In the neutral condition, the agentive and constitutive readings were chosen equally often. Conclusion: These findings support the SI hypothesis: the meaning of AspV sentences is conceptualized as a parthood relation involving structured individuals. We argue that the semantic representation makes reference to spatial structured-ness (an abstracted axis). To obtain an appropriate interpretation, comprehenders utilize contextual information to enrich the conceptual representation of the sentence in question. This study connects semantic structure to human conceptual structure, and provides a processing model that incorporates contextual retrieval.
Keywords: ambiguity resolution, contextual retrieval, spatial structured-ness, structured individual
Procedia PDF Downloads 333
2584 Comparison of Processing Conditions for Plasticized PVC and PVB
Authors: Michael Tupý, Jaroslav Císař, Pavel Mokrejš, Dagmar Měřínská, Alice Tesaříková-Svobodová
Abstract:
A worldwide problem is that recycled PVB is widely disposed of in landfills. However, PVB has chemical properties very similar to those of PVC, and both are used in plasticized form. Thus, the thermal properties of plasticized PVC obtained from primary production and of PVB obtained by recycling windshields are compared, in order to determine degradation conditions and to decide whether a PVB/PVC blend can be processed together. The tested PVC contained 38% of the plasticizer diisononyl phthalate (DINP), and the PVB was plasticized with 28% of triethylene glycol bis(2-ethylhexanoate) (3GO). The thermal and thermo-oxidative decomposition of both vinyl polymers are compared by DSC and OOT analysis. A tensile strength analysis is added.
Keywords: polyvinyl chloride, polyvinyl butyral, recycling, reprocessing, thermal analysis, decomposition
Procedia PDF Downloads 515
2583 Large-Scale Simulations of Turbulence Using Discontinuous Spectral Element Method
Authors: A. Peyvan, D. Li, J. Komperda, F. Mashayek
Abstract:
Turbulence can be observed in a variety of fluid motions in nature and industrial applications. Recent investment in high-speed aircraft and propulsion systems has revitalized fundamental research on turbulent flows. In these systems, capturing chaotic fluid structures with different length and time scales is accomplished through the Direct Numerical Simulation (DNS) approach, since it accurately simulates flows down to the smallest dissipative scales, i.e., the Kolmogorov scales. The discontinuous spectral element method (DSEM) is a high-order technique that uses spectral functions for approximating the solution. The DSEM code has been developed by our research group over the course of more than two decades. Recently, the code has been improved to run large cases with on the order of billions of solution points. Running such large simulations requires a considerable amount of RAM; therefore, the DSEM code must be highly parallelized and able to start on multiple computational nodes of an HPC cluster with distributed memory. However, some pre-processing procedures, such as determining global element information, creating a global face list, and assigning global partitioning and element connection information of the domain for communication, must be done sequentially on a single processing core. A separate code has been written to perform these pre-processing procedures on a local machine. It stores the minimum amount of information required for the DSEM code to start in parallel, extracted from the mesh file, into text files (pre-files). It packs integer-type information in a stream binary format into pre-files that are portable between machines. The files are generated to ensure fast read performance on different file systems, such as Lustre and the General Parallel File System (GPFS). A new subroutine has been added to the DSEM code to read the startup files using parallel MPI I/O, for Lustre, in such a way that each MPI rank acquires its information from the file in parallel.
In the case of GPFS, on each computational node a single MPI rank reads the data from the file, which is generated specifically for that node, and sends it to the other ranks on the node using point-to-point non-blocking MPI communication. This way, communication takes place locally on each node and signals do not cross the switches of the cluster. The read subroutine has been tested on Argonne National Laboratory's Mira (GPFS), the National Center for Supercomputing Applications' Blue Waters (Lustre), the San Diego Supercomputer Center's Comet (Lustre), and UIC's Extreme (Lustre). The tests showed that one file per node is suited to GPFS, while parallel MPI I/O is the best choice for the Lustre file system. The DSEM code relies on heavily optimized linear algebra operations, such as matrix-matrix and matrix-vector products, for the calculation of the solution at every time step. For this, the code can make use of its own matrix math library, BLAS, Intel MKL, or ATLAS. This fact, together with the discontinuous nature of the method, makes the DSEM code run efficiently in parallel. The results of weak scaling tests performed on Blue Waters showed scalable and efficient parallel performance of the code.
Keywords: computational fluid dynamics, direct numerical simulation, spectral element, turbulent flow
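The pre-file idea above, packing integer-type information in a portable stream-binary format, can be sketched with fixed-layout records; the field layout below (element id, owning partition, face-neighbor ids) is illustrative, not the actual DSEM pre-file format:

```python
import struct

# Illustrative record: element id, owning partition, 4 face-neighbor ids,
# packed little-endian so the file is portable between machines.
RECORD = struct.Struct("<ii4i")

def write_prefile(path, elements):
    with open(path, "wb") as f:
        f.write(struct.pack("<i", len(elements)))      # record-count header
        for elem_id, part, neighbors in elements:
            f.write(RECORD.pack(elem_id, part, *neighbors))

def read_prefile(path):
    with open(path, "rb") as f:
        (count,) = struct.unpack("<i", f.read(4))
        out = []
        for _ in range(count):
            vals = RECORD.unpack(f.read(RECORD.size))
            out.append((vals[0], vals[1], list(vals[2:])))
        return out

elements = [(0, 0, [1, 2, -1, -1]), (1, 0, [0, 3, 2, -1])]
write_prefile("elements.pre", elements)
print(read_prefile("elements.pre"))
```

Because every record has a fixed size and an explicit byte order, each reader (an MPI rank on Lustre, or the designated rank per node on GPFS) can seek directly to its own slice of the file without parsing what comes before it.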
Procedia PDF Downloads 133