Search results for: Mandarin Chinese processing
2656 A Mathematical-Based Formulation of EEG Fluctuations
Authors: Razi Khalafi
Abstract:
The brain is the information-processing center of the human body. Stimuli, in the form of information, are transferred to the brain, which then decides how to respond to them. In this research, we propose a new partial differential equation which analyzes the EEG signals and establishes a relationship between the incoming stimuli and the brain's response to them. In order to test the proposed model, a set of external stimuli was applied to the model, and the model's outputs were checked against real EEG data. The results show that the model reproduces the EEG signal well. The proposed model is useful not only for modeling the EEG signal in the case of external stimuli but also for modeling the brain's response in the case of internal stimuli.
Keywords: brain, stimuli, partial differential equation, response, EEG signal
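As illustrative background (the abstract does not disclose the equation's form), a minimal sketch of the stimulus-response idea, assuming a driven damped diffusion equation u_t = D*u_xx - gamma*u + s(x, t) solved by explicit finite differences; all parameter values below are hypothetical:

```python
import numpy as np

# Hypothetical parameters: diffusion D, damping gamma, grid and time steps.
D, gamma = 1.0, 0.5
nx, nt, dx, dt = 100, 2000, 0.1, 0.004   # dt < dx^2 / (2*D) for stability

def stimulus(t, x_idx=50):
    """External stimulus: brief periodic pulses injected at one location."""
    s = np.zeros(nx)
    if int(t / dt) % 500 < 25:
        s[x_idx] = 5.0
    return s

u = np.zeros(nx)
trace = np.empty(nt)                      # simulated EEG at one "electrode"
for k in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    u = u + dt * (D * lap - gamma * u + stimulus(k * dt))
    trace[k] = u[30]                      # read out the response

print(trace[:10])  # such a trace would be checked against recorded EEG data
```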
Procedia PDF Downloads 434
2655 The Effective Use of the Network in the Distributed Storage
Authors: Mamouni Mohammed Dhiya Eddine
Abstract:
This work aims at studying the exploitation of high-speed cluster networks for distributed storage. Parallel applications running on clusters require both high-performance communications between nodes and efficient access to the storage system. Many studies on network technologies have led to the design of dedicated architectures for clusters with very fast communications between computing nodes. Efficient distributed storage in clusters has essentially been developed by adding parallelization mechanisms so that the server(s) may sustain an increased workload. In this work, we propose to improve the performance of distributed storage systems in clusters by efficiently using the underlying high-performance network to access distant storage systems. The main question we are addressing is: do high-speed cluster networks fit the requirements of transparent, efficient, and high-performance access to remote storage? We show that storage requirements are very different from those of parallel computation. High-speed cluster networks were designed to optimize communications between the different nodes of a parallel application. We study their utilization in a very different context: storage in clusters, where client-server models are generally used to access remote storage (for instance NFS, PVFS, or LUSTRE). Our experimental study, based on the GM programming interface of MYRINET high-speed networks for distributed storage, raised several interesting problems. Firstly, the specific memory utilization in the storage access system layers does not easily fit the traditional memory model of high-speed networks. Secondly, the client-server models used for distributed storage have specific requirements on message control and event processing which are not handled by existing interfaces. We propose different solutions to solve communication control problems at the filesystem level. We show that a modification of the network programming interface is required. Data transfer issues need an adaptation of the operating system. We detail several propositions for network programming interfaces which make their utilization easier in the context of distributed storage. The integration of flexible processing of data transfers in the new programming interface MYRINET/MX is finally presented. Performance evaluations show that its usage in the context of both storage and other types of applications is easy and efficient.
Keywords: distributed storage, remote file access, cluster, high-speed network, MYRINET, zero-copy, memory registration, communication control, event notification, application programming interface
Procedia PDF Downloads 222
2654 A Multi-Regional Structural Path Analysis of Virtual Water Flows Caused by Coal Consumption in China
Authors: Cuiyang Feng, Xu Tang, Yi Jin
Abstract:
Coal is the most important primary energy source in China and exerts a significant influence on the country's rapid economic growth. However, water resources have become a constraint on coal industry development, on account of the inverse geographical distribution of coal and water. To ease the pressure of water shortage, the '3 Red Lines' water policies were announced by the Chinese government, and the 'water for coal' plan was added to those policies in 2013. This study utilized a structural path analysis (SPA) based on the multi-regional input-output table to quantify the virtual water flows caused by coal consumption at different stages. Results showed that the direct water input (the first stage) was the largest among all stages of coal consumption, accounting for approximately 30% of total virtual water content. Regional analysis demonstrated that virtual water trade alleviated the pressure on water use for coal consumption in water-shortage areas, but the imported virtual water did not come from water-rich areas. Sectoral analysis indicated that the direct inputs from the sectors of 'production and distribution of electric power and heat power' and 'smelting and pressing of metals' accounted for the major virtual water flows, while the sectors of 'chemical industry' and 'manufacture of non-metallic mineral products' consumed water indirectly but substantially. With population and economic growth in China, the water demand-and-supply gap in coal consumption will become more remarkable. In addition to water-efficiency improvement measures, the central government should adjust its virtual water trade strategies to address local water scarcity issues. Water resources, as a main constraint, should be given full consideration in coal policy to promote the sustainable development of the coal industry.
Keywords: coal consumption, multi-regional input-output model, structural path analysis, virtual water
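Structural path analysis rests on expanding the Leontief inverse as a power series, so that water use can be attributed production stage by production stage. A minimal single-region sketch (the study itself is multi-regional) with hypothetical coefficients:

```python
import numpy as np

A = np.array([[0.1, 0.2],      # hypothetical technical-coefficient matrix
              [0.3, 0.1]])
f = np.array([0.0, 100.0])     # final demand: coal consumption only
w = np.array([5.0, 2.0])       # direct water intensity per unit output

# Leontief: x = (I - A)^-1 f = (I + A + A^2 + ...) f
# Virtual water embodied at production stage t is w . A^t f.
total = w @ np.linalg.solve(np.eye(2) - A, f)
stage, acc = f.copy(), 0.0
for t in range(6):
    vw = w @ stage
    acc += vw
    print(f"stage {t}: {vw:.2f} ({vw / total:.1%} of total)")
    stage = A @ stage
print(f"first 6 stages capture {acc / total:.1%} of {total:.2f}")
```

Stage 0 here corresponds to the "direct water input (the first stage)" singled out in the abstract.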
Procedia PDF Downloads 303
2653 EEG Diagnosis Based on Phase Space with Wavelet Transforms for Epilepsy Detection
Authors: Mohmmad A. Obeidat, Amjed Al Fahoum, Ayman M. Mansour
Abstract:
The recognition of abnormal activity in brain functionality is a vital issue. To determine the type of abnormal activity, either a brain image or a brain signal is usually considered. Imaging localizes the defect within the brain area and relates this area to specific body functions. However, some functions may be disturbed without the disturbance being visible in brain images, as in epilepsy. In this case, imaging may not reveal the symptoms of the problem. A cheaper yet efficient approach that can be utilized to detect abnormal activity is the measurement and analysis of electroencephalogram (EEG) signals. The main goal of this work is to develop a new method that facilitates the classification of abnormal and disordered activities within the brain directly through EEG signal processing, which makes it applicable in an on-line monitoring system.
Keywords: EEG, wavelet, epilepsy, detection
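The abstract names neither the wavelet family nor the phase-space parameters; the sketch below assumes a db4 4-level decomposition and a simple time-delay embedding, producing feature vectors that any classifier could consume:

```python
import numpy as np
import pywt

def features(eeg, wavelet="db4", level=4, delay=5, dim=3):
    """Wavelet-subband energies plus a crude phase-space descriptor."""
    coeffs = pywt.wavedec(eeg, wavelet, level=level)
    energies = [np.sum(c**2) for c in coeffs]          # one energy per subband
    # Time-delay embedding into a dim-dimensional phase space.
    n = len(eeg) - (dim - 1) * delay
    emb = np.column_stack([eeg[i * delay: i * delay + n] for i in range(dim)])
    attractor_size = np.mean(np.linalg.norm(emb - emb.mean(0), axis=1))
    return np.array(energies + [attractor_size])

rng = np.random.default_rng(1)
normal = rng.normal(size=1024)                          # stand-in for real EEG
spiky = normal + (rng.random(1024) < 0.02) * 8.0        # crude "epileptiform" spikes
print(features(normal))
print(features(spiky))    # feed such vectors to any on-line classifier
```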
Procedia PDF Downloads 538
2652 Electroforming of 3D Digital Light Processing Printed Sculptures Used as a Low Cost Option for Microcasting
Authors: Cecile Meier, Drago Diaz Aleman, Itahisa Perez Conesa, Jose Luis Saorin Perez, Jorge De La Torre Cantero
Abstract:
In this work, two ways of creating small-sized metal sculptures are proposed: the first by means of microcasting and the second by electroforming, from models printed in 3D using an FDM (Fused Deposition Modeling) printer or a DLP (Digital Light Processing) printer. It is viable to replace the wax in artistic foundry processes with 3D-printed objects. In this technique, the digital models are manufactured with a low-cost FDM 3D printer in polylactic acid (PLA). This material is used because its properties make it a viable substitute for wax within the processes of artistic casting with the lost-wax technique of Ceramic Shell casting. This technique consists of covering a sculpture made of wax, or in this case PLA, with several layers of thermoresistant material. This material is heated to melt out the PLA, leaving an empty mold that is later filled with the molten metal. It is verified that the PLA models reduce cost and time compared with hand modeling in wax. In addition, one can manufacture parts with 3D printing that are not possible to create with manual techniques. However, the sculptures created with this technique have a size limit. The problem is that when pieces printed in PLA are very small, they lose detail, and the laminar texture hides the shape of the piece. A DLP-type printer allows obtaining more detailed and smaller pieces than an FDM printer. Such small models are quite difficult and complex to melt using the lost-wax technique of Ceramic Shell casting. As alternatives, there are microcasting and electroforming, which specialize in creating small metal pieces such as jewelry. Microcasting is a variant of lost-wax casting that consists of introducing the model into a cylinder into which the refractory material is also poured. The molds are heated in an oven to melt out the model and fire the molds. Finally, the metal is poured into the still-hot cylinders, which rotate in a machine at high speed to distribute the metal properly. Because microcasting requires expensive materials and machinery to melt a piece of metal, electroforming is an alternative to this process. Electroforming uses models in different materials; for this study, micro-sculptures printed in 3D are used. These are subjected to an electroforming bath that covers the pieces with a very thin layer of metal. This work investigates the recommended sizes for using 3D printers, both with PLA and with resin, and first tests are being done to validate the electroforming of micro-sculptures printed in resin using a DLP printer.
Keywords: sculptures, DLP 3D printer, microcasting, electroforming, fused deposition modeling
Procedia PDF Downloads 136
2651 Optimal ECG Sampling Frequency for Multiscale Entropy-Based HRV
Authors: Manjit Singh
Abstract:
Multiscale entropy (MSE) is an extensively used index that provides a general understanding of the multiple complexity of the physiologic mechanisms of heart rate variability (HRV), which operate over a wide range of time scales. Accurate selection of the electrocardiogram (ECG) sampling frequency is an essential concern for clinically significant HRV quantification; a high ECG sampling rate increases memory requirements and processing time, whereas a low sampling rate degrades signal quality and results in clinically misinterpreted HRV. In this work, the impact of ECG sampling frequency on MSE-based HRV has been quantified. MSE measures are found to be sensitive to ECG sampling frequency, and the effect of sampling frequency is a function of time scale.
Keywords: ECG (electrocardiogram), heart rate variability (HRV), multiscale entropy, sampling frequency
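MSE coarse-grains the RR-interval series at each scale and computes sample entropy of the result; a common simplified implementation is sketched below on synthetic RR intervals. To reproduce the sampling-frequency effect studied here, the RR series would be re-derived from ECG digitized at different rates before being fed in.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    x = np.asarray(x, float)
    r = 0.2 * np.std(x) if r is None else r
    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(t) - 1):          # Chebyshev distance, no self-matches
            c += np.sum(np.max(np.abs(t[i + 1:] - t[i]), axis=1) <= r)
        return c
    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A and B else np.inf

def multiscale_entropy(rr, max_scale=10, m=2):
    out = []
    for s in range(1, max_scale + 1):
        n = len(rr) // s
        coarse = np.asarray(rr[: n * s]).reshape(n, s).mean(axis=1)  # coarse-grain
        out.append(sample_entropy(coarse, m))
    return out

rr = 0.8 + 0.05 * np.random.default_rng(2).standard_normal(1000)  # synthetic RR (s)
print(multiscale_entropy(rr, max_scale=5))
```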
Procedia PDF Downloads 271
2650 Effects of Acupuncture Treatment in Gait Parameters in Parkinson's Disease
Authors: Catarina Isabel Ramos Pereira, Jorge Machado, Begona Alonso Criado, Maria João Santos
Abstract:
Introduction: Gait disorders are among the symptoms that have severe implications for the quality of life in Parkinson's disease (PD). Currently, there is no therapy to reverse or treat this condition. None of the drugs used in conventional medical treatment is entirely efficient, and all have a high incidence of side effects. Acupuncture treatment is believed to improve motor ability, but there is still little scientific evidence for individuals with PD. Aim: The aim of the study is to investigate the acute effect of acupuncture on gait parameters in Parkinson's disease. Methods: This is a randomized and controlled crossover study. Each patient was part of both the experimental group (real acupuncture) and the control group (false/sham acupuncture), and the sequence was randomized. Gait parameters were measured at two different moments, before and after treatment, using four force platforms as well as 3D marker positions captured by 11 cameras. Images were quantitatively analyzed using Qualisys Track Manager software, which allowed us to extract data related to the quality of gait and balance. Seven patients with a diagnosis of Parkinson's disease were included in the study. Results: Statistically significant differences were found in gait speed (p = 0.016), gait cadence (p = 0.006), support base width (p = 0.0001), medio-lateral oscillation (p = 0.017), left-right step length (p = 0.0002), stride length right-right (p = 0.0000) and left-left (p = 0.0018), time of left support phase (p = 0.029), right support phase (p = 0.025), and double support phase (p = 0.015) between the initial and final moments for the experimental group. Differences in right-left stride length were found for both groups. Conclusion: Our results show that acupuncture could enhance gait in Parkinson's disease patients. Deeper research involving a larger number of volunteers should be carried out to validate these encouraging findings.
Keywords: acupuncture, traditional Chinese medicine, Parkinson's disease, gait
Procedia PDF Downloads 173
2649 Application of Signature Verification Models for Document Recognition
Authors: Boris M. Fedorov, Liudmila P. Goncharenko, Sergey A. Sybachin, Natalia A. Mamedova, Ekaterina V. Makarenkova, Saule Rakhimova
Abstract:
In modern economic conditions, the question of correctly recognizing a signature on digital documents, in order to verify an expression of will or confirm a certain operation, is highly relevant. Additional processing complexity lies in the dynamic variability of each individual's signature, as well as in the way the information is processed, because a signature is biometric data. The article discusses the use of artificial intelligence models to improve the quality of signature confirmation in document recognition. An analysis of several possible options for using such models is carried out. The results of the study show that it is possible to correctly determine the authenticity of a signature even on small samples.
Keywords: signature recognition, biometric data, artificial intelligence, neural networks
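The article does not name a specific architecture; one standard choice for verification from small samples is a siamese embedding network trained with a contrastive loss, sketched here in PyTorch. The input size, layer sizes, and margin are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SignatureEmbedder(nn.Module):
    """Maps a 1x64x64 signature image to a 32-d embedding."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(), nn.Linear(32 * 16 * 16, 32))
    def forward(self, x):
        return self.net(x)

def contrastive_loss(za, zb, same, margin=1.0):
    # Pull genuine pairs together, push forgeries beyond the margin.
    d = F.pairwise_distance(za, zb)
    return (same * d**2 + (1 - same) * F.relu(margin - d)**2).mean()

model = SignatureEmbedder()
a, b = torch.randn(8, 1, 64, 64), torch.randn(8, 1, 64, 64)  # stand-in pairs
same = torch.randint(0, 2, (8,)).float()   # 1 = genuine pair, 0 = forgery
loss = contrastive_loss(model(a), model(b), same)
loss.backward()                            # at test time, threshold d to verify
print(loss.item())
```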
Procedia PDF Downloads 149
2648 Predictive Value of Hepatitis B Core-Related Antigen (HBcrAg) during Natural History of Hepatitis B Virus Infection
Authors: Yanhua Zhao, Yu Gou, Shu Feng, Dongdong Li, Chuanmin Tao
Abstract:
The natural history of HBV infection can pass through the immune-tolerant (IT), immune-clearance (IC), HBeAg-negative inactive/quiescent carrier (ENQ), and HBeAg-negative hepatitis (ENH) phases. As current biomarkers for discriminating these four phases have some weaknesses, additional serological indicators are needed. Hepatitis B core-related antigen (HBcrAg), encoded by the precore/core gene, contains denatured HBeAg, HBV core antigen (HBcAg), and a 22 kDa precore protein (p22cr); it has been demonstrated to have a close association with the natural history of hepatitis B infection, but specific cutoff values and diagnostic parameters for evaluating its diagnostic efficacy are lacking. This study aimed to clarify the distribution of HBcrAg levels and evaluate its diagnostic performance across the natural history of infection from a Western Chinese perspective. 294 samples were collected from treatment-naïve chronic hepatitis B (CHB) patients in different phases (IT = 64; IC = 72; ENQ = 100; ENH = 58). We measured the HBcrAg values and analyzed the relationship between HBcrAg and HBV DNA. HBsAg and other clinical parameters were quantitatively tested. HBcrAg levels in the four phases were 9.30 log U/mL, 8.80 log U/mL, 3.00 log U/mL, and 5.10 log U/mL, respectively (p < 0.0001). Receiver operating characteristic curve analysis demonstrated that the areas under the curves (AUCs) of HBcrAg and quantitative HBsAg, at cutoff values of 9.25 log U/mL and 4.355 log IU/mL for distinguishing the IT from the IC phase, were 0.704 and 0.694, with sensitivities of 76.39% and 59.72% and specificities of 53.13% and 79.69%, respectively. The AUCs of HBcrAg and quantitative HBsAg, at cutoff values of 4.15 log U/mL and 2.395 log IU/mL for discriminating between the ENQ and ENH phases, were 0.931 and 0.653, with sensitivities of 87.93% and 84% and specificities of 91.38% and 39%, respectively. Therefore, HBcrAg levels varied significantly among the four natural phases of HBV infection. HBcrAg had a higher predictive performance than quantitative HBsAg for distinguishing between ENQ patients and ENH patients, and a performance similar to HBsAg for discriminating between the IT and IC phases, which indicates that HBcrAg could be a potential serological marker for CHB.
Keywords: chronic hepatitis B, hepatitis B core-related antigen, hepatitis B surface antigens, hepatitis B virus
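The kind of cutoff/AUC/sensitivity/specificity figures quoted above come from a standard ROC analysis; a sketch with synthetic marker values (scikit-learn), using the Youden index to pick the cutoff. The group means below are stand-ins, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(3)
# Synthetic log-scale HBcrAg values for two phases (stand-ins for ENQ vs ENH).
marker = np.concatenate([rng.normal(3.0, 1.0, 100),   # ENQ
                         rng.normal(5.1, 1.0, 58)])   # ENH
label = np.concatenate([np.zeros(100), np.ones(58)])

fpr, tpr, thresholds = roc_curve(label, marker)
best = np.argmax(tpr - fpr)                 # Youden index J = sens + spec - 1
print(f"AUC        : {roc_auc_score(label, marker):.3f}")
print(f"cutoff     : {thresholds[best]:.2f} log U/mL")
print(f"sensitivity: {tpr[best]:.1%}, specificity: {1 - fpr[best]:.1%}")
```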
Procedia PDF Downloads 420
2647 The Prospect of Producing Hydrogen by Electrolysis of Idle Discharges of Water from Reservoirs and Recycling of Waste-Gas Condensates
Authors: Inom Sh. Normatov, Nurmakhmad Shermatov, Rajabali Barotov, Rano Eshankulova
Abstract:
The results of studies on hydrogen production by water electrolysis and by plasma-chemical processing of gas condensate, a waste of natural gas production, are presented. During water electrolysis, a thin coating covers the electrode surfaces; therefore, the water for electrolysis was first subjected to electrosedimentation. The threshold voltage is shifted to a lower value compared with the use of stainless steel electrodes. In the electrolysis of electrosedimented water using stainless steel electrodes, a significant amount of hydrogen is formed. Pyrolysis of gas condensates in a nitrogen atmosphere was accompanied by the formation of acetylene (3-7 vol.%), ethylene (4-8 vol.%), and pyrolytic carbon (10-15 wt.%).
Keywords: electrolysis, gas condensate, hydrogen, pyrolysis
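As standard background (not stated in the abstract), the hydrogen yield of water electrolysis follows from the overall reaction and Faraday's law:

```latex
2\,\mathrm{H_2O} \longrightarrow 2\,\mathrm{H_2} + \mathrm{O_2},
\qquad
m_{\mathrm{H_2}} = \frac{I\,t\,M_{\mathrm{H_2}}}{z\,F},
\quad z = 2,\; F = 96\,485\ \mathrm{C\,mol^{-1}}
```

For example, a cell current of 10 A sustained for one hour yields roughly 10 x 3600 x 2.016 / (2 x 96485), about 0.38 g of hydrogen at 100% current efficiency.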
Procedia PDF Downloads 311
2646 Purification of Bacillus Lipopeptides for Diverse Applications
Authors: Vivek Rangarajan, Kim G. Clarke
Abstract:
Bacillus lipopeptides are biosurfactants with wide-ranging applications in the medical, food, agricultural, environmental, and cosmetic industries. They are produced as a mix of three families, surfactin, iturin, and fengycin, each comprising a large number of homologues of varying functionalities. Consequently, the method and degree of purification of the lipopeptide cocktail become particularly important if the functionality of the lipopeptide end-product is to be maximized for a specific application. However, downstream processing of Bacillus lipopeptides is particularly challenging due to the subtle variations observed among the different lipopeptide homologues and isoforms. To date, the most frequently used lipopeptide purification operations have been acid precipitation, solvent extraction, membrane ultrafiltration, adsorption, and size exclusion. RP-HPLC (reverse-phase high-pressure liquid chromatography) also has potential for fractionation of the lipopeptide homologues. In the studies presented here, membrane ultrafiltration and RP-HPLC were evaluated for lipopeptide purification to different degrees of purity for maximum functionality. Batch membrane ultrafiltration using 50 kDa polyether sulphone (PES) membranes resulted in lipopeptide recoveries of about 68% for surfactin and 82% for fengycin. The recovery was further improved to 95% by using size-conditioned lipopeptide micelles. Conditioning the lipopeptides with Ca2+ ions resulted in uniformly sized micelles with an average size of 96.4 nm and a polydispersity index of 0.18. The size conditioning also facilitated the removal of impurities (molecular weights ranging between 2335 and 3500 Da) through operation of the system in dia-filtration mode, in a way similar to salt removal from protein by dialysis. The resulting purified lipopeptide was devoid of macromolecular impurities and could ideally suit applications in the cosmetic and food industries. Enhanced purification using RP-HPLC was carried out in an analytical C18 column, with the aim of fractionating the lipopeptides into their constituent homologues. The column was eluted with a mobile phase comprising acetonitrile and water, with an acetonitrile gradient from 35% to 80% over 70 minutes. The gradient elution program resulted in as many as 41 fractions of individual lipopeptide homologues. Efficacy tests of these fractions against fungal phytopathogens showed that the first 21 fractions, identified as homologues of iturins and fengycins, displayed the greatest antifungal activities, suitable for biocontrol in the agricultural industry. Thus, in the current study, the downstream processing of lipopeptides leading to tailor-made products for selective applications was demonstrated using two major downstream unit operations.
Keywords: bacillus lipopeptides, membrane ultrafiltration, purification, RP-HPLC
Procedia PDF Downloads 205
2645 Proximate Composition, Minerals and Sensory Attributes of Cake, Cookies, Cracker, and Chin-Chin Prepared from Cassava-Gari Residue Flour
Authors: Alice Nwanyioma Ohuoba, Rose Erdoo Kukwa, Ukpabi Joseph Ukpabi
Abstract:
Cassava root (Manihot esculenta) is one of the important carbohydrate-containing crops in Nigeria. It is a staple food, mostly in the southern part of the country, and a source of income for farmers and processors. Cassava gari processing results in residue fiber (solid waste) from the sieving operation; these residue fibers can be dried and milled into flour and used to prepare cakes, cookies, crackers, and chin-chin instead of being thrown away, mostly on farmland or near residential areas. Flour for baking or frying may contain carbohydrates and protein (wheat flour) or be rich only in carbohydrates (cassava flour). Cake, cookies, crackers, and chin-chin were prepared using the residue flour obtained from the residue fiber of cassava variety NR87184 roots processed into gari. This study is aimed at evaluating the proximate composition, mineral content, and sensory attributes of these selected snacks. The proximate composition results showed that crackers had the lowest values for moisture (2.3390%) and fat (1.7130%) but the highest for carbohydrates (85.2310%). Among the food products, cakes recorded the highest value for protein (8.0910%). Crude fibre values range from 2.5265% (cookies) to 3.4165% (crackers). The mineral content results showed cookies ranking highest in phosphorus (65.8535 ppm), iron (0.1150 mg/L), calcium (1.3800 mg/L), and potassium (7.2850 mg/L) contents, while chin-chin and crackers were lowest in sodium (2.7000 mg/L). The food products were also subjected to sensory evaluation by thirty panelists using a 9-point hedonic scale ranging from 1 (dislike extremely) to 9 (like extremely). The mean scores show all the food products rating above 7.00 (above 'like moderately'). This study has shown that food products that may be functional or nutraceutical can be prepared from the residue flour. There is a call for the use of gluten-free flour in baking due to celiac disease and other allergies caused by gluten; therefore, gluten-free local carbohydrate food crops like cassava residue flour could be the solution. In addition, this could aid cassava gari processing waste management, thereby reducing post-harvest losses of cassava root.
Keywords: allergy, flour, food-products, gluten-free
Procedia PDF Downloads 157
2644 Processing and Characterization of (Pb0.55Ca0.45)(Fe0.5Nb0.5)O3 and (Pb0.45Ca0.55)(Fe0.5Nb0.5)O3 Dielectric Ceramics
Authors: Shalini Bahel, Maalti Puri, Sukhleen Bindra Narang
Abstract:
Ceramic samples of (Pb0.55Ca0.45)(Fe0.5Nb0.5)O3 and (Pb0.45Ca0.55)(Fe0.5Nb0.5)O3 were synthesized by the columbite precursor method and characterized for structural and dielectric properties. Both synthesized samples have a perovskite structure with tetragonal symmetry. The variations in relative permittivity and loss tangent were measured as functions of frequency at room temperature. Both the relative permittivity and the loss tangent decreased with increasing frequency. A reasonably high relative permittivity of 63.46, a loss tangent of 0.0067 at 15 MHz, and a temperature coefficient of relative permittivity of -82 ppm/˚C were obtained for (Pb0.45Ca0.55)(Fe0.5Nb0.5)O3.
Keywords: loss tangent, perovskite, relative permittivity, X-ray diffraction
Procedia PDF Downloads 270
2643 Private Coded Computation of Matrix Multiplication
Authors: Malihe Aliasgari, Yousef Nejatbakhsh
Abstract:
The era of Big Data and the immensity of real-life datasets compel computation tasks to be performed in a distributed fashion, where the data is dispersed among many servers that operate in parallel. However, massive parallelization leads to computational bottlenecks due to faulty servers and stragglers. Stragglers are a few slow or delay-prone processors that can bottleneck the entire computation, because one has to wait for all the parallel nodes to finish. The problem of straggling processors has been well studied in the context of distributed computing. Recently, it has been pointed out that, for the important case of linear functions, it is possible to improve over repetition strategies in terms of the tradeoff between performance and latency by carrying out linear precoding of the data prior to processing. The key idea is that, by employing suitable linear codes operating over fractions of the original data, a function may be completed as soon as a sufficient number of processors, depending on the minimum distance of the code, have completed their operations. Matrix-matrix multiplication over practically large data sets faces computational and memory-related difficulties, which makes it necessary to carry out such operations on distributed computing platforms. In this work, we study the problem of distributed matrix-matrix multiplication W = XY under storage constraints, i.e., when each server is allowed to store a fixed fraction of each of the matrices X and Y; this is a fundamental building block of many fields of science and engineering, such as machine learning, image and signal processing, wireless communication, and optimization. Both non-secure and secure matrix multiplication are studied. We consider the setup in which the identity of the matrix of interest should be kept private from the workers, and we derive the recovery threshold of the colluding model, that is, the number of workers that need to complete their task before the master server can recover the product W. We also consider the problem of secure and private distributed matrix multiplication W = XY in which the matrix X is confidential, while the matrix Y is selected in a private manner from a library of public matrices. We present the best currently known trade-off between communication load and recovery threshold. In other words, we design an achievable PSGPD scheme for any arbitrary privacy level by trivially concatenating a robust PIR scheme, for arbitrary colluding workers and private databases, with the proposed SGPD code, which provides a smaller computational complexity at the workers.
Keywords: coded distributed computation, private information retrieval, secret sharing, stragglers
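The PSGPD construction itself is not given in the abstract; the sketch below illustrates only the underlying non-secure, non-private polynomial-coding idea, under which any mn of the N workers' results suffice to recover W = XY by polynomial interpolation:

```python
import numpy as np

rng = np.random.default_rng(0)
m = n = 2                       # row-blocks of X, column-blocks of Y
deg = m * n                     # worker results needed for recovery
X = rng.normal(size=(4, 6)); Y = rng.normal(size=(6, 4))
Xb = np.split(X, m, axis=0); Yb = np.split(Y, n, axis=1)

pts = np.arange(1, 7, dtype=float)   # one evaluation point per worker (N = 6)
enc_X = [sum(Xb[j] * t**j for j in range(m)) for t in pts]
enc_Y = [sum(Yb[k] * t**(k * m) for k in range(n)) for t in pts]
results = {i: enc_X[i] @ enc_Y[i] for i in range(6)}   # each worker's product

done = [0, 2, 3, 5]   # pretend workers 1 and 4 straggle; any deg results suffice
V = np.vander(pts[done], deg, increasing=True)   # degree-(mn-1) interpolation
coeffs = np.linalg.solve(V, np.stack([results[i] for i in done]).reshape(deg, -1))
blocks = coeffs.reshape(deg, Xb[0].shape[0], Yb[0].shape[1])
W = np.block([[blocks[j + k * m] for k in range(n)] for j in range(m)])
assert np.allclose(W, X @ Y)     # product recovered despite the stragglers
```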
Procedia PDF Downloads 125
2642 Food Processing Role in Ensuring Food and Health Security
Authors: Muhammad Haseeb
Abstract:
It is crucial to have a balanced approach to food's energy and nutritional content in a world with limited resources. The preservation of the environment is vital, and both the agrifood manufacturing and food service sectors will be required to use fewer resources to produce a wider range of existing foods and to develop imaginative foods that are physiologically appropriate for a better sense of good health, have long shelf lives, and are conveniently transportable. Delivering healthy diets that satisfy consumer expectations from robust and sustainable agrifood systems is necessary in a world that is changing and where natural resources are running out. Across the whole food supply chain, an integrated multi-sectoral approach is needed to alleviate global food and nutrition insecurity.
Keywords: health, food, nutrition, supply chain
Procedia PDF Downloads 25
2641 Anthropomorphic Brand Mascot Serve as the Vehicle: To Quickly Remind Customers Who You Are and What You Stand for in Indian Cultural Context
Authors: Preeti Yadav, Dandeswar Bisoyi, Debkumar Chakrabati
Abstract:
For many years, organizations have been exercising the creative technique of applying brand mascots, which results in a visual 'ambassador' of a brand. The goal of mascots is not confined to strengthening brand identity and improving customer perception; they also act as a vehicle of anthropomorphic translation toward the consumer, helping to harness the power of recognition and to process the experiences of daily life. The study examines the relationship between specific mascot features and brand attitude. It finds that mascot trust is an important mediator of the effect of mascot features on brand attitude. Anthropomorphic characters turn out to be the key players in the application of brand mascots in today's marketing.
Keywords: advertising, mascot, branding, recall
Procedia PDF Downloads 336
2640 Technology for Good: Deploying Artificial Intelligence to Analyze Participant Response to Anti-Trafficking Education
Authors: Ray Bryant
Abstract:
3Strands Global Foundation (3SGF), a non-profit with a mission to mobilize communities to combat human trafficking through prevention education and reintegration programs, launched a groundbreaking study that demonstrates the usage and benefits of artificial intelligence in the war against human trafficking. Having gathered more than 30,000 stories from counselors and school staff who have gone through its PROTECT Prevention Education program, 3SGF sought to develop a methodology to measure the effectiveness of the training, which helps educators and school staff identify physical signs and behaviors indicating a student is being victimized. The program further illustrates how to recognize and respond to trauma and teaches the steps to take to report human trafficking, as well as how to connect victims with the proper professionals. 3SGF partnered with Levity, a leader in no-code artificial intelligence (AI) automation, to create the research study, utilizing natural language processing, a branch of artificial intelligence, to measure the effectiveness of the prevention education program. By applying the logic created for the study, the platform analyzed and categorized each story. If the story, directly from the educator, demonstrated one or more of the desired outcomes (increased awareness, increased knowledge, or intended behavior change), a label was applied. The system then added a confidence level for each identified label. The study results were generated with a 99% confidence level. Preliminary results show that, of the 30,000 stories gathered, it became overwhelmingly clear that a significant majority of the participants now have increased awareness of the issue, demonstrated better knowledge of how to help prevent the crime, and expressed an intention to change how they approach what they do daily. In addition, approximately 30% of the stories involved comments by educators expressing that they wish they had had this knowledge sooner, as they can think of many students they would have been able to help. Objectives of research: to solve the problem of needing to analyze and accurately categorize more than 30,000 data points of participant feedback in order to evaluate the success of a human trafficking prevention program, using AI and natural language processing. Methodologies used: in conjunction with our strategic partner, Levity, we created our own NLP analysis engine specific to our problem. Contributions to research: the intersection of AI and human rights, and how to utilize technology to combat human trafficking.
Keywords: AI, technology, human trafficking, prevention
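Levity's engine is proprietary; a minimal stand-in for outcome labeling with confidence scores, using TF-IDF features and logistic regression in scikit-learn. The labels and training stories below are illustrative, not from the study.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set; a real system would use annotated stories.
stories = ["I now notice warning signs I used to miss",
           "I learned the steps for reporting trafficking",
           "I will change how I check in with students",
           "The assembly ran long and the room was hot"]
labels = ["Increased Awareness", "Increased Knowledge",
          "Intended Behavior Change", "No Outcome"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(stories, labels)

new = ["After the training I know exactly who to call to report a case"]
probs = clf.predict_proba(new)[0]
for cls, p in sorted(zip(clf.classes_, probs), key=lambda t: -t[1]):
    print(f"{cls}: {p:.2f}")    # a label is applied if p clears a threshold
```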
Procedia PDF Downloads 60
2639 The Folk Influences in the Melody of Romanian and Serbian Church Music
Authors: Eudjen Cinc
Abstract:
The common Byzantine origins of the church music of Serbs and Romanians are certainly not the only reason for the great similarities between the ways of singing of the two nations, especially in the region of Banat. If they were, the differences between the interpretation of church music in this part of the Orthodox world and in other parts where Serbs or Romanians live could not be explained. What is it that connects the church singing of the two nations in this peaceful part of Europe to such an extent that it can be considered a comprehensive corpus, distinct from other 'Serbian' or 'Romanian' regions? This is the main issue dealt with in the text, through examples and comparative processing of the material. The main aim of the paper is the presentation of the new and interesting, while its value lies in its potential to encourage the reader or a future researcher to investigate and search further.
Keywords: folk influences, melody, melodic models, ethnomusicology
Procedia PDF Downloads 254
2638 Research on Static and Dynamic Behavior of New Combination of Aluminum Honeycomb Panel and Rod Single-Layer Latticed Shell
Authors: Xu Chen, Zhao Caiqi
Abstract:
In addition to the advantages of light weight, corrosion resistance, and ease of processing, aluminum is also applied in long-span spatial structures. However, the elastic modulus of aluminum is lower than that of steel. This paper combines high-performance aluminum honeycomb panels with an aluminum latticed shell, forming a new panel-and-rod composite shell structure. Comparative analysis of the static and dynamic performance leads to the conclusion that the composite shell structure is noticeably superior to the original rod-only structure.
Keywords: combination of aluminum honeycomb panel, rod latticed shell, dynamic performance, response spectrum analysis, seismic properties
Procedia PDF Downloads 475
2637 Analysis of Operation System Reorganization for Load Balancing of Parcel Sorting
Authors: J. H. Lee
Abstract:
As internet and smartphone use increases, e-commerce is constantly growing, and therefore parcel volume increases continuously every year. If a volume larger than the processing capacity of the current facilities is received, it cannot be processed, and delivery quality deteriorates. In this paper, therefore, we compare, from a cost perspective, the case of building a new facility for the increasing parcel volumes with the case of reorganizing the current operating system. We propose an optimal per-parcel discount policy by calculating the construction cost of the new automated facility, the cost of operating the manual facilities until the new automated facility is built, and the resulting discount price.
Keywords: system reorganization, load balancing, parcel sorting, discount policy
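A toy version of the cost comparison behind such a discount policy; all figures below are hypothetical and serve only to show the break-even logic:

```python
# Hypothetical costs: compare building now vs. absorbing manual handling costs
# and discounting customers until the automated facility would be finished.
build_cost = 5_000_000            # new automated facility
manual_extra_per_parcel = 0.40    # added handling cost above current capacity
overflow_parcels_per_year = 2_000_000
years_until_built = 2

for discount in (0.10, 0.20, 0.30):   # per-parcel discount offered to shift demand
    interim = overflow_parcels_per_year * years_until_built * (
        manual_extra_per_parcel + discount)
    verdict = "cheaper" if interim < build_cost else "dearer"
    print(f"discount {discount:.2f}: interim cost {interim:,.0f} "
          f"({verdict} than building now)")
```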
Procedia PDF Downloads 270
2636 Examining the Development of Complexity, Accuracy and Fluency in L2 Learners' Writing after L2 Instruction
Authors: Khaled Barkaoui
Abstract:
Research on second-language (L2) learning tends to focus on comparing students with different levels of proficiency at one point in time. However, to understand L2 development, we need more longitudinal research. In this study, we adopt a longitudinal approach to examine changes in three indicators of L2 ability, namely complexity, accuracy, and fluency (CAF), as reflected in the writing of L2 learners on different tasks before and after a period of L2 instruction. Each of 85 Chinese learners of English at three levels of English language proficiency responded to two writing tasks (independent and integrated) before and after nine months of English-language study in China. Each essay (N = 276) was analyzed in terms of numerous CAF indices using both computer coding and human rating: number of words written, number of errors per 100 words, ratings of error severity, global syntactic complexity (MLS), complexity by coordination (T/S), complexity by subordination (C/T), clausal complexity (MLC), phrasal complexity (NP density), syntactic variety, lexical density, lexical variation, lexical sophistication, and lexical bundles. Results were then compared statistically across tasks, L2 proficiency levels, and time. Overall, task type had significant effects on fluency, on some syntactic complexity indices (complexity by coordination, structural variety, clausal complexity, phrasal complexity), and on lexical density, sophistication, and bundles, but not on accuracy. L2 proficiency had significant effects on fluency, accuracy, and lexical variation, but not on syntactic complexity. Finally, fluency and frequency of errors, but not accuracy ratings, syntactic complexity indices (clausal complexity, global complexity, complexity by subordination, phrasal complexity, structural variety), or lexical complexity (lexical density, variation, and sophistication), exhibited significant changes after instruction, particularly for the independent task. We discuss the findings and their implications for assessment, instruction, and research on CAF in the context of L2 writing.
Keywords: second language writing, fluency, accuracy, complexity, longitudinal
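Heavily simplified versions of a few of the indices named above (fluency as word count, MLS as words per sentence, lexical variation as a type-token ratio); the study's full battery relied on dedicated coding tools and human rating, not shown here:

```python
import re

def caf_indices(essay: str) -> dict:
    words = re.findall(r"[A-Za-z']+", essay.lower())
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    return {
        "fluency_words": len(words),                   # number of words written
        "MLS": len(words) / max(len(sentences), 1),    # mean length of sentence
        "lexical_variation_TTR": len(set(words)) / max(len(words), 1),
    }

print(caf_indices("The model works. It works well because we tested it twice."))
```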
Procedia PDF Downloads 153
2635 Evaluating Gene-Gene Interaction among Nicotine Dependence Genes on the Risk of Oral Clefts
Authors: Mengying Wang, Dongjing Liu, Holger Schwender, Ping Wang, Hongping Zhu, Tao Wu, Terri H Beaty
Abstract:
Background: Maternal smoking is a recognized risk factor for nonsyndromic cleft lip with or without cleft palate (NSCL/P). It has been reported that the effect of maternal smoking on oral clefts is mediated through genes that influence nicotine dependence. The polymorphisms of the cholinergic receptor nicotinic alpha (CHRNA) and beta (CHRNB) subunit genes have previously shown strong associations with nicotine dependence. Here, we investigated whether the above genes are associated with clefting risk by testing for potential gene-gene (G×G) and gene-environment (G×E) interactions. Methods: We selected 120 markers in 14 genes associated with nicotine dependence to conduct transmission disequilibrium tests among 806 Chinese NSCL/P case-parent trios ascertained through an international consortium which conducted a genome-wide association study (GWAS) of oral clefts. We applied Cordell's method, using the 'TRIO' package in R, to explore G×G interactions as well as G×E interactions involving environmental tobacco smoke (ETS), based on a conditional logistic regression model. Results: While no SNP showed a significant association with NSCL/P after Bonferroni correction, we found signals for G×G interaction between 10 pairs of SNPs in CHRNA3, CHRNA5, and CHRNB4 (p < 10⁻⁸), among which the most significant interaction was found between RS3743077 (CHRNA3) and RS11636753 (CHRNB4, p < 8.2×10⁻¹²). Linkage disequilibrium (LD) analysis revealed only a low level of LD between these markers. However, there were no significant results for G×ETS interaction. Conclusion: This study fails to detect an association between nicotine dependence genes and NSCL/P, but it illustrates the importance of taking potential G×G interactions into account in genetic association analyses of NSCL/P. It also suggests that nicotine dependence genes should be considered important candidate genes for NSCL/P in future studies.
Keywords: gene-gene interaction, maternal smoking, nicotine dependence, non-syndromic cleft lip with or without cleft palate
Procedia PDF Downloads 338
2634 Functional Neurocognitive Imaging (fNCI): A Diagnostic Tool for Assessing Concussion Neuromarker Abnormalities and Treating Post-Concussion Syndrome in Mild Traumatic Brain Injury Patients
Authors: Parker Murray, Marci Johnson, Tyson S. Burnham, Alina K. Fong, Mark D. Allen, Bruce McIff
Abstract:
Purpose: Pathological dysregulation of neurovascular coupling (NVC) caused by mild traumatic brain injury (mTBI) is the predominant source of chronic post-concussion syndrome (PCS) symptomology. fNCI has the ability to localize dysregulation in NVC by measuring blood-oxygen-level-dependent (BOLD) signaling during the performance of fMRI-adapted neuropsychological evaluations. With fNCI, 57 brain areas consistently affected by concussion were identified as PCS neural markers, which were validated on large samples of concussion patients and healthy controls. These neuromarkers provide the basis for a computation of PCS severity referred to as the Severity Index Score (SIS). The SIS has proven valuable in making pre-treatment decisions, monitoring treatment efficiency, and assessing the long-term stability of outcomes. Methods and Materials: After being scanned while performing various cognitive tasks, 476 concussed patients received an SIS score based on the neural dysregulation of the 57 previously identified brain regions. These scans provide an objective measurement of attentional, subcortical, visual processing, language processing, and executive functioning abilities, which were used as biomarkers for post-concussive neural dysregulation. Initial SIS scores were used to develop individualized therapy incorporating cognitive, occupational, and neuromuscular modalities. These scores were also used to establish pre-treatment benchmarks and measure post-treatment improvement. Results: Changes in SIS were calculated as percent change from pre- to post-treatment. Patients showed a mean improvement of 76.5 percent (σ = 23.3), and 75.7 percent of patients showed at least 60 percent improvement. Longitudinal reassessment of 24 of the patients, measured an average of 7.6 months post-treatment, shows that the SIS improvement is maintained and extended, with an average of 90.6 percent improvement over their original scan. Conclusions: fNCI provides a reliable measurement of NVC, allowing for the identification of concussion pathology. Additionally, fNCI-derived SIS scores direct tailored therapy to restore NVC, subsequently resolving chronic PCS resulting from mTBI.
Keywords: concussion, functional magnetic resonance imaging (fMRI), neurovascular coupling (NVC), post-concussion syndrome (PCS)
Procedia PDF Downloads 362
2633 Tele-Monitoring and Logging of Patient Health Parameters Using Zigbee
Authors: Kirubasankar, Sanjeevkumar, Aravindh Nagappan
Abstract:
This paper addresses a system for monitoring patients using biomedical sensors and displaying the readings at a remote location. The main shortcomings of present health monitoring devices are the lack of remote monitoring and of logging for future evaluation. Typical instruments used for health parameter measurement provide only basic information regarding health status. This paper identifies a set of design principles to address these challenges. The system includes continuous measurement of health parameters such as heart rate, electrocardiogram, SpO2 level, and body temperature. The accumulated sensor data is relayed to a processing device using a transceiver and viewed through cloud services.
Keywords: bio-medical sensors, monitoring, logging, cloud service
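The paper's hardware chain (sensors to ZigBee transceiver to processing device to cloud) is not detailed in the abstract; a heavily simplified receiver-side sketch follows, assuming the transceiver delivers comma-separated readings over a serial port and that some HTTP logging endpoint exists. The port name, frame format, and URL are all assumptions.

```python
import time

import requests
import serial  # pyserial

ENDPOINT = "https://example.org/api/vitals"   # hypothetical cloud logging URL
port = serial.Serial("/dev/ttyUSB0", 9600, timeout=2)

while True:
    line = port.readline().decode(errors="ignore").strip()
    try:  # assumed frame: heart_rate,spo2,temperature
        hr, spo2, temp = map(float, line.split(","))
    except ValueError:
        continue                               # skip malformed frames
    record = {"ts": time.time(), "hr": hr, "spo2": spo2, "temp": temp}
    requests.post(ENDPOINT, json=record, timeout=5)   # log for later evaluation
```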
Procedia PDF Downloads 522
2632 Apatite Flotation Using Fruits' Oil as Collector and Sorghum as Depressant
Authors: Elenice Maria Schons Silva, Andre Carlos Silva
Abstract:
The growing demand for raw materials has increased mining activity. The mineral industry faces the challenge of processing more complex ores, with very small particles and low grades, together with constant pressure to reduce production costs and environmental impacts. Froth flotation deserves special attention among the concentration methods for mineral processing. Besides its great selectivity for different minerals, flotation is a highly efficient method for processing fine particles. The process is based on the surface physicochemical properties of the minerals, and separation is only possible with the aid of chemicals such as collectors, frothers, modifiers, and depressants. In order to use sustainable and eco-friendly reagents, oils extracted from three different vegetable species (pequi pulp, macauba nut and pulp, and Jatropha curcas) were studied and tested as apatite collectors. Since the oils are not soluble in water, alkaline hydrolysis (saponification) was necessary before their contact with the minerals. The saponification was performed at room temperature. The tests with the new collectors were carried out at pH 9, and Flotigam 5806, a synthetic mix of fatty acids industrially adopted as an apatite collector and manufactured by Clariant, was used as the benchmark. In order to find a feasible replacement for cornstarch, the flour and starch of a graniferous variety of sorghum were tested as depressants. Apatite samples were used in the flotation tests. XRF (X-ray fluorescence), XRD (X-ray diffraction), and SEM/EDS (scanning electron microscopy with energy-dispersive spectroscopy) were used to characterize the apatite samples. Zeta potential measurements were performed over the pH range from 3.5 to 12.5. A commercial cornstarch was used as the depressant benchmark. Four depressant dosages and pH values were tested. A statistical test was used to verify the influence of pH, dosage, and starch type on mineral recovery. On the one hand, for dosages equal to or higher than 7.5 mg/L, pequi oil recovered almost all apatite particles. On the other hand, macauba pulp oil showed excellent results at all dosages, with more than 90% of apatite recovered, whereas with the nut oil the highest recovery found was around 84%. Jatropha curcas oil was the second-best oil tested, and more than 90% of the apatite particles were recovered at a dosage of 7.5 mg/L. Regarding the depressants, the lowest apatite recovery with sorghum starch was found at a dosage of 1,200 g/t and pH 11, resulting in a recovery of 1.99%. The apatite recovery under the same conditions was 1.40% for sorghum flour (approximately 30% lower). When compared with cornstarch under the same conditions, sorghum flour produced an apatite recovery 91% lower.
Keywords: collectors, depressants, flotation, mineral processing
Procedia PDF Downloads 154
2631 A Low-Area Fully-Reconfigurable Hardware Design of Fast Fourier Transform System for 3GPP-LTE Standard
Authors: Xin-Yu Shih, Yue-Qu Liu, Hong-Ru Chou
Abstract:
This paper presents a low-area and fully-reconfigurable Fast Fourier Transform (FFT) hardware design for the 3GPP-LTE communication standard. It fully supports 32 different FFT sizes, up to 2048 FFT points. In addition, a special processing element is developed to make reconfigurable computing possible, while a first-in first-out (FIFO) scheduling scheme is proposed for hardware-friendly FIFO resource arrangement. In a synthesized chip realization in TSMC 40 nm CMOS technology, the hardware circuit occupies a core area of only 0.2325 mm2 and dissipates 233.5 mW at a maximum operating frequency of 250 MHz.
Keywords: reconfigurable, fast Fourier transform (FFT), single-path delay feedback (SDF), 3GPP-LTE
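As a behavioral reference of the kind used to verify such hardware, a size-configurable iterative radix-2 FFT is sketched below; the paper's SDF pipeline and its 32 LTE sizes are realized in RTL and are not reproduced here.

```python
import numpy as np

def fft_radix2(x):
    """Iterative radix-2 decimation-in-time FFT for any power-of-two size."""
    x = np.asarray(x, complex).copy()
    n = len(x)
    # Bit-reversal permutation.
    j = 0
    for i in range(1, n):
        bit = n >> 1
        while j & bit:
            j ^= bit
            bit >>= 1
        j |= bit
        if i < j:
            x[i], x[j] = x[j], x[i]
    # Butterfly stages.
    size = 2
    while size <= n:
        w = np.exp(-2j * np.pi / size)
        for start in range(0, n, size):
            wk = 1.0
            for k in range(size // 2):
                a, b = x[start + k], wk * x[start + k + size // 2]
                x[start + k], x[start + k + size // 2] = a + b, a - b
                wk *= w
        size *= 2
    return x

for n in (128, 2048):            # two of the LTE-relevant sizes
    sig = np.random.default_rng(4).normal(size=n)
    assert np.allclose(fft_radix2(sig), np.fft.fft(sig))
```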
Procedia PDF Downloads 279
2630 Parallel 2-Opt Local Search on GPU
Authors: Wen-Bao Qiao, Jean-Charles Créput
Abstract:
To accelerate the solution of large-scale traveling salesman problems (TSP), a parallel 2-opt local search algorithm with a simple implementation based on the Graphics Processing Unit (GPU) is presented and tested in this paper. The parallel scheme is based on data decomposition: multiple (K) processors are dynamically assigned along the integral tour to treat K edges' 2-opt local optimization simultaneously on independent sub-tours, where K can be user-defined or given as a function of the input size N. We implement this algorithm with a doubly linked list on the GPU. The implementation requires only O(N) memory. We compare this parallel 2-opt local optimization against a sequential exhaustive 2-opt search along the integral tour on TSP instances from TSPLIB with more than 10,000 cities.
Keywords: parallel 2-opt, double links, large scale TSP, GPU
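The sequential exhaustive 2-opt baseline referred to is sketched below; the paper's contribution is running many such edge exchanges in parallel on the GPU over independent sub-tours, which this CPU sketch does not attempt.

```python
import numpy as np

def tour_length(pts, tour):
    return sum(np.linalg.norm(pts[tour[i]] - pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(pts, tour):
    """Sequential exhaustive 2-opt: reverse a segment whenever it shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                a, b = tour[i - 1], tour[i]
                c, d = tour[j], tour[(j + 1) % len(tour)]
                delta = (np.linalg.norm(pts[a] - pts[c]) +
                         np.linalg.norm(pts[b] - pts[d]) -
                         np.linalg.norm(pts[a] - pts[b]) -
                         np.linalg.norm(pts[c] - pts[d]))
                if delta < -1e-12:
                    tour[i:j + 1] = tour[i:j + 1][::-1]   # the 2-opt move
                    improved = True
    return tour

pts = np.random.default_rng(5).random((60, 2))
tour = list(range(60))
print(tour_length(pts, tour), "->", tour_length(pts, two_opt(pts, tour)))
```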
Procedia PDF Downloads 629
2629 Teachers' Perceptions of Physical Education and Sports Calendar and Conducted in the Light of the Objective of the Lesson Approach Competencies
Authors: Chelali Mohammed
Abstract:
In the context of the application of the competency-based approach in the Algerian educational system, the physical education and sport lesson must privilege the acquisition of learning approaches, especially the scientific approach, which, starting from problem situations, develops research, information processing, and the application of knowledge and know-how in new situations, in the words of John Dewey, 'learning by practice'. To achieve these goals and make the teaching of physical education and sport more motivating, consistent, and concrete, it is appropriate to adopt a pedagogical approach freed from constraints, open to creativity, and centered on the student, in the light of the competency-based approach adopted in the formal curriculum. This approach is not unusual, but we think it is of a highly professional nature and requires competence on the part of the teacher.
Keywords: approach competencies, physical, education, teachers
Procedia PDF Downloads 605
2628 Biofilm Text Classifiers Developed Using Natural Language Processing and Unsupervised Learning Approach
Authors: Kanika Gupta, Ashok Kumar
Abstract:
Biofilms are dense, highly hydrated cell clusters that are irreversibly attached to a substratum, to an interface, or to each other, and are embedded in a self-produced gelatinous matrix composed of extracellular polymeric substances. Research in the biofilm field has become very significant, as biofilms have shown high mechanical resilience and resistance to antibiotic treatment and constitute a significant problem in both healthcare and other industries related to microorganisms. The massive information, both stated and hidden, in the biofilm literature is growing exponentially; therefore, it is not possible for researchers and practitioners to automatically extract and relate information from different written resources. So, the current work proposes and discusses the use of text mining techniques for the extraction of information from a biofilm literature corpus containing 34,306 documents. It is very difficult and expensive to obtain annotated material for biomedical literature, as the literature is unstructured, i.e., free text. Therefore, we considered an unsupervised approach, where no annotated training data is necessary, and using this approach we developed a system that classifies the text on the basis of growth and development, drug effects, radiation effects, classification, and physiology of biofilms. For this, a two-step structure was used: the first step is to extract keywords from the biofilm literature using a metathesaurus and standard natural language processing tools, such as Rapid Miner_v5.3, and the second step is to discover relations between the genes extracted from the whole set of biofilm literature using pubmed.mineR_v1.0.11. We applied the unsupervised approach, which is the machine learning task of inferring a function to describe hidden structure from 'unlabeled' data, to the above-extracted datasets to develop classifiers using WinPython-64 bit_v3.5.4.0Qt5 and R studio_v0.99.467 packages, which automatically classify the text using the mentioned sets. The developed classifiers were tested on a large data set of biofilm literature, which showed that the proposed unsupervised approach is promising as well as suited for semi-automatic labeling of the extracted relations. The entire information was stored in a relational database hosted locally on the server. The generated biofilm vocabulary and gene relations will be significant for researchers dealing with biofilm research, making their search easy and efficient, as the keywords and genes can be directly mapped to the documents used for database development.
Keywords: biofilms literature, classifiers development, text mining, unsupervised learning approach, unstructured data, relational database
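A minimal stand-in for the unsupervised step (the study itself used Rapid Miner and pubmed.mineR): TF-IDF vectors clustered with k-means, with the top terms per cluster serving as provisional topic labels. The five toy documents are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

docs = ["biofilm growth on catheter surfaces under shear",
        "antibiotic tolerance and drug effects in mature biofilms",
        "uv radiation effects on biofilm viability",
        "growth and development stages of pseudomonas biofilms",
        "efflux pumps mediate drug resistance in biofilm cells"]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

terms = np.array(vec.get_feature_names_out())
for c in range(3):
    top = km.cluster_centers_[c].argsort()[::-1][:3]   # most central terms
    members = [i for i, l in enumerate(km.labels_) if l == c]
    print(f"cluster {c}: {', '.join(terms[top])} -> docs {members}")
```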
Procedia PDF Downloads 173
2627 Advanced Techniques in Semiconductor Defect Detection: An Overview of Current Technologies and Future Trends
Authors: Zheng Yuxun
Abstract:
This review critically assesses the advancements and prospective developments in defect detection methodologies within the semiconductor industry, an essential domain that significantly affects the operational efficiency and reliability of electronic components. As semiconductor devices continue to decrease in size and increase in complexity, the precision and efficacy of defect detection strategies become increasingly critical. Tracing the evolution from traditional manual inspections to the adoption of advanced technologies employing automated vision systems, artificial intelligence (AI), and machine learning (ML), the paper highlights the significance of precise defect detection in semiconductor manufacturing. It discusses various defect types, such as crystallographic errors, surface anomalies, and chemical impurities, which profoundly influence the functionality and durability of semiconductor devices, underscoring the necessity of their precise identification. The narrative then turns to the technological evolution in defect detection, depicting a shift from rudimentary methods, like optical microscopy and basic electronic tests, to more sophisticated techniques, including electron microscopy, X-ray imaging, and infrared spectroscopy. The incorporation of AI and ML marks a pivotal advancement toward more adaptive, accurate, and expedited defect detection mechanisms. The paper addresses current challenges, particularly the constraints imposed by the diminutive scale of contemporary semiconductor devices, the elevated costs associated with advanced imaging technologies, and the demand for rapid processing that aligns with mass production standards. A critical gap is identified between the capabilities of existing technologies and the industry's requirements, especially concerning scalability and processing speed. Future research directions are proposed to bridge these gaps, suggesting enhancements in the computational efficiency of AI algorithms, the development of novel materials to improve imaging contrast in defect detection, and the seamless integration of these systems into semiconductor production lines. By offering a synthesis of existing technologies and forecasting upcoming trends, this review aims to foster the dialogue and development of more effective defect detection methods, thereby facilitating the production of more dependable and robust semiconductor devices. This thorough analysis not only elucidates the current technological landscape but also paves the way for forthcoming innovations in semiconductor defect detection.
Keywords: semiconductor defect detection, artificial intelligence in semiconductor manufacturing, machine learning applications, technological evolution in defect analysis
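One common ML formulation in this area is anomaly detection trained only on defect-free images; a sketch using PCA reconstruction error on flattened image patches follows. The data is synthetic and the 3-sigma threshold is a hypothetical choice, not a method from the review.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
normal = rng.normal(0.0, 0.1, size=(500, 64))   # flattened 8x8 defect-free patches
test_ok = rng.normal(0.0, 0.1, size=(5, 64))
test_bad = test_ok.copy()
test_bad[:, 10:14] += 2.0                        # synthetic "defects"

pca = PCA(n_components=8).fit(normal)            # model of normal appearance

def score(patches):
    recon = pca.inverse_transform(pca.transform(patches))
    return np.mean((patches - recon) ** 2, axis=1)   # reconstruction error

threshold = score(normal).mean() + 3 * score(normal).std()
print("ok  :", score(test_ok) > threshold)        # expected: all False
print("bad :", score(test_bad) > threshold)       # expected: all True
```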
Procedia PDF Downloads 53