Search results for: relational processing
2561 A Mathematical-Based Formulation of EEG Fluctuations
Authors: Razi Khalafi
Abstract:
The brain is the information processing center of the human body. Stimuli, in the form of information, are transferred to the brain, which then decides how to respond to them. In this research, we propose a new partial differential equation that analyzes EEG signals and establishes a relationship between incoming stimuli and the brain's response to them. To test the proposed model, a set of external stimuli was applied to it, and the model's outputs were checked against real EEG data. The results show that the model reproduces the EEG signal well. The proposed model is useful not only for modeling the EEG signal in the case of external stimuli but also for modeling the brain's response in the case of internal stimuli.
Keywords: brain, stimuli, partial differential equation, response, EEG signal
Procedia PDF Downloads 433
2560 The Effective Use of the Network in the Distributed Storage
Authors: Mamouni Mohammed Dhiya Eddine
Abstract:
This work aims at studying the exploitation of high-speed networks of clusters for distributed storage. Parallel applications running on clusters require both high-performance communications between nodes and efficient access to the storage system. Many studies on network technologies led to the design of dedicated architectures for clusters with very fast communications between computing nodes. Efficient distributed storage in clusters has been essentially developed by adding parallelization mechanisms so that the server(s) may sustain an increased workload. In this work, we propose to improve the performance of distributed storage systems in clusters by efficiently using the underlying high-performance network to access distant storage systems. The main question we are addressing is: do high-speed networks of clusters fit the requirements of a transparent, efficient and high-performance access to remote storage? We show that storage requirements are very different from those of parallel computation. High-speed networks of clusters were designed to optimize communications between different nodes of a parallel application. We study their utilization in a very different context, storage in clusters, where client-server models are generally used to access remote storage (for instance NFS, PVFS or LUSTRE). Our experimental study based on the usage of the GM programming interface of MYRINET high-speed networks for distributed storage raised several interesting problems. Firstly, the specific memory utilization in the storage access system layers does not easily fit the traditional memory model of high-speed networks. Secondly, client-server models that are used for distributed storage have specific requirements on message control and event processing, which are not handled by existing interfaces. We propose different solutions to solve communication control problems at the filesystem level. We show that a modification of the network programming interface is required. 
Data transfer issues need an adaptation of the operating system. We detail several proposals for network programming interfaces that make their utilization easier in the context of distributed storage. The integration of flexible data-transfer processing into the new MYRINET/MX programming interface is finally presented. Performance evaluations show that its usage in the context of both storage and other types of applications is easy and efficient.
Keywords: distributed storage, remote file access, cluster, high-speed network, MYRINET, zero-copy, memory registration, communication control, event notification, application programming interface
Procedia PDF Downloads 219
2559 Integrating Multiple Types of Value in Natural Capital Accounting Systems: Environmental Value Functions
Authors: Pirta Palola, Richard Bailey, Lisa Wedding
Abstract:
Societies and economies worldwide fundamentally depend on natural capital. Alarmingly, natural capital assets are quickly depreciating, posing an existential challenge for humanity. The development of robust natural capital accounting systems is essential for transitioning towards sustainable economic systems and ensuring sound management of capital assets. However, the accurate, equitable and comprehensive estimation of natural capital asset stocks and their accounting values still faces multiple challenges. In particular, the representation of socio-cultural values held by groups or communities has arguably been limited, as to date, the valuation of natural capital assets has primarily been based on monetary valuation methods and assumptions of individual rationality. People relate to and value the natural environment in multiple ways, and no single valuation method can provide a sufficiently comprehensive image of the range of values associated with the environment. Indeed, calls have been made to improve the representation of multiple types of value (instrumental, intrinsic, and relational) and diverse ontological and epistemological perspectives in environmental valuation. This study addresses this need by establishing a novel valuation framework, Environmental Value Functions (EVF), that allows for the integration of multiple types of value in natural capital accounting systems. The EVF framework is based on the estimation and application of value functions, each of which describes the relationship between the value and quantity (or quality) of an ecosystem component of interest. In this framework, values are estimated in terms of change relative to the current level instead of calculating absolute values. Furthermore, EVF was developed to also support non-marginalist conceptualizations of value: it is likely that some environmental values cannot be conceptualized in terms of marginal changes. 
For example, ecological resilience value may, in some cases, be best understood as a binary: it either exists (1) or is lost (0). In such cases, a logistic value function may be used as the discriminator. Uncertainty in the value function parameterization can be considered through, for example, Monte Carlo sampling analysis. The use of EVF is illustrated with two conceptual examples. For the first time, EVF offers a clear framework and concrete methodology for the representation of multiple types of value in natural capital accounting systems, simultaneously enabling 1) the complementary use and integration of multiple valuation methods (monetary and non-monetary); 2) the synthesis of information from diverse knowledge systems; 3) the recognition of value incommensurability; and 4) marginalist and non-marginalist value analysis. Furthermore, with this advancement, the coupling of EVF and ecosystem modeling can offer novel insights into the study of spatial-temporal dynamics in natural capital asset values. For example, value time series can be produced, allowing for the prediction and analysis of volatility, long-term trends, and temporal trade-offs. This approach can provide essential information to help guide the transition to a sustainable economy.
Keywords: economics of biodiversity, environmental valuation, natural capital, value function
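As a rough illustration of the logistic value function and Monte Carlo uncertainty propagation described in the abstract, the following sketch may help. All function names, the threshold parameterization, and the numeric defaults are illustrative assumptions, not taken from the study.

```python
import math
import random

def logistic_value(q, q_crit, steepness=10.0):
    """Near-binary (non-marginalist) value function: value is ~1 when the
    ecosystem quantity q lies above a critical threshold q_crit, ~0 below it.
    The steepness parameter is a hypothetical choice for illustration."""
    return 1.0 / (1.0 + math.exp(-steepness * (q - q_crit)))

def monte_carlo_value(q, q_crit_mean, q_crit_sd, n=10_000, seed=42):
    """Propagate uncertainty in the threshold parameter through the value
    function by Monte Carlo sampling, as the abstract suggests."""
    rng = random.Random(seed)
    draws = (logistic_value(q, rng.gauss(q_crit_mean, q_crit_sd)) for _ in range(n))
    return sum(draws) / n
```

A value time series could then be produced by applying `monte_carlo_value` to each time step of a modeled ecosystem quantity.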
Procedia PDF Downloads 194
2558 EEG Diagnosis Based on Phase Space with Wavelet Transforms for Epilepsy Detection
Authors: Mohmmad A. Obeidat, Amjed Al Fahoum, Ayman M. Mansour
Abstract:
The recognition of abnormal activity in brain functionality is a vital issue. To determine the type of abnormal activity, either a brain image or a brain signal is usually considered. Imaging localizes the defect within the brain and relates the affected area to specific body functionalities. However, some functions may be disturbed without visibly affecting the brain, as in epilepsy; in this case, imaging may not reveal the symptoms of the problem. A cheaper yet efficient approach that can be utilized to detect abnormal activity is the measurement and analysis of electroencephalogram (EEG) signals. The main goal of this work is to develop a new method that facilitates the classification of abnormal and disordered activities within the brain directly through EEG signal processing, making it applicable in an on-line monitoring system.
Keywords: EEG, wavelet, epilepsy, detection
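The two building blocks the title names, a wavelet transform and a phase-space reconstruction, can be sketched minimally as below. This is a generic illustration, not the authors' method: the Haar wavelet and the delay-embedding parameters are assumptions chosen for simplicity.

```python
def haar_dwt(x):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficient lists."""
    s = 2 ** 0.5
    approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return approx, detail

def delay_embed(x, dim=3, tau=1):
    """Time-delay (Takens) embedding: reconstruct a phase-space
    trajectory from a scalar series such as an EEG channel."""
    n = len(x) - (dim - 1) * tau
    return [tuple(x[i + k * tau] for k in range(dim)) for i in range(n)]
```

In an epilepsy-detection pipeline of this kind, features extracted from the embedded trajectory of each wavelet sub-band (e.g. its spread or recurrence properties) would feed a classifier.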
Procedia PDF Downloads 538
2557 Electroforming of 3D Digital Light Processing Printed Sculptures Used as a Low Cost Option for Microcasting
Authors: Cecile Meier, Drago Diaz Aleman, Itahisa Perez Conesa, Jose Luis Saorin Perez, Jorge De La Torre Cantero
Abstract:
In this work, two ways of creating small-sized metal sculptures are proposed: the first by means of microcasting and the second by electroforming, from models printed in 3D using an FDM (Fused Deposition Modeling) printer or a DLP (Digital Light Processing) printer. It is viable to replace the wax in artistic foundry processes with 3D printed objects. In this technique, the digital models are manufactured with a low-cost FDM 3D printer in polylactic acid (PLA). This material is used because its properties make it a viable substitute for wax within artistic casting processes using the lost-wax Ceramic Shell technique. This technique consists of covering a sculpture of wax, or in this case PLA, with several layers of thermoresistant material. This material is heated to melt out the PLA, obtaining an empty mold that is later filled with molten metal. It is verified that PLA models reduce cost and time compared with hand modeling in wax. In addition, parts can be manufactured with 3D printing that are not possible to create with manual techniques. However, sculptures created with this technique have a size limit: when pieces printed in PLA are very small, they lose detail, and the laminar texture hides the shape of the piece. A DLP printer allows obtaining more detailed and smaller pieces than an FDM printer. Such small models are quite difficult and complex to melt out using the lost-wax Ceramic Shell technique. As alternatives, there are microcasting and electroforming, which specialize in creating small metal pieces such as jewelry. Microcasting is a variant of lost-wax casting that consists of introducing the model into a cylinder into which the refractory material is also poured. The molds are heated in an oven to melt out the model and fire the mold.
Finally, the metal is poured into the still-hot cylinders, which rotate in a machine at high speed to distribute the metal properly. Because microcasting requires expensive material and machinery to melt a piece of metal, electroforming is an alternative to this process. Electroforming uses models in different materials; for this study, micro-sculptures printed in 3D are used. These are subjected to an electroforming bath that covers the pieces with a very thin layer of metal. This work investigates the recommended sizes for 3D printing, both with PLA and resin, and first tests are being done to validate the electroforming process for micro-sculptures printed in resin using a DLP printer.
Keywords: sculptures, DLP 3D printer, microcasting, electroforming, fused deposition modeling
Procedia PDF Downloads 135
2556 Optimal ECG Sampling Frequency for Multiscale Entropy-Based HRV
Authors: Manjit Singh
Abstract:
Multiscale entropy (MSE) is an extensively used index that provides a general understanding of the multiple-scale complexity of the physiologic mechanisms underlying heart rate variability (HRV), which operate over a wide range of time scales. Accurate selection of the electrocardiogram (ECG) sampling frequency is an essential concern for clinically significant HRV quantification; a high ECG sampling rate increases memory requirements and processing time, whereas a low sampling rate degrades signal quality and results in clinically misinterpreted HRV. In this work, the impact of ECG sampling frequency on MSE-based HRV has been quantified. MSE measures are found to be sensitive to ECG sampling frequency, and the effect of sampling frequency is a function of the time scale.
Keywords: ECG (electrocardiogram), heart rate variability (HRV), multiscale entropy, sampling frequency
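For readers unfamiliar with MSE, the standard construction (coarse-graining followed by sample entropy at each scale) can be sketched as below. This is the textbook definition, not the authors' specific implementation; the default tolerance `r` is an illustrative assumption.

```python
import math

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`
    (the coarse-graining step of multiscale entropy)."""
    return [sum(x[i * scale:(i + 1) * scale]) / scale
            for i in range(len(x) // scale)]

def sample_entropy(x, m=2, r=0.2):
    """SampEn = -ln(A/B): B counts template pairs of length m that match
    within tolerance r (Chebyshev distance), A counts pairs of length m + 1."""
    n = len(x)

    def count_pairs(length):
        t = n - length + 1  # number of templates
        return sum(
            1
            for i in range(t)
            for j in range(i + 1, t)
            if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r
        )

    a, b = count_pairs(m + 1), count_pairs(m)
    return -math.log(a / b) if a and b else float("inf")
```

The MSE curve is then `sample_entropy(coarse_grain(rr_intervals, s))` over scales `s = 1, 2, …`; since RR intervals are derived from R-peak timings, the ECG sampling frequency sets their temporal resolution and thereby shifts these match counts, which is the sensitivity the abstract quantifies.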
Procedia PDF Downloads 271
2555 Application of Signature Verification Models for Document Recognition
Authors: Boris M. Fedorov, Liudmila P. Goncharenko, Sergey A. Sybachin, Natalia A. Mamedova, Ekaterina V. Makarenkova, Saule Rakhimova
Abstract:
In modern economic conditions, the question of whether a signature on digital documents can be correctly recognized, in order to verify an expression of will or confirm a certain operation, is relevant. Additional processing complexity lies in the dynamic variability of the signature for each individual, as well as in the way the information is processed, because the signature constitutes biometric data. The article discusses the use of artificial intelligence models to improve the quality of signature confirmation in document recognition. Several possible options for using the model are analyzed. The results of the study show that the authenticity of a signature can be determined correctly even on small samples.
Keywords: signature recognition, biometric data, artificial intelligence, neural networks
Procedia PDF Downloads 148
2554 The Prospect of Producing Hydrogen by Electrolysis of Idle Discharges of Water from Reservoirs and Recycling of Waste-Gas Condensates
Authors: Inom Sh. Normatov, Nurmakhmad Shermatov, Rajabali Barotov, Rano Eshankulova
Abstract:
The results of studies on hydrogen production by water electrolysis and by plasma-chemical processing of gas condensate, a waste product of natural gas production, are presented. A thin coating covers the electrode surfaces during water electrolysis; therefore, the water for electrolysis was first subjected to electrosedimentation. The threshold voltage is shifted to a lower value compared with the use of untreated water with stainless steel electrodes. During electrolysis of electrosedimented water using stainless steel electrodes, a significant amount of hydrogen is formed. Pyrolysis of gas condensates in a nitrogen atmosphere was followed by the formation of acetylene (3-7 vol.%), ethylene (4-8 vol.%), and pyrolysis carbon (10-15 wt.%).
Keywords: electrolysis, gas condensate, hydrogen, pyrolysis
Procedia PDF Downloads 310
2553 Purification of Bacillus Lipopeptides for Diverse Applications
Authors: Vivek Rangarajan, Kim G. Clarke
Abstract:
Bacillus lipopeptides are biosurfactants with wide-ranging applications in the medical, food, agricultural, environmental and cosmetic industries. They are produced as a mix of three families, surfactin, iturin and fengycin, each comprising a large number of homologues of varying functionalities. Consequently, the method and degree of purification of the lipopeptide cocktail become particularly important if the functionality of the lipopeptide end-product is to be maximized for the specific application. However, downstream processing of Bacillus lipopeptides is particularly challenging due to the subtle variations observed among the different lipopeptide homologues and isoforms. To date, the most frequently used lipopeptide purification operations have been acid precipitation, solvent extraction, membrane ultrafiltration, adsorption and size exclusion. RP-HPLC (reverse-phase high-pressure liquid chromatography) also has potential for fractionation of the lipopeptide homologues. In the studies presented here, membrane ultrafiltration and RP-HPLC were evaluated for lipopeptide purification to different degrees of purity for maximum functionality. Batch membrane ultrafiltration using 50 kDa polyether sulphone (PES) membranes resulted in lipopeptide recovery of about 68% for surfactin and 82% for fengycin. The recovery was further improved to 95% by using size-conditioned lipopeptide micelles. Conditioning of the lipopeptides with Ca2+ ions resulted in uniformly sized micelles with an average size of 96.4 nm and a polydispersity index of 0.18. The size conditioning also facilitated removal of impurities (molecular weight ranging between 2335-3500 Da) through operation of the system in dia-filtration mode, in a way similar to salt removal from protein by dialysis. The resultant purified lipopeptide was devoid of macromolecular impurities and could ideally suit applications in the cosmetic and food industries.
Enhanced purification using RP-HPLC was carried out in an analytical C18 column, with the aim of fractionating the lipopeptides into their constituent homologues. The column was eluted with a mobile phase comprising acetonitrile and water, with an acetonitrile gradient from 35% to 80% over 70 minutes. The gradient elution program resulted in as many as 41 fractions of individual lipopeptide homologues. The efficacy test of these fractions against fungal phytopathogens showed that the first 21 fractions, identified as homologues of iturins and fengycins, displayed maximum antifungal activities, suitable for biocontrol in the agricultural industry. Thus, in the current study, the downstream processing of lipopeptides leading to tailor-made products for selective applications was demonstrated using two major downstream unit operations.
Keywords: Bacillus lipopeptides, membrane ultrafiltration, purification, RP-HPLC
Procedia PDF Downloads 205
2552 Proximate Composition, Minerals and Sensory Attributes of Cake, Cookies, Cracker, and Chin-Chin Prepared from Cassava-Gari Residue Flour
Authors: Alice Nwanyioma Ohuoba, Rose Erdoo Kukwa, Ukpabi Joseph Ukpabi
Abstract:
Cassava root (Manihot esculenta) is one of the important carbohydrate-containing crops in Nigeria. It is a staple food, mostly in the southern part of the country, and a source of income to farmers and processors. Cassava gari processing methods result in residue fiber (solid waste) from the sieving operation; these residue fibers can be dried and milled into flour and used to prepare cakes, cookies, crackers and chin-chin instead of being thrown away, mostly on farmland or near residential areas. Flour for baking or frying may contain carbohydrates and protein (wheat flour) or be rich in only carbohydrates (cassava flour). Cake, cookies, crackers, and chin-chin were prepared using the residue flour obtained from the residue fiber of cassava variety NR87184 roots processed into gari. This study is aimed at evaluating the proximate composition, mineral content and sensory attributes of these selected snacks. The proximate composition results showed that crackers had the lowest values for moisture (2.3390%) and fat (1.7130%) but the highest for carbohydrates (85.2310%). Among the food products, cakes recorded the highest value for protein (8.0910%). Crude fibre values ranged from 2.5265% (cookies) to 3.4165% (crackers). The mineral content results showed cookies ranking highest in phosphorus (65.8535 ppm), iron (0.1150 mg/L), calcium (1.3800 mg/L) and potassium (7.2850 mg/L) contents, while chin-chin and crackers were lowest in sodium (2.7000 mg/L). The food products were also subjected to sensory evaluation by a thirty-member panel using a 9-point hedonic scale ranging from 1 (dislike extremely) to 9 (like extremely). The mean scores obtained show all the food products rating above 7.00 (above "like moderately"). This study has shown that food products that may be functional or nutraceutical could be prepared from the residue flour.
There is a call for the use of gluten-free flour in baking due to celiac disease and other allergies caused by gluten. Therefore, local carbohydrate food crops such as cassava residue flour, which are gluten-free, could be the solution. In addition, this could aid waste management in cassava gari processing, thereby reducing post-harvest losses of cassava root.
Keywords: allergy, flour, food products, gluten-free
Procedia PDF Downloads 155
2551 Processing and Characterization of (Pb0.55Ca0.45) (Fe0.5Nb0.5)O3 and (Pb0.45Ca0.55) (Fe0.5Nb0.5) O3 Dielectric Ceramics
Authors: Shalini Bahel, Maalti Puri, Sukhleen Bindra Narang
Abstract:
Ceramic samples of (Pb0.55Ca0.45)(Fe0.5Nb0.5)O3 and (Pb0.45Ca0.55)(Fe0.5Nb0.5)O3 were synthesized by the columbite precursor method and characterized for structural and dielectric properties. Both synthesized samples have a perovskite structure with tetragonal symmetry. The variations in relative permittivity and loss tangent were measured as a function of frequency at room temperature. Both the relative permittivity and the loss tangent decreased with increasing frequency. A reasonably high relative permittivity of 63.46, a loss tangent of 0.0067 at 15 MHz, and a temperature coefficient of relative permittivity of -82 ppm/˚C were obtained for (Pb0.45Ca0.55)(Fe0.5Nb0.5)O3.
Keywords: loss tangent, perovskite, relative permittivity, X-ray diffraction
Procedia PDF Downloads 269
2550 Private Coded Computation of Matrix Multiplication
Authors: Malihe Aliasgari, Yousef Nejatbakhsh
Abstract:
The era of Big Data and the immensity of real-life datasets compel computation tasks to be performed in a distributed fashion, where the data is dispersed among many servers that operate in parallel. However, massive parallelization leads to computational bottlenecks due to faulty servers and stragglers. Stragglers refer to a few slow or delay-prone processors that can bottleneck the entire computation, because one has to wait for all the parallel nodes to finish. The problem of straggling processors has been well studied in the context of distributed computing. Recently, it has been pointed out that, for the important case of linear functions, it is possible to improve over repetition strategies in terms of the tradeoff between performance and latency by carrying out linear precoding of the data prior to processing. The key idea is that, by employing suitable linear codes operating over fractions of the original data, a function may be completed as soon as enough processors, a number depending on the minimum distance of the code, have completed their operations. Matrix-matrix multiplication over practically large datasets faces computational and memory-related difficulties, which is why such operations are carried out on distributed computing platforms. In this work, we study the problem of distributed matrix-matrix multiplication W = XY under storage constraints, i.e., when each server is allowed to store a fixed fraction of each of the matrices X and Y; this operation is a fundamental building block of many science and engineering fields such as machine learning, image and signal processing, wireless communication, and optimization. Both non-secure and secure matrix multiplication are studied.
We study the setup in which the identity of the matrix of interest should be kept private from the workers, and obtain the recovery threshold of the colluding model, that is, the number of workers that need to complete their task before the master server can recover the product W. We also consider the problem of secure and private distributed matrix multiplication W = XY in which the matrix X is confidential, while the matrix Y is selected in a private manner from a library of public matrices. We present the best currently known trade-off between communication load and recovery threshold. In other words, we design an achievable PSGPD scheme for any arbitrary privacy level by concatenating a robust PIR scheme for arbitrary colluding workers and private databases with the proposed SGPD code, which provides a smaller computational complexity at the workers.
Keywords: coded distributed computation, private information retrieval, secret sharing, stragglers
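The straggler-tolerance idea behind coded computation can be illustrated with the smallest possible polynomial code: split X into two row blocks, give each worker one evaluation of a degree-1 matrix polynomial, and recover the product from any two workers. This sketch shows only the non-secure, non-private core; the block count, worker count, and function names are illustrative assumptions, not the paper's PSGPD/SGPD construction.

```python
def matvec(M, v):
    """Plain matrix-vector product (lists of lists)."""
    return [sum(a * b for a, b in zip(row, v)) for row in M]

def encode(X1, X2, alpha):
    """Coded block X1 + alpha * X2 stored at one worker: an evaluation of
    the matrix polynomial p(z) = X1 + z * X2, i.e. a k=2 MDS code."""
    return [[a + alpha * b for a, b in zip(r1, r2)] for r1, r2 in zip(X1, X2)]

def decode(r0, a0, r1, a1):
    """Recover X1*y and X2*y from ANY two worker results
    r_i = X1*y + alpha_i * X2*y (degree-1 polynomial interpolation)."""
    x2y = [(u - v) / (a1 - a0) for u, v in zip(r1, r0)]
    x1y = [v - a0 * w for v, w in zip(r0, x2y)]
    return x1y + x2y  # the stacked product X*y
```

Here the recovery threshold is 2: with three workers, any single straggler can be ignored, whereas plain splitting without coding would stall if the worker holding a needed block were slow.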
Procedia PDF Downloads 122
2549 Food Processing Role in Ensuring Food and Health Security
Authors: Muhammad Haseeb
Abstract:
It is crucial to have a balanced approach to food's energy and nutritional content in a world with limited resources. Preservation of the environment is vital, and both the agrifood manufacturing and food service sectors will be required to use fewer resources to produce a wider range of existing foods and to develop imaginative foods that are physiologically appropriate, support good health, have long shelf lives, and are conveniently transportable. Delivering healthy diets that satisfy consumer expectations from robust and sustainable agrifood systems is necessary in a changing world where natural resources are running out. Across the whole food supply chain, an integrated multi-sectoral approach is needed to alleviate global food and nutrition insecurity.
Keywords: health, food, nutrition, supply chain
Procedia PDF Downloads 18
2548 Anthropomorphic Brand Mascot Serve as the Vehicle: To Quickly Remind Customers Who You Are and What You Stand for in Indian Cultural Context
Authors: Preeti Yadav, Dandeswar Bisoyi, Debkumar Chakrabati
Abstract:
For many years, organizations have exercised the creative technique of applying brand mascots, which makes a visual 'ambassador' of a brand. The goal of mascots is not confined to strengthening brand identity and improving customer perception; they also act as a vehicle of anthropomorphic translation towards the consumer, helping us embrace the power of recognition and process the experiences of our daily lives. The study examines the relationship between specific mascot features and brand attitude. It finds that mascot trust is an important mediator of the effect of mascot features on brand attitude. Anthropomorphic characters turn out to be key players in the application of brand mascots in today's marketing.
Keywords: advertising, mascot, branding, recall
Procedia PDF Downloads 334
2547 Technology for Good: Deploying Artificial Intelligence to Analyze Participant Response to Anti-Trafficking Education
Authors: Ray Bryant
Abstract:
3Strands Global Foundation (3SGF), a non-profit with a mission to mobilize communities to combat human trafficking through prevention education and reintegration programs, launched a groundbreaking study that demonstrates the usage and benefits of artificial intelligence in the war against human trafficking. Having gathered more than 30,000 stories from counselors and school staff who have gone through its PROTECT Prevention Education program, 3SGF sought to develop a methodology to measure the effectiveness of the training, which helps educators and school staff identify physical signs and behaviors indicating a student is being victimized. The program further illustrates how to recognize and respond to trauma and teaches the steps to take to report human trafficking, as well as how to connect victims with the proper professionals. 3SGF partnered with Levity, a leader in no-code Artificial Intelligence (AI) automation, to create the research study utilizing natural language processing, a branch of artificial intelligence, to measure the effectiveness of their prevention education program. By applying the logic created for the study, the platform analyzed and categorized each story. If a story, directly from the educator, demonstrated one or more of the desired outcomes (Increased Awareness, Increased Knowledge, or Intended Behavior Change), a label was applied. The system then added a confidence level for each identified label. The study results were generated with a 99% confidence level. Preliminary results from the 30,000 stories gathered make it overwhelmingly clear that a significant majority of the participants now have increased awareness of the issue, demonstrated better knowledge of how to help prevent the crime, and expressed an intention to change how they approach what they do daily.
In addition, it was observed that approximately 30% of the stories involved comments by educators expressing that they wished they had had this knowledge sooner, as they could think of many students they would have been able to help. Objectives of Research: To solve the problem of analyzing and accurately categorizing more than 30,000 data points of participant feedback in order to evaluate the success of a human trafficking prevention program using AI and natural language processing. Methodologies Used: In conjunction with our strategic partner, Levity, we created our own NLP analysis engine specific to our problem. Contributions to Research: The intersection of AI and human rights, and how to utilize technology to combat human trafficking.
Keywords: AI, technology, human trafficking, prevention
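To make the labeling-with-confidence step concrete, here is a deliberately naive keyword-scoring stand-in. The actual study used a trained model on the Levity platform, so everything below, including the keyword lists and the confidence proxy, is an illustrative assumption, not the study's logic.

```python
def label_story(text):
    """Toy stand-in for the NLP labeling step: assign outcome labels
    with a crude keyword score. All keyword lists are hypothetical."""
    outcome_keywords = {
        "Increased Awareness": ["aware", "eye-opening", "realize"],
        "Increased Knowledge": ["learned", "understand", "know how"],
        "Intended Behavior Change": ["will report", "from now on", "going to"],
    }
    text_l = text.lower()
    labels = {}
    for label, words in outcome_keywords.items():
        hits = sum(w in text_l for w in words)
        if hits:
            # Fraction of matched keywords as a rough confidence proxy.
            labels[label] = round(hits / len(words), 2)
    return labels
```

A production pipeline would replace the keyword scores with a classifier's calibrated probabilities and apply a confidence threshold before counting a story toward an outcome.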
Procedia PDF Downloads 59
2546 The Folk Influences in the Melody of Romanian and Serbian Church Music
Authors: Eudjen Cinc
Abstract:
The common Byzantine origins of the church music of Serbs and Romanians are certainly not the only reason for the great similarities between the singing of the two nations, especially in the region of Banat. If they were, the differences between the interpretation of church music in this part of the Orthodox world and that in other regions where Serbs or Romanians live could not be explained. What is it that connects the church singing of the two nations in this peaceful part of Europe to such an extent that it can be considered a single comprehensive corpus, distinct from other 'Serbian' or 'Romanian' regions? This is the main issue dealt with in the text, through examples and comparative processing of the material. The main aim of the paper is the presentation of the new and interesting, while its value lies in its potential to encourage the reader or a future researcher to investigate further.
Keywords: folk influences, melody, melodic models, ethnomusicology
Procedia PDF Downloads 253
2545 Research on Static and Dynamic Behavior of New Combination of Aluminum Honeycomb Panel and Rod Single-Layer Latticed Shell
Authors: Xu Chen, Zhao Caiqi
Abstract:
In addition to the advantages of light weight, corrosion resistance, and ease of processing, aluminum is also applied to long-span spatial structures. However, the elastic modulus of aluminum is lower than that of steel. This paper combines high-performance aluminum honeycomb panels with an aluminum latticed shell, forming a new panel-and-rod composite shell structure. Comparative analysis of the static and dynamic performance leads to the conclusion that the composite shell structure is noticeably superior to the structure before combination.
Keywords: combination of aluminum honeycomb panel, rod latticed shell, dynamic performance, response spectrum analysis, seismic properties
Procedia PDF Downloads 473
2544 Analysis of Operation System Reorganization for Load Balancing of Parcel Sorting
Authors: J. H. Lee
Abstract:
As internet and smartphone use increases, e-commerce is growing constantly, and therefore parcel volume is increasing every year. If a volume larger than the processing capacity of the current facilities is received, parcels go unprocessed and delivery quality becomes low. In this paper, therefore, we compare, from a cost perspective, the case of building a new facility for the increasing parcel volumes with the case of reorganizing the current operating system. We propose an optimal per-parcel discount policy by calculating the construction cost of the new automated facility, the cost of the manual facilities operated until the new automated facility is built, and the discount price.
Keywords: system reorganization, load balancing, parcel sorting, discount policy
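The cost comparison the abstract describes reduces, in its simplest form, to a break-even calculation between per-parcel savings and the construction outlay. The sketch below is a minimal illustration under assumed linear cost structures; all figures and function names are hypothetical placeholders, not the paper's model.

```python
def yearly_cost(volume, unit_cost, fixed_cost=0.0):
    """Total yearly processing cost at a given parcel volume,
    assuming a simple linear cost model."""
    return fixed_cost + volume * unit_cost

def breakeven_volume(construction_cost, auto_unit_cost, manual_unit_cost):
    """Annual parcel volume at which one year of per-parcel savings
    from automation repays the construction cost."""
    saving_per_parcel = manual_unit_cost - auto_unit_cost
    return construction_cost / saving_per_parcel
```

Below the break-even volume, reorganizing the current system (possibly with a per-parcel discount to shift demand) dominates; above it, building the automated facility does.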
Procedia PDF Downloads 268
2543 Functional Neurocognitive Imaging (fNCI): A Diagnostic Tool for Assessing Concussion Neuromarker Abnormalities and Treating Post-Concussion Syndrome in Mild Traumatic Brain Injury Patients
Authors: Parker Murray, Marci Johnson, Tyson S. Burnham, Alina K. Fong, Mark D. Allen, Bruce McIff
Abstract:
Purpose: Pathological dysregulation of neurovascular coupling (NVC) caused by mild traumatic brain injury (mTBI) is the predominant source of chronic post-concussion syndrome (PCS) symptomology. fNCI is able to localize dysregulation in NVC by measuring blood-oxygen-level-dependent (BOLD) signaling during the performance of fMRI-adapted neuropsychological evaluations. With fNCI, 57 brain areas consistently affected by concussion were identified as PCS neural markers and validated on large samples of concussion patients and healthy controls. These neuromarkers provide the basis for a computation of PCS severity referred to as the Severity Index Score (SIS). The SIS has proven valuable in making pre-treatment decisions, monitoring treatment efficacy, and assessing the long-term stability of outcomes. Methods and Materials: After being scanned while performing various cognitive tasks, 476 concussed patients received an SIS score based on the neural dysregulation of the 57 previously identified brain regions. These scans provide an objective measurement of attentional, subcortical, visual-processing, language-processing, and executive-functioning abilities, which were used as biomarkers of post-concussive neural dysregulation. Initial SIS scores were used to develop individualized therapy incorporating cognitive, occupational, and neuromuscular modalities. These scores were also used to establish pre-treatment benchmarks and measure post-treatment improvement. Results: Changes in SIS were calculated as percent change from pre- to post-treatment. Patients showed a mean improvement of 76.5 percent (σ = 23.3), and 75.7 percent of patients showed at least 60 percent improvement. Longitudinal reassessment of 24 of the patients, performed an average of 7.6 months post-treatment, shows that the SIS improvement is maintained and even extended, with an average of 90.6 percent improvement from the original scan.
Conclusions: fNCI provides a reliable measurement of NVC, allowing for identification of concussion pathology. Additionally, fNCI-derived SIS scores direct tailored therapy to restore NVC, subsequently resolving chronic PCS resulting from mTBI. Keywords: concussion, functional magnetic resonance imaging (fMRI), neurovascular coupling (NVC), post-concussion syndrome (PCS)
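The percent-change reporting used for the SIS results can be reproduced with a short computation. The pre/post score pairs below are made-up examples, not patient data, and the severity convention (higher SIS = worse) is assumed from the abstract:

```python
def percent_improvement(pre, post):
    """Percent reduction in SIS from pre- to post-treatment (higher SIS
    indicates more severe dysregulation, so a drop is improvement)."""
    return 100.0 * (pre - post) / pre

def cohort_summary(pairs):
    """Mean percent improvement and share of patients improving >= 60%."""
    changes = [percent_improvement(pre, post) for pre, post in pairs]
    mean = sum(changes) / len(changes)
    share_60 = 100.0 * sum(c >= 60 for c in changes) / len(changes)
    return mean, share_60

# Hypothetical pre/post SIS pairs.
pairs = [(2.0, 0.4), (1.5, 0.6), (1.8, 0.2), (1.0, 0.7)]
mean, share = cohort_summary(pairs)
print(round(mean, 1), round(share, 1))  # 64.7 75.0
```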
Procedia PDF Downloads 355
2542 Tele-Monitoring and Logging of Patient Health Parameters Using Zigbee
Authors: Kirubasankar, Sanjeevkumar, Aravindh Nagappan
Abstract:
This paper addresses a system for monitoring patients using biomedical sensors and displaying the data at a remote location. The main shortcomings of present health-monitoring devices are the lack of remote monitoring and of logging for future evaluation; typical instruments used for health-parameter measurement provide only basic information on health status. This paper identifies a set of design principles to address these challenges. The system includes continuous measurement of health parameters such as heart rate, electrocardiogram, SpO2 level, and body temperature. The accumulated sensor data is relayed to a processing device using a transceiver and viewed through cloud services. Keywords: bio-medical sensors, monitoring, logging, cloud service
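As a rough sketch of the logging side of such a system, one round of sensor readings can be packed into a JSON record before relay to the cloud. The field names and normal ranges below are assumptions for illustration, not the paper's protocol:

```python
import json
import time

# Assumed normal ranges; a real device would use clinician-set thresholds.
NORMAL_RANGES = {"heart_rate": (60, 100), "spo2": (95, 100),
                 "temperature_c": (36.1, 37.2)}

def make_record(patient_id, readings, timestamp=None):
    """Bundle one round of readings with a timestamp and flag values
    outside their normal range, ready to relay as a JSON payload."""
    alerts = [name for name, value in readings.items()
              if name in NORMAL_RANGES
              and not (NORMAL_RANGES[name][0] <= value <= NORMAL_RANGES[name][1])]
    return json.dumps({"patient": patient_id,
                       "ts": int(time.time()) if timestamp is None else timestamp,
                       "readings": readings,
                       "alerts": alerts})

print(make_record("P-017", {"heart_rate": 118, "spo2": 97, "temperature_c": 36.8},
                  timestamp=1700000000))
```

The record is self-describing, so the cloud-side viewer can log it as-is and highlight the flagged parameters.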
Procedia PDF Downloads 520
2541 Apatite Flotation Using Fruits' Oil as Collector and Sorghum as Depressant
Authors: Elenice Maria Schons Silva, Andre Carlos Silva
Abstract:
The growing demand for raw materials has increased mining activity. The mineral industry faces the challenge of processing more complex ores, with very small particles and low grades, together with constant pressure to reduce production costs and environmental impacts. Froth flotation deserves special attention among the concentration methods for mineral processing: besides its great selectivity for different minerals, flotation is a highly efficient method for processing fine particles. The process is based on the surficial physicochemical properties of the minerals, and separation is only possible with the aid of chemicals such as collectors, frothers, modifiers, and depressants. In order to use sustainable and eco-friendly reagents, oils extracted from three different vegetable species (pequi pulp, macauba nut and pulp, and Jatropha curcas) were studied and tested as apatite collectors. Since the oils are not soluble in water, alkaline hydrolysis (saponification) was necessary before their contact with the minerals. The saponification was performed at room temperature. The tests with the new collectors were carried out at pH 9, and Flotigam 5806, a synthetic mix of fatty acids manufactured by Clariant and industrially adopted as an apatite collector, was used as benchmark. In order to find a feasible replacement for cornstarch, the flour and starch of a graniferous variety of sorghum were tested as depressants. Apatite samples were used in the flotation tests. XRF (X-ray fluorescence), XRD (X-ray diffraction), and SEM/EDS (scanning electron microscopy with energy-dispersive spectroscopy) were used to characterize the apatite samples. Zeta-potential measurements were performed in the pH range from 3.5 to 12.5. A commercial cornstarch was used as depressant benchmark. Four depressant dosages and four pH values were tested. A statistical test was used to verify the influence of pH, dosage, and starch type on the mineral recoveries.
For dosages equal to or higher than 7.5 mg/L, pequi oil recovered almost all apatite particles. On the one hand, macauba pulp oil showed excellent results at all dosages, with more than 90% apatite recovery; on the other hand, with the nut oil the highest recovery found was around 84%. Jatropha curcas oil was the second-best oil tested, with more than 90% of the apatite particles recovered at a dosage of 7.5 mg/L. Regarding the depressants, the lowest apatite recovery with sorghum starch was found at a dosage of 1,200 g/t and pH 11, resulting in a recovery of 1.99%. The apatite recovery under the same conditions was 1.40% for sorghum flour (approximately 30% lower). Compared with cornstarch under the same conditions, sorghum flour produced an apatite recovery 91% lower. Keywords: collectors, depressants, flotation, mineral processing
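The relative-recovery figures quoted above can be checked with a short computation. Note that the implied cornstarch recovery is an inference from the stated 91% difference, not a value reported in the abstract:

```python
def percent_lower(reference, value):
    """By how many percent `value` falls below `reference`."""
    return 100.0 * (reference - value) / reference

# Sorghum flour (1.40%) vs sorghum starch (1.99%) at 1,200 g/t and pH 11:
print(round(percent_lower(1.99, 1.40), 1))  # 29.6, i.e. the ~30% quoted

# If the flour recovery is 91% lower than cornstarch's under the same
# conditions, the implied cornstarch recovery (not reported directly) is:
print(round(1.40 / (1 - 0.91), 1))  # 15.6
```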
Procedia PDF Downloads 152
2540 A Low-Area Fully-Reconfigurable Hardware Design of Fast Fourier Transform System for 3GPP-LTE Standard
Authors: Xin-Yu Shih, Yue-Qu Liu, Hong-Ru Chou
Abstract:
This paper presents a low-area, fully-reconfigurable Fast Fourier Transform (FFT) hardware design for the 3GPP-LTE communication standard. It fully supports 32 different FFT sizes, up to 2048 FFT points. Besides, a special processing element is developed to make reconfigurable computing possible, and a first-in first-out (FIFO) scheduling scheme is proposed for hardware-friendly FIFO resource arrangement. In a chip realization synthesized with TSMC 40 nm CMOS technology, the hardware circuit occupies a core area of only 0.2325 mm2 and dissipates 233.5 mW at a maximal operating frequency of 250 MHz. Keywords: reconfigurable, fast Fourier transform (FFT), single-path delay feedback (SDF), 3GPP-LTE
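The SDF hardware itself cannot be reproduced in software, but the computation it reconfigures across sizes is the standard FFT. Below is a minimal radix-2 decimation-in-time FFT, shown only to illustrate size-configurable FFT computation; it covers the power-of-two sizes, while LTE also needs non-power-of-two sizes (such as 1536 for some bandwidths), which require mixed-radix stages:

```python
import cmath

def fft(x):
    """Recursive radix-2 decimation-in-time FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return [complex(x[0])]
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        # Butterfly: combine half-size transforms with a twiddle factor.
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def dft(x):
    """Direct O(n^2) DFT used only as a correctness check."""
    n = len(x)
    return [sum(x[m] * cmath.exp(-2j * cmath.pi * m * k / n) for m in range(n))
            for k in range(n)]

x = [complex(i % 5, 0) for i in range(64)]
assert all(abs(a - b) < 1e-6 for a, b in zip(fft(x), dft(x)))
print("radix-2 FFT matches direct DFT for N = 64")
```

The same recursion serves any supported power-of-two length, which is the software analogue of a size-reconfigurable datapath.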
Procedia PDF Downloads 278
2539 (Re)connecting to the Spirit of the Language: Decolonizing from Eurocentric Indigenous Language Revitalization Methodologies
Authors: Lana Whiskeyjack, Kyle Napier
Abstract:
The spirit of the language embodies the motivation for Indigenous people to connect with the Indigenous language of their lineage. While the concept of the spirit of the language is often woven into discussion by Indigenous language revitalizationists, particularly those who are Indigenous, there are few tangible terms in academic research that conceptually actualize it. Through collaborative work with Indigenous language speakers, elders, and learners, this research sets out to identify the spirit of the language, the catalysts of disconnection from it, and the sources of reconnection to it. This work fundamentally addresses the terms of engagement around collaboration with Indigenous communities, itself inviting a decolonial approach to community outreach and individual relationships. As Indigenous researchers, this means beginning, maintaining, and closing this work in ceremony while being transparent with community members about the work and related publishing throughout the project's duration. Decolonizing this approach also requires maintaining the explicit, ongoing consent of the elders, knowledge keepers, and community members whose ancestral and Indigenous knowledge is handled. The handling of this knowledge is regarded in this work as stewardship, of both the digital materials and the ancestral Indigenous knowledge itself. This work draws on recorded conversations in both nêhiyawêwin and English, resulting from 10 semi-structured interviews with fluent nêhiyawêwin speakers as well as three structured dialogue circles with fluent and emerging speakers. The words were transcribed by a speaker fluent in both nêhiyawêwin and English. The results of these interviews were categorized thematically to conceptually actualize the spirit of the language, the catalysts of disconnection from the spirit of the language, and community-voiced methods of reconnection to the spirit of the language.
Results of these interviews overwhelmingly indicate that the spirit of the language is drawn from the land. Although nêhiyawêwin is the focus of this work, Indigenous languages are by nature inherently related to the land. This is further reaffirmed by the Indigenous language learners and speakers who expressed having ancestries and lineages from multiple Indigenous communities. Several other key elements embody the spirit of the language, including ceremony and spirituality, as well as the semantic worldviews tied to the polysynthetic, verb-oriented morphophonemics most often found in Indigenous languages and, of focus here, in nêhiyawêwin. The catalysts of disconnection from the spirit of the language are those whose histories have severed connections between Indigenous Peoples and the spirit of their languages, or those that have affected relationships with the land, ceremony, and ways of thinking. Results of this research and its literature review identify the three most ubiquitously damaging interdependent factors, the catalysts of disconnection from the spirit of the language, as colonization, capitalism, and Christianity. As voiced by the Indigenous language learners, this work necessitates addressing means of reconnection to the spirit of the language. Interviewees mentioned that the process of reconnection involves a whole relationship with the land, the practice of reciprocal-relational methodologies for language learning, and Indigenous-protected and -governed learning. This work concludes in support of those reconnection methodologies. Keywords: indigenous language acquisition, indigenous language reclamation, indigenous language revitalization, nêhiyawêwin, spirit of the language
Procedia PDF Downloads 143
2538 Journeys of Healing for Military Veterans: A Pilot Study
Authors: Heather Warfield, Brad Genereux
Abstract:
Military personnel encounter a number of challenges when separating from military service, including career uncertainty, relational and family dynamics, trauma resulting from military experiences, reconceptualization of identity, and existential issues related to purpose, meaning making, and the framing of military experiences. Embedded within military culture are well-defined rites of passage and a significant sense of belonging; transition out of the military can therefore result in the loss of such rites of passage and belongingness. A pilgrimage journey, however, can provide the time and space to engage in a new rite of passage, to construct a new pilgrim identity, and to develop deep social relationships that lead to a sense of belonging to a particular pilgrim community as well as to the global community of pilgrims across numerous types of pilgrimage journeys. The aims of the current paper are to demonstrate the rationale for why pilgrimage journeys are particularly significant for military veterans, to provide an overview of an innovative program that facilitates the Camino de Santiago pilgrimage for military veterans, and to discuss the lessons learned from the initial pilot project of the recently established program. Veterans on the Camino (VOC) is an emerging non-governmental organization in the USA, founded by a military veteran after leaving his military career; the primary objective of the organization is to facilitate healing for veterans via the Camino de Santiago pilgrimage journey. As part of the program, participants complete a semi-structured interview at three time points: pre-, during, and post-journey. The interview items are based on ongoing research by the principal investigator and address constructs such as meaning-making, wellbeing, therapeutic benefits, and transformation. In addition, program participants complete the Sources of Meaning and Meaning in Life Questionnaire (SoMe). The pilot program occurred in the spring of 2017.
Five participants were selected after an extensive application process and review by a three-person selection board. The selection criteria included demonstrated compatibility with the program objectives (i.e., prior military experience, availability for a 40-day journey, and awareness of the need for a transformational intervention). The participants were connected as a group through a private Facebook site and interacted with one another for several months prior to the pilgrimage. Additionally, the participants were interviewed before beginning the pilgrimage, at one point during the pilgrimage, and immediately following the conclusion of the pilgrimage journey. The interviews yielded themes related to loss, meaning construction, renewed hope in humanity, and a commitment to future goals. The lessons learned from this pilot project included confirmation of the need for such a program, the need for greater focus on logistical details, and the recognition that the pilgrimage experience needs to continue in some manner once the veterans return home. Keywords: pilgrimage, healing, military veterans, Camino de Santiago
Procedia PDF Downloads 289
2537 Parallel 2-Opt Local Search on GPU
Authors: Wen-Bao Qiao, Jean-Charles Créput
Abstract:
To accelerate the solution of large-scale traveling salesman problems (TSP), a parallel 2-opt local search algorithm with a simple implementation based on the Graphics Processing Unit (GPU) is presented and tested in this paper. The parallel scheme is based on data decomposition: K processors are dynamically assigned along the integral tour to treat the 2-opt local optimization of K edges simultaneously on independent sub-tours, where K can be user-defined or a function of the input size N. We implement this algorithm with a doubly linked list on the GPU; the implementation requires only O(N) memory. We compare this parallel 2-opt local optimization against sequential exhaustive 2-opt search along the integral tour on TSP instances from TSPLIB with more than 10,000 cities. Keywords: parallel 2-opt, double links, large scale TSP, GPU
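For reference, a sequential 2-opt pass of the kind the paper parallelizes can be sketched as follows. The array-based tour and toy coordinates are illustrative; the paper's GPU implementation uses a doubly linked list and evaluates K such exchanges concurrently on independent sub-tours:

```python
import math

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, pts):
    """Repeat 2-opt exchanges (reverse a segment) until no move improves."""
    tour = list(tour)
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n - (i == 0)):
                a, b = pts[tour[i]], pts[tour[i + 1]]
                c, d = pts[tour[j]], pts[tour[(j + 1) % n]]
                # Replace edges (a,b),(c,d) by (a,c),(b,d) if strictly shorter.
                if (math.dist(a, c) + math.dist(b, d)
                        < math.dist(a, b) + math.dist(c, d) - 1e-12):
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour

# A unit square visited in a crossing order; 2-opt uncrosses it.
pts = [(0, 0), (0, 1), (1, 0), (1, 1)]
best = two_opt([0, 1, 2, 3], pts)
print(tour_length(best, pts))  # 4.0, the uncrossed perimeter
```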
Procedia PDF Downloads 624
2536 Teachers' Perceptions of Physical Education and Sports Calendar and Conducted in the Light of the Objective of the Lesson Approach Competencies
Authors: Chelali Mohammed
Abstract:
In the context of the application of the competency-based approach in the Algerian educational system, the physical education and sport lesson must privilege the acquisition of learning approaches, especially the scientific approach, which, starting from problem situations, develops information processing and the application of knowledge and know-how in new situations; in the words of John Dewey, 'learning by doing'. To achieve these goals and make the teaching of physical education and sport more motivating, consistent, and concrete, it is appropriate to adopt a pedagogical approach freed from constraints, open to creativity, and student-centered, in the light of the competency approach adopted in the formal curriculum. This approach is not unusual, but we think it is highly professional in nature and requires competence on the part of the teacher. Keywords: approach competencies, physical, education, teachers
Procedia PDF Downloads 603
2535 Advanced Techniques in Semiconductor Defect Detection: An Overview of Current Technologies and Future Trends
Authors: Zheng Yuxun
Abstract:
This review critically assesses the advancements and prospective developments in defect-detection methodologies within the semiconductor industry, an essential domain that significantly affects the operational efficiency and reliability of electronic components. As semiconductor devices continue to decrease in size and increase in complexity, the precision and efficacy of defect-detection strategies become increasingly critical. Tracing the evolution from traditional manual inspections to advanced technologies employing automated vision systems, artificial intelligence (AI), and machine learning (ML), the paper highlights the significance of precise defect detection in semiconductor manufacturing. It discusses various defect types, such as crystallographic errors, surface anomalies, and chemical impurities, which profoundly influence the functionality and durability of semiconductor devices, underscoring the necessity for their precise identification. The narrative then turns to the technological evolution in defect detection, depicting a shift from rudimentary methods like optical microscopy and basic electronic tests to more sophisticated techniques including electron microscopy, X-ray imaging, and infrared spectroscopy. The incorporation of AI and ML marks a pivotal advancement towards more adaptive, accurate, and expedited defect-detection mechanisms. The paper addresses current challenges, particularly the constraints imposed by the diminutive scale of contemporary semiconductor devices, the elevated costs associated with advanced imaging technologies, and the demand for rapid processing that aligns with mass-production standards. A critical gap is identified between the capabilities of existing technologies and the industry's requirements, especially concerning scalability and processing velocities.
Future research directions are proposed to bridge these gaps, suggesting enhancements in the computational efficiency of AI algorithms, the development of novel materials to improve imaging contrast in defect detection, and the seamless integration of these systems into semiconductor production lines. By offering a synthesis of existing technologies and forecasting upcoming trends, this review aims to foster the dialogue and development of more effective defect-detection methods, thereby facilitating the production of more dependable and robust semiconductor devices. This thorough analysis not only elucidates the current technological landscape but also paves the way for forthcoming innovations in semiconductor defect detection. Keywords: semiconductor defect detection, artificial intelligence in semiconductor manufacturing, machine learning applications, technological evolution in defect analysis
Procedia PDF Downloads 51
2534 Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features
Authors: Vesna Kirandziska, Nevena Ackovska, Ana Madevska Bogdanova
Abstract:
Emotion recognition is a challenging problem that remains open from the perspectives of both intelligent systems and psychology. In this paper, both voice features and facial features are used to build an emotion recognition system. Support Vector Machine classifiers are built using raw data from video recordings. The results obtained for emotion recognition are given, and the validity and expressiveness of different emotions are discussed. A comparison is made between classifiers built from facial data only, from voice data only, and from the combination of both. The need for a better combination of the information from facial expressions and voice data is argued. Keywords: emotion recognition, facial recognition, signal processing, machine learning
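The feature-fusion comparison can be illustrated with a toy sketch: a classifier trained on voice features only versus one trained on the concatenation of voice and face features. A nearest-centroid rule stands in for the paper's SVM so the example stays dependency-free, and all feature vectors are synthetic:

```python
def nearest_centroid(train, labels, query):
    """Classify `query` by the closest per-class mean feature vector."""
    classes = sorted(set(labels))
    centroids = {}
    for c in classes:
        rows = [x for x, y in zip(train, labels) if y == c]
        centroids[c] = [sum(col) / len(rows) for col in zip(*rows)]
    def dist2(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(classes, key=lambda c: dist2(query, centroids[c]))

# Synthetic 2-D "voice" and 2-D "face" feature vectors for two emotions.
voice = [[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.2, 0.9]]
face = [[0.8, 0.0], [1.0, 0.1], [0.0, 0.9], [0.1, 1.0]]
labels = ["happy", "happy", "sad", "sad"]
fused = [v + f for v, f in zip(voice, face)]  # plain concatenation

q_voice, q_face = [0.95, 0.15], [0.9, 0.05]
print(nearest_centroid(voice, labels, q_voice))           # voice only
print(nearest_centroid(fused, labels, q_voice + q_face))  # voice + face
```

Concatenation is the simplest fusion scheme; the paper's argument is precisely that better combinations of the two modalities are needed.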
Procedia PDF Downloads 316
2533 Thoughts on the Informatization Technology Innovation of Cores and Samples in China
Authors: Honggang Qu, Rongmei Liu, Bin Wang, Yong Xu, Zhenji Gao
Abstract:
There is a big gap in the ability and level of informatization technology innovation for cores and samples compared with developed countries. Against the current background of promoting technology innovation, how to strengthen the informatization technology innovation of cores and samples at the National Cores and Samples Archives, a national innovation research center, is an important research topic. The paper summarizes the development status of cores-and-samples informatization technology, identifies the gaps and deficiencies, and proposes innovation research directions and content, including data extraction, recognition, processing, integration, and application, so as to provide reference and guidance for the future innovation research of the archives and to better support geological technology innovation in China. Keywords: cores and samples, informatization technology, innovation, suggestion
Procedia PDF Downloads 126
2532 Adaptation of Hough Transform Algorithm for Text Document Skew Angle Detection
Authors: Kayode A. Olaniyi, Olabanji F. Omotoye, Adeola A. Ogunleye
Abstract:
Skew detection and correction form an important part of digital document analysis, because uncompensated skew can deteriorate document features and complicate further document-image-processing steps. Efficient text-document analysis and digitization can rarely be achieved when a document is skewed even at a small angle. Once documents have been digitized through the scanning system and binarization has been achieved, document skew correction is required before further image analysis. Research efforts have been put into this area, with algorithms developed to eliminate document skew. Skew-angle-correction algorithms can be compared on performance criteria; the most important are the accuracy of skew angle detection, the range of detectable skew angles, the speed of processing the image, the computational complexity and, consequently, the memory space used. The standard Hough Transform has successfully been applied to text-document skew-angle estimation. However, its accuracy depends largely on how fine the angle step size is, and a finer step consumes more time and memory space, especially where the number of pixels is considerably large. Whenever the Hough transform is used, there is always a trade-off between accuracy and speed, so a more efficient solution is needed that optimizes space as well as time. In this paper, an improved Hough transform (HT) technique that optimizes space as well as time to robustly detect document skew is presented. The modified algorithm resolves the contradiction between memory space, running time, and accuracy. Our algorithm starts with a first angle estimate, accurate to zero decimal places, obtained with the standard Hough Transform, achieving minimal running time and space at the cost of accuracy.
Then, to increase accuracy, if the angle estimated by the basic Hough algorithm is x degrees, we run the basic algorithm again over a range around x degrees with an accuracy of one decimal place. The same process is iterated until the desired level of accuracy is achieved. The procedure of our skew estimation and correction algorithm for text images is implemented using MATLAB. The memory-space estimates and processing times are also tabulated, with the skew angle assumed to lie between 0° and 45°. The simulation results, demonstrated in MATLAB, show the high performance of our algorithm, with less computational time and memory space used in detecting document skew for a variety of documents with different levels of complexity. Keywords: hough-transform, skew-detection, skew-angle, skew-correction, text-document
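The coarse-to-fine idea can be sketched without image I/O: vote in a Hough accumulator over whole-degree candidate angles, then re-vote on a 0.1-degree grid around the winner. The synthetic "document" below is a set of black-pixel coordinates along skewed text lines, and the single refinement step stands in for the paper's full iteration:

```python
import math
from collections import Counter

def hough_peak(points, angles, rho_step=1.0):
    """Return the candidate angle whose most-voted rho bin is largest.
    rho = y*cos(theta) - x*sin(theta) is constant along a line of slope
    tan(theta), so the true skew angle concentrates votes in one bin."""
    best_angle, best_votes = angles[0], -1
    for theta in angles:
        c, s = math.cos(math.radians(theta)), math.sin(math.radians(theta))
        bins = Counter(round((y * c - x * s) / rho_step) for x, y in points)
        votes = max(bins.values())
        if votes > best_votes:
            best_angle, best_votes = theta, votes
    return best_angle

def estimate_skew(points):
    coarse = [float(d) for d in range(-45, 46)]   # whole-degree pass
    x = hough_peak(points, coarse)
    fine = [x - 1 + k / 10 for k in range(21)]    # refine over x +/- 1 degree
    return hough_peak(points, fine)

# Synthetic pixels on three text lines skewed by 3.4 degrees.
slope = math.tan(math.radians(3.4))
pts = [(i, round(i * slope) + 20 * line)
       for line in range(3) for i in range(100)]
print(round(estimate_skew(pts), 1))  # 3.4
```

Each refinement pass scans only 21 candidates instead of the hundreds a uniformly fine grid would need, which is the space/time saving the paper's iteration exploits.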
Procedia PDF Downloads 158