Search results for: data mining applications and discovery
26884 Feedback Preference and Practice of English Majors in Pronunciation Instruction
Authors: Claerchille Jhulia Robin
Abstract:
This paper discusses the perspective of ESL learners towards pronunciation instruction. It sought to determine how these learners view the type of feedback their speech teacher gives and its impact on their own classroom practice of providing feedback. This study utilized a quantitative-qualitative approach to the problem. The respondents were Education students majoring in English. A survey questionnaire and an interview guide were used for data gathering. The data from the survey were tabulated using frequency counts, and the data from the interviews were transcribed and analyzed. Results showed that ESL learners favor immediate corrective feedback and do not find any issue in being corrected in front of their peers. They also practice the same corrective technique in their own classrooms. Keywords: ESL, feedback, learner perspective, pronunciation instruction
Procedia PDF Downloads 237
26883 The Use of Beneficial Microorganisms from Diverse Environments for the Management of Aflatoxin in Maize
Authors: Mathias Twizeyimana, Urmila Adhikari, Julius P. Sserumaga, David Ingham
Abstract:
The management of aflatoxins (naturally occurring toxins produced by certain fungi, most importantly Aspergillus flavus and A. parasiticus) relies mostly on the use of best cultural practices and, in some cases, on biological control, in which atoxigenic strains inhibit the toxigenic strains through competition, resulting in considerable toxin reduction. At AgBiome, we have built a core collection of over 100,000 fully sequenced microbes from diverse environments and employ both the microbes and their sequences in the discovery of new biological products for disease and pest control. The most common approach to finding beneficial microbes consists of isolating microorganisms from samples collected from diverse environments, selecting antagonistic strains through empirical screening, studying modes of action, and stabilizing selected microbial isolates through formulation. A total of 608 diverse bacterial strains were screened using a high-throughput assay (48-well assay) to identify strains that inhibit toxigenic A. flavus growth on maize kernels. Strains active in the 48-well assay had their pathogen-inhibiting activity confirmed using a flask assay and were concurrently tested for their ability to reduce the aflatoxin content in maize grains. Strains with the best growth inhibition and aflatoxin reduction were tested in greenhouse and field trials. In the field trials, three bacterial strains, AFS000009 (Pseudomonas chlororaphis), AFS032321 (Bacillus subtilis), and AFS024683 (Bacillus velezensis), had aflatoxin concentrations (ppb) significantly lower than those of the inoculated control. The identification of biological products with high efficacy in inhibiting pathogen growth and thereby reducing aflatoxin content will provide a valuable alternative to the control strategies currently used in aflatoxin contamination management. Keywords: aflatoxin, microorganism bacteria, biocontrol, beneficial microbes
Procedia PDF Downloads 188
26882 Regulatory and Economic Challenges of AI Integration in Cyber Insurance
Authors: Shreyas Kumar, Mili Shangari
Abstract:
Integrating artificial intelligence (AI) in the cyber insurance sector represents a significant advancement, offering the potential to revolutionize risk assessment, fraud detection, and claims processing. However, this integration introduces a range of regulatory and economic challenges that must be addressed to ensure responsible and effective deployment of AI technologies. This paper examines the multifaceted regulatory landscape governing AI in cyber insurance and explores the economic implications of compliance, innovation, and market dynamics. AI's capabilities in processing vast amounts of data and identifying patterns make it an invaluable tool for insurers in managing cyber risks. Yet, the application of AI in this domain is subject to stringent regulatory scrutiny aimed at safeguarding data privacy, ensuring algorithmic transparency, and preventing biases. Regulatory bodies, such as the European Union with its General Data Protection Regulation (GDPR), mandate strict compliance requirements that can significantly impact the deployment of AI systems. These regulations necessitate robust data protection measures, ethical AI practices, and clear accountability frameworks, all of which entail substantial compliance costs for insurers. The economic implications of these regulatory requirements are profound. Insurers must invest heavily in upgrading their IT infrastructure, implementing robust data governance frameworks, and training personnel to handle AI systems ethically and effectively. These investments, while essential for regulatory compliance, can strain financial resources, particularly for smaller insurers, potentially leading to market consolidation. Furthermore, the cost of regulatory compliance can translate into higher premiums for policyholders, affecting the overall affordability and accessibility of cyber insurance. Despite these challenges, the potential economic benefits of AI integration in cyber insurance are significant. 
AI-enhanced risk assessment models can provide more accurate pricing, reduce the incidence of fraudulent claims, and expedite claims processing, leading to overall cost savings and increased efficiency. These efficiencies can improve the competitiveness of insurers and drive innovation in product offerings. However, balancing these benefits with regulatory compliance is crucial to avoid legal penalties and reputational damage. The paper also explores the potential risks associated with AI integration, such as algorithmic biases that could lead to unfair discrimination in policy underwriting and claims adjudication. Regulatory frameworks need to evolve to address these issues, promoting fairness and transparency in AI applications. Policymakers play a critical role in creating a balanced regulatory environment that fosters innovation while protecting consumer rights and ensuring market stability. In conclusion, the integration of AI in cyber insurance presents both regulatory and economic challenges that require a coordinated approach involving regulators, insurers, and other stakeholders. By navigating these challenges effectively, the industry can harness the transformative potential of AI, driving advancements in risk management and enhancing the resilience of the cyber insurance market. This paper provides insights and recommendations for policymakers and industry leaders to achieve a balanced and sustainable integration of AI technologies in cyber insurance. Keywords: artificial intelligence (AI), cyber insurance, regulatory compliance, economic impact, risk assessment, fraud detection, cyber liability insurance, risk management, ransomware
Procedia PDF Downloads 37
26881 Using Deep Learning Real-Time Object Detection Convolution Neural Networks for Fast Fruit Recognition in the Tree
Authors: K. Bresilla, L. Manfrini, B. Morandi, A. Boini, G. Perulli, L. C. Grappadelli
Abstract:
Image and video processing for fruit in the tree using hard-coded feature extraction algorithms has shown high accuracy in recent years. While accurate, these approaches are computationally intensive even with high-end hardware and too slow for real-time systems. This paper details the use of deep convolution neural networks (CNNs), specifically the YOLO (You Only Look Once) algorithm with 24+2 convolution layers. Using deep-learning techniques eliminated the need to hard-code specific features for specific fruit shapes, colors, and/or other attributes. The CNN was trained on more than 5000 images of apple and pear fruits on a 960-core GPU (Graphical Processing Unit). The test set showed an accuracy of 90%. The trained model was then transferred to an embedded device (Raspberry Pi gen. 3) with a camera for greater portability. Based on the correlation between the number of fruits detected in one frame and the real number of fruits on the tree, a model was created to accommodate this error rate. The processing and detection speed of the whole platform was higher than 40 frames per second. This speed is fast enough for any grasping/harvesting robotic arm or other real-time applications. Keywords: artificial intelligence, computer vision, deep learning, fruit recognition, harvesting robot, precision agriculture
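The per-tree correction model described in this abstract is not specified in detail; a minimal sketch of one plausible form, a linear calibration from per-frame detections to true fruit counts fitted on entirely hypothetical numbers, could look like this:

```python
import numpy as np

# Hypothetical calibration data (not the paper's): fruits the CNN detected
# in a single frame vs. the true hand-counted number on each tree.
detected = np.array([34.0, 42.0, 28.0, 50.0, 36.0, 44.0])
actual   = np.array([52.0, 64.0, 43.0, 76.0, 55.0, 67.0])

# Fit actual ≈ a * detected + b: the slope absorbs fruits hidden by
# leaves or missed by the detector, the intercept any constant bias.
a, b = np.polyfit(detected, actual, 1)

def estimate_tree_count(n_detected: float) -> float:
    """Correct a per-frame detection count to a whole-tree estimate."""
    return a * n_detected + b

print(round(estimate_tree_count(40)))  # → 61
```

In practice such a calibration would be refitted per orchard and variety, since occlusion rates differ.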
Procedia PDF Downloads 425
26880 Automatic Tagging and Accuracy in Assamese Text Data
Authors: Chayanika Hazarika Bordoloi
Abstract:
This paper is an attempt to work on Assamese, a highly inflectional language. It is one of the national languages of India, and very little has been achieved on it in terms of computational research. Building a language processing tool for a natural language is not straightforward, as standards and language representation change at various levels. This paper presents the inflectional suffixes of Assamese verbs and shows how statistical tools, along with linguistic features, can improve tagging accuracy. A conditional random fields (CRF) tool was used to automatically tag and train the text data; accuracy improved after linguistic features were fed into the training data. Because Assamese is highly inflectional, standardizing its morphology is challenging. Inflectional suffixes are used as a feature of the text data. In order to analyze the inflections of Assamese word forms, a list comprising all possible suffixes that the various categories can take was prepared. Assamese words can be classified into inflected classes (noun, pronoun, adjective, and verb) and un-inflected classes (adverb and particle). The corpus used for this morphological analysis contains a large number of tokens. It is a mixed corpus, and it has given satisfactory accuracy. The accuracy rate of the tagger gradually improved with the modified training data. Keywords: CRF, morphology, tagging, tagset
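The suffix features described above can be illustrated with a toy CRF feature-extraction function; the suffix list and the Latin-script example below are placeholders, not the actual Assamese suffix inventory or feature set used in the paper:

```python
# Placeholder suffix list; the paper's Assamese inventory is far larger
# and linguistically curated, covering all inflected word categories.
KNOWN_VERB_SUFFIXES = ["ing", "ed", "ise"]

def word_features(sentence, i):
    """Build a feature dict for the i-th token, of the kind fed to a
    CRF tagger: surface form, character suffixes, and left context."""
    word = sentence[i]
    feats = {
        "word.lower": word.lower(),
        "suffix2": word[-2:],
        "suffix3": word[-3:],
        "is_first": i == 0,
        "prev_word": "" if i == 0 else sentence[i - 1].lower(),
    }
    for suf in KNOWN_VERB_SUFFIXES:
        if word.endswith(suf):
            feats["known_verb_suffix"] = suf   # the linguistic feature
            break
    return feats

print(word_features(["he", "walked"], 1)["suffix3"])  # → ked
```

A CRF trainer then learns weights over these features jointly with the tag-transition structure.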
Procedia PDF Downloads 198
26879 A Human Activity Recognition System Based on Sensory Data Related to Object Usage
Authors: M. Abdullah Al-Wadud
Abstract:
Sensor-based activity recognition systems usually account for which sensors have been activated to perform an activity. The system then combines the conditional probabilities of those sensors to represent different activities and makes its decision on that basis. However, information about the sensors that were not activated may also be of great help in deciding which activity was performed. This paper proposes an approach in which sensory data related to both the usage and non-usage of objects are utilized to classify activities. Experimental results show the promising performance of the proposed method. Keywords: Naïve Bayesian-based classification, activity recognition, sensor data, object-usage model
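The idea that silent sensors carry evidence too can be sketched with a tiny Bernoulli naive Bayes classifier; the sensors, activities, and probabilities below are hypothetical illustrations, not taken from the paper:

```python
import math

# Hypothetical P(sensor activated | activity) and uniform priors.
p_on = {
    "make_tea":    {"kettle": 0.9, "fridge": 0.3, "stove": 0.1},
    "cook_dinner": {"kettle": 0.2, "fridge": 0.8, "stove": 0.9},
}
prior = {"make_tea": 0.5, "cook_dinner": 0.5}

def classify(observed_on):
    """Bernoulli naive Bayes: a sensor that did NOT fire contributes a
    factor (1 - p) instead of being ignored, as the abstract proposes."""
    scores = {}
    for activity, probs in p_on.items():
        log_score = math.log(prior[activity])
        for sensor, p in probs.items():
            log_score += math.log(p if sensor in observed_on else 1 - p)
        scores[activity] = log_score
    return max(scores, key=scores.get)

print(classify({"kettle"}))  # kettle fired, fridge and stove silent → make_tea
```

Here the silent stove actively votes against "cook_dinner" rather than simply being absent from the evidence.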
Procedia PDF Downloads 326
26878 The Characteristics of Porcine Immune Synapse via Flow Cytometry and Transmission Electron Microscope
Authors: Ann Ying-An Chen, Yi-Lun Tsai, Hso-Chi Chaung
Abstract:
An understanding of pathogens and the immune system has played a most important role in agricultural research for the development of vaccinations. The immunological synapse, a cell-to-cell interaction, plays a crucial role in triggering the body's immune system, such as in the activation between antigen-presenting cells (APCs) and different subsets of T-cells. If these interactions are regulated appropriately, the host has the ability to defend itself against a wide spectrum of infectious pathogens. The aim of this study is to establish and characterize a porcine immune synapse system by co-culturing T cells with APCs. In this study, blood samples were collected from specific-pathogen-free piglets, and peripheral blood mononuclear cells (PBMC) were separated using Ficoll-Paque. The PBMC were then stained with CD4 (FITC) and CD25 (PE) antibodies. Different subsets of T cells sorted by a fluorescence-activated cell sorting flow cytometer were co-cultured for 24 hrs with alveolar macrophages, and the profiles of cytokine secretion and the mRNA transcription levels of Toll-like receptors were then examined. Results showed that the three stages of the immune synapse were clearly visible and identifiable under both transmission and scanning electron microscopes (TEM and SEM). Significant differences in Toll-like receptor expression within the co-cultured cell system were observed. The TLR7 mRNA expression in CD4+CD25- cells was lower than in CD4+CD25+ and CD4-CD25+ cells. Interestingly, the IL-10 production levels in CD4+CD25- cells (7.732 pg/mL) were significantly higher than those of CD4+CD25+ (2.636 pg/mL) and CD4-CD25+ (2.48 pg/mL) cells.
These findings demonstrate that a clear understanding of the porcine immune synapse system can contribute greatly to further investigations of the mechanism of T-cell activation, which can aid the discovery of potential adjuvant candidates or effective antigen epitopes in the development of highly efficacious vaccinations. Keywords: antigen-presenting cells, immune synapse, pig, T subsets, toll-like receptor
Procedia PDF Downloads 131
26877 Application of Post-Stack and Pre-Stack Seismic Inversion for Prediction of Hydrocarbon Reservoirs in a Persian Gulf Gas Field
Authors: Nastaran Moosavi, Mohammad Mokhtari
Abstract:
Seismic inversion is a technique that has been in use for years; its main goal is to estimate and model the physical characteristics of rocks and fluids. Generally, it is a combination of seismic and well-log data. Seismic inversion can be carried out through different methods; we have conducted and compared post-stack and pre-stack seismic inversion methods on real data from one of the fields in the Persian Gulf. Pre-stack seismic inversion can transform seismic data into rock physics parameters such as P-impedance, S-impedance, and density, while post-stack seismic inversion can estimate only P-impedance. These parameters can then be used in reservoir identification. Based on the results of inverting the seismic data, a gas reservoir was detected in one of the hydrocarbon fields in the south of Iran (Persian Gulf). By comparing post-stack and pre-stack seismic inversion, it can be concluded that pre-stack seismic inversion provides more reliable and detailed information for the identification and prediction of hydrocarbon reservoirs. Keywords: density, p-impedance, s-impedance, post-stack seismic inversion, pre-stack seismic inversion
Procedia PDF Downloads 329
26876 A Data-Driven Monitoring Technique Using Combined Anomaly Detectors
Authors: Fouzi Harrou, Ying Sun, Sofiane Khadraoui
Abstract:
Anomaly detection based on Principal Component Analysis (PCA) has been studied intensively and widely applied to multivariate processes with highly cross-correlated process variables. Monitoring metrics such as Hotelling's T2 and the Q statistic are usually used in PCA-based monitoring to elucidate pattern variations in the principal and residual subspaces, respectively. However, these metrics are ill-suited to detect small faults. In this paper, Exponentially Weighted Moving Average (EWMA) charts based on the T2 and Q statistics, T2-EWMA and Q-EWMA, were developed for detecting faults in the process mean. The performance of the proposed methods was compared with that of the conventional PCA-based fault detection method using synthetic data. The results clearly show the benefit and effectiveness of the proposed methods over the conventional PCA method, especially for detecting small faults in highly correlated multivariate data. Keywords: data-driven method, process control, anomaly detection, dimensionality reduction
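A minimal sketch of the Q-EWMA idea on synthetic two-variable data is shown below; the control limit uses the standard EWMA asymptotic-variance factor as a rough stand-in for the paper's limits, and the fault magnitude is invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fault-free training data: two highly cross-correlated variables.
x1 = rng.normal(0, 1, 500)
X = np.column_stack([x1, x1 + rng.normal(0, 0.1, 500)])
X -= X.mean(axis=0)

# PCA: keep one principal component, monitor the residual subspace via Q.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:1].T                                  # loadings, shape (2, 1)

def q_statistic(x):
    residual = x - P @ (P.T @ x)              # part unexplained by the PC
    return float(residual @ residual)

q_train = np.array([q_statistic(x) for x in X])
lam = 0.2                                     # EWMA smoothing parameter
limit = q_train.mean() + 3 * q_train.std() * np.sqrt(lam / (2 - lam))

# Persistent small fault: a constant 0.5 offset on the second variable.
# The EWMA of Q drifts upward and crosses the tightened limit.
z, alarm = q_train.mean(), False
for _ in range(50):
    v = rng.normal(0, 1)
    z = lam * q_statistic(np.array([v, v + 0.5])) + (1 - lam) * z
    if z > limit:
        alarm = True
        break
print("fault detected:", alarm)
```

The smoothing is what gives sensitivity to small sustained shifts: the EWMA limit is tighter than a raw Q limit by the factor sqrt(lam / (2 - lam)).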
Procedia PDF Downloads 301
26875 Heterologous Expression of Clostridium thermocellum Proteins and Assembly of Cellulosomes 'in vitro' for Biotechnology Applications
Authors: Jessica Pinheiro Silva, Brenda Rabello De Camargo, Daniel Gusmao De Morais, Eliane Ferreira Noronha
Abstract:
The utilization of lignocellulosic biomass as a source of polysaccharides for industrial applications requires an arsenal of enzymes with different modes of action able to hydrolyze its complex and recalcitrant structure. Clostridium thermocellum is a gram-positive, thermophilic bacterium producing lignocellulose-hydrolyzing enzymes in the form of a multi-enzyme complex termed the cellulosome. This complex has several hydrolytic enzymes attached to a large, enzymatically inactive protein known as the cellulosome-integrating protein (CipA), which serves as a scaffolding protein for the complex. This attachment occurs through specific interactions between the cohesin modules of CipA and the dockerin modules of the enzymes. The present work aims to construct cellulosomes in vitro with the structural protein CipA, a xylanase called Xyn10D, and a cellulase called CelJ from C. thermocellum. A mini-scaffoldin containing two cohesin modules was constructed from modules derived from CipA; it was cloned and expressed in Escherichia coli. The other two genes were cloned under the control of the alcohol oxidase 1 promoter (AOX1) in the vector pPIC9 and integrated into the genome of the methylotrophic yeast Pichia pastoris GS115. Purification of each protein is being carried out, and the enzymatic activity of the assembled cellulosome will be evaluated. The cellulosome built in vitro, composed of mini-CipA, CelJ, and Xyn10D, could be very interesting for application in industrial processes involving the degradation of plant biomass. Keywords: cellulosome, CipA, Clostridium thermocellum, cohesin, dockerin, yeast
Procedia PDF Downloads 237
26874 Application of GPRS in Water Quality Monitoring System
Authors: V. Ayishwarya Bharathi, S. M. Hasker, J. Indhu, M. Mohamed Azarudeen, G. Gowthami, R. Vinoth Rajan, N. Vijayarangan
Abstract:
Identification of water quality conditions in a river system based on limited observations is an essential task for meeting the goals of environmental management. The traditional method of water quality testing is to collect samples manually and then send them to a laboratory for analysis. However, this has been unable to meet the demands of water quality monitoring today, so an automatic measurement and reporting system for water quality has been developed. In this project, water quality parameters collected by a multi-parameter water quality probe are transmitted to a data processing and monitoring center through the GPRS wireless mobile communication network. The multi-parameter sensor is placed directly above the water level. The monitoring center consists of a GPRS module and a micro-controller that monitor the data, and the collected data can be monitored at any instant of time. At the pollution control board, the water quality sensor data are monitored on a computer using Visual Basic software. The system collects, transmits, and processes water quality parameters automatically, so production efficiency and economic benefit are greatly improved. GPRS technology performs well even in complex environments where water quality would otherwise go unmonitored, and it is specifically applicable to automatic data transmission and monitoring from collection points and field water-analysis equipment. Keywords: multiparameter sensor, GPRS, visual basic software, RS232
Procedia PDF Downloads 418
26873 Ecosystem Model for Environmental Applications
Authors: Cristina Schreiner, Romeo Ciobanu, Marius Pislaru
Abstract:
This paper aims to build a system based on fuzzy models that can be implemented in the assessment of ecological systems, to determine appropriate methods of action for reducing adverse effects on the environment and, implicitly, on the population. The proposed model provides a new perspective for environmental assessment, and it can be used as a practical instrument for decision-making. Keywords: ecosystem model, environmental security, fuzzy logic, sustainability of habitable regions
Procedia PDF Downloads 427
26872 Test Suite Optimization Using an Effective Meta-Heuristic BAT Algorithm
Authors: Anuradha Chug, Sunali Gandhi
Abstract:
Regression testing is a very expensive and time-consuming process carried out to ensure the validity of modified software. Because insufficient resources are available to re-execute all the test cases in a time-constrained environment, efforts are ongoing to generate test data automatically without human effort. Many search-based techniques have been proposed to generate efficient, effective, and optimized test data so that the overall cost of software testing can be minimized. The generated test data should be able to uncover all potential lapses that exist in the software or product. Inspired by the natural food-searching behavior of bats, the current study employed a meta-heuristic, search-based bat algorithm for optimizing the test data on the basis of certain parameters without compromising their effectiveness. Mathematical functions are also applied to effectively filter out redundant test data. As many as 50 Java programs were used to check the effectiveness of the proposed test data generation, and it was found that an 86% saving in testing effort can be achieved using the bat algorithm while covering 100% of the software code during testing. The bat algorithm was found to be more efficient in terms of simplicity and flexibility when the results were compared with other nature-inspired algorithms such as the Firefly Algorithm (FA), Hill Climbing (HC), and Ant Colony Optimization (ACO). The output of this study will be useful to testers, as they can achieve 100% path coverage for testing with a minimum number of test cases. Keywords: regression testing, test case selection, test case prioritization, genetic algorithm, bat algorithm
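A minimal bat-algorithm loop is sketched below on a stand-in fitness function; in the paper's setting the fitness would instead score candidate test data by the coverage it achieves, and the pulse-rate/loudness schedule here is deliberately simplified:

```python
import numpy as np

rng = np.random.default_rng(42)

def sphere(x):
    # Stand-in fitness to minimize; for test-suite optimization this
    # would score a candidate test-data vector, e.g. by branch coverage.
    return float(np.sum(x ** 2))

def bat_algorithm(f, dim=2, n_bats=20, iters=200,
                  f_min=0.0, f_max=2.0, alpha=0.99):
    """Minimal bat-algorithm sketch: frequency-tuned velocities plus a
    loudness-scaled random walk around the current best solution."""
    pos = rng.uniform(-5, 5, (n_bats, dim))
    vel = np.zeros((n_bats, dim))
    fit = np.array([f(p) for p in pos])
    best = pos[fit.argmin()].copy()
    loudness = 1.0
    for _ in range(iters):
        freq = f_min + (f_max - f_min) * rng.random(n_bats)
        vel += (pos - best) * freq[:, None]      # frequency-tuned flight
        cand = pos + vel
        # roughly half the bats do a local walk around the global best
        walk = best + 0.05 * loudness * rng.normal(size=(n_bats, dim))
        use_walk = rng.random(n_bats) > 0.5      # pulse-rate stand-in
        cand = np.where(use_walk[:, None], walk, cand)
        cand_fit = np.array([f(c) for c in cand])
        improved = cand_fit < fit                # greedy acceptance
        pos[improved] = cand[improved]
        fit[improved] = cand_fit[improved]
        best = pos[fit.argmin()].copy()
        loudness *= alpha                        # quieter as search converges
    return best, float(fit.min())

best, best_fit = bat_algorithm(sphere)
print(best_fit)
```

Greedy acceptance makes the best fitness non-increasing, so the loop can only improve on the random initial population.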
Procedia PDF Downloads 384
26871 Aligning the Sustainability Policy Areas for Decarbonisation and Value Addition at an Organisational Level
Authors: Bishal Baniya
Abstract:
This paper proposes sustainability-related policy areas for decarbonisation and value addition at an organisational level. General and public sector organisations around the world are usually significant consumers of resources and producers of waste, powered by their massive procurement capacity. However, these organisations also possess huge potential to cut resource use and emissions, as many of them control the supply chains of goods and services. They can therefore be trend setters and can easily lead other major economic sectors, such as manufacturing, construction and mining, and transportation, in the pursuit of a paradigm shift towards sustainability. Whilst environmental and social awareness has improved in recent years and organisations have identified policy areas to improve their environmental performance, value addition to the core business of the organisation has not been understood and interpreted correctly. This paper therefore investigates ways to align sustainability policy measures so that they create a better value proposition relative to a benchmark by accounting for both eco- and social efficiency. Preliminary analysis shows that co-benefits beyond resource and cost savings foster the business case for organisations, and this can be achieved by better aligning policy measures and engaging stakeholders. Keywords: policy measures, environmental performance, value proposition, organisational level
Procedia PDF Downloads 154
26870 Modified InVEST for Whatsapp Messages Forensic Triage and Search through Visualization
Authors: Agria Rhamdhan
Abstract:
WhatsApp, as the most popular mobile messaging app, has been used as evidence in many criminal cases. As the use of mobile messaging generates large amounts of data, forensic investigation faces the challenge of large data volumes. The hardest part of finding important evidence is that current practice relies on tools and techniques that require manual analysis to check all messages; analyzing large sets of mobile messaging data this way takes a lot of time and effort. Our work offers a methodology based on forensic triage to reduce large data sets to manageable ones, making detailed reviews easier, and then shows the results through interactive visualization of important terms, entities, and relationships via intelligent ranking using Term Frequency-Inverse Document Frequency (TF-IDF) and the Latent Dirichlet Allocation (LDA) model. By implementing this methodology, investigators can improve investigation processing time and the accuracy of results. Keywords: forensics, triage, visualization, WhatsApp
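The TF-IDF ranking step can be sketched in a few lines; the toy messages below are invented for illustration, and a real pipeline would tokenize properly and layer the LDA topic model on top:

```python
import math
from collections import Counter

# Toy message set standing in for an extracted WhatsApp chat log.
messages = [
    "meet me at the old warehouse tonight",
    "ok see you at the warehouse",
    "see you soon ok",
    "the cash is in the warehouse",
]

docs = [m.lower().split() for m in messages]
N = len(docs)
df = Counter(term for d in docs for term in set(d))   # document frequency

def message_score(doc):
    """Sum of TF-IDF weights: messages dominated by distinctive terms
    (rare across the corpus) float to the top of the triage queue."""
    return sum((doc.count(t) / len(doc)) * math.log(N / df[t])
               for t in set(doc))

ranked = sorted(range(N), key=lambda i: message_score(docs[i]), reverse=True)
print(messages[ranked[0]])  # → meet me at the old warehouse tonight
```

Common words like "the" or "warehouse" get low IDF here, so the ranking surfaces the message with the most distinctive vocabulary rather than the longest one.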
Procedia PDF Downloads 173
26869 Low Cost Webcam Camera and GNSS Integration for Updating Home Data Using AI Principles
Authors: Mohkammad Nur Cahyadi, Hepi Hapsari Handayani, Agus Budi Raharjo, Ronny Mardianto, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan
Abstract:
PDAM (the local water company) determines customer charges by considering the customer's building or house. Charge determination significantly affects PDAM income and customer costs because the PDAM applies a subsidy policy for customers classified as small households. Periodic updates are needed so that pricing stays in line with the target, and a thorough customer survey in Surabaya is needed to update customer building data. However, surveys have so far been carried out by deploying officers to visit each PDAM customer one by one, which requires a lot of effort and cost. For this reason, this research offers a technology called mobile mapping, a mapping method that is more efficient in terms of time and cost. The use of this tool is also quite simple: the device is installed in a car so that it can record the surrounding buildings while the car is moving. Mobile mapping technology generally uses lidar sensors together with GNSS, but this technology is expensive. To overcome this problem, this research develops a low-cost mobile mapping technology using webcam camera sensors added to GNSS and IMU sensors. The cameras used have specifications of 3 MP with a resolution of 720p and a diagonal field of view of 78⁰. The principle of this invention is to integrate four webcam camera sensors with GNSS and IMU to acquire photo data tagged with location (latitude, longitude) and orientation (roll, pitch, yaw). The device is also equipped with a tripod and a vacuum suction mount to attach it to the car's roof so it does not fall off while driving. The output data from this technology are analyzed with artificial intelligence to reduce similar data (cosine similarity) and then classify building types. Data reduction is used to eliminate similar images while retaining those that display the complete house, so they can be processed for the later classification of buildings.
The AI method used is transfer learning utilizing a trained model named VGG-16. From the analysis of similarity data, it was found that the data reduction reached 50%. Georeferencing was then done using the Google Maps API to obtain address information corresponding to the coordinates in the data. After that, a geographic join was performed to link the survey data with the customer data already held by PDAM Surya Sembada Surabaya. Keywords: mobile mapping, GNSS, IMU, similarity, classification
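The cosine-similarity reduction step can be sketched as a greedy near-duplicate filter; the feature vectors below are tiny hypothetical stand-ins for the VGG-16 embeddings the paper extracts per photo:

```python
import numpy as np

def deduplicate(features, threshold=0.95):
    """Greedy near-duplicate removal: keep an item only if its cosine
    similarity to every already-kept item is below the threshold."""
    normed = features / np.linalg.norm(features, axis=1, keepdims=True)
    kept = []
    for i, v in enumerate(normed):
        if all(float(v @ normed[j]) < threshold for j in kept):
            kept.append(i)
    return kept

# Hypothetical 3-d embeddings; real VGG-16 features are high-dimensional.
feats = np.array([
    [1.00, 0.00, 0.0],
    [0.99, 0.05, 0.0],   # near-duplicate of the first photo
    [0.00, 1.00, 0.0],
    [0.00, 0.98, 0.1],   # near-duplicate of the third photo
])
print(deduplicate(feats))  # → [0, 2]  (half the items dropped)
```

On consecutive frames from a moving car, most neighbors are near-duplicates, which is consistent with the roughly 50% reduction the paper reports.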
Procedia PDF Downloads 85
26868 An Investigation into the Views of Distant Science Education Students Regarding Teaching Laboratory Work Online
Authors: Abraham Motlhabane
Abstract:
This research analysed the written views of science education students regarding the teaching of laboratory work in the online mode. The research adopted a qualitative methodology aimed at investigating a small and distinct group, normally regarded as a single-site study. Qualitative research was used to describe and analyze the phenomena from the students' perspective; that is, the research began with world-view assumptions and used theoretical lenses to inquire into the meanings individual students ascribe to the research problem. The research was conducted with three groups of students studying for the Postgraduate Certificate in Education, the Bachelor of Education, and the Honours Bachelor of Education, respectively. In each of the study programmes, the science education module is compulsory. Five science education students from each study programme were purposively selected, so 15 students participated in the research. To analyse the data, the data were first printed and hard copies were used in the analysis. The data were read several times, and key concepts and ideas were highlighted. Themes and patterns were identified to describe the data, and coding was used as a process of organising and sorting the data. The findings of the study are very diverse: some students are in favour of online laboratory work, whereas other students argue that science can only be learnt through hands-on experimentation. Keywords: online learning, laboratory work, views, perceptions
Procedia PDF Downloads 152
26867 Brand Positioning in Iran: A Case Study of the Professional Soccer League
Authors: Homeira Asadi Kavan, Seyed Nasrollah Sajjadi, Mehrzade Hamidi, Hossein Rajabi, Mahdi Bigdely
Abstract:
The positioning strategies of a sports brand can create a unique impression in the minds of fans, sponsors, and other stakeholders. In order to influence potential customers' perceptions in an effective and positive way, a brand's positioning strategy must be unique, credible, and relevant. Many sports clubs in Iran have been struggling to implement and achieve brand positioning, due to reasons such as lack of experience, scarcity of experts in sports branding, and lack of related research in this field. This study provides a comprehensive theoretical framework and action plan for sport managers and marketers to design and implement effective brand positioning, enabling them to distinguish themselves from competing brands and sports clubs. The study instrument was interviews with sports marketing and brand experts who have been working in this industry for a minimum of 20 years. Qualitative data analysis was performed using the Atlas.ti text mining software, version 7, and open, axial, and selective coding were employed to uncover and systematically analyze important and complex phenomena and elements. The findings show 199 effective elements in positioning strategies in the Iran Professional Soccer League. These elements are categorized into 23 concepts and sub-categories as follows: structural prerequisites, strategic management prerequisites, commercial prerequisites, major external prerequisites, brand personality, club symbols, emotional aspects, event aspects, fans' strategies, marketing information strategies, marketing management strategies, empowerment strategies, executive management strategies, league context, fans' background, market context, the club's organizational context, support context, major contexts, political-legal elements, economic factors, social factors, and technological factors.
Eventually, the study model was developed around six main dimensions: causal prerequisites, the axial phenomenon (brand position), strategies, context factors, interfering factors, and consequences. Based on the findings, practical recommendations and strategies are suggested that can help club managers and marketers develop and improve their respective sports clubs, brand positioning, and activities. Keywords: brand positioning, soccer club, sport marketing, Iran professional soccer league, brand strategy
Procedia PDF Downloads 139
26866 Potential of Hyperion (EO-1) Hyperspectral Remote Sensing for Detection and Mapping Mine-Iron Oxide Pollution
Authors: Abderrazak Bannari
Abstract:
Acid Mine Drainage (AMD) from mine wastes and the contamination of soils and water with metals are considered a major environmental problem in mining areas. AMD is produced by interactions of water, air, and sulphidic mine wastes. This environmental problem results from a series of chemical and biochemical oxidation reactions of sulfide minerals, e.g., pyrite and pyrrhotite. These reactions lead to acidity as well as the dissolution of toxic and heavy metals (Fe, Mn, Cu, etc.) from tailings, waste rock piles, and open pits. Soil and aquatic ecosystems can be contaminated and, consequently, human health and wildlife affected. Furthermore, secondary minerals, typically formed during weathering of mine waste storage areas when the concentration of soluble constituents exceeds the corresponding solubility product, are also important. The most common secondary mineral compositions are hydrous iron oxides (goethite, etc.) and hydrated iron sulfates (jarosite, etc.). The objectives of this study focus on the detection and mapping of mine iron oxide pollution (MIOP) in the soil using Hyperion EO-1 (Earth Observing-1) hyperspectral data and the constrained linear spectral mixture analysis (CLSMA) algorithm. The abandoned Kettara mine, located approximately 35 km northwest of Marrakech city (Morocco), was chosen as the study area. For 44 years (from 1938 to 1981), this mine was exploited for iron oxide and iron sulphide minerals. Previous studies have shown that the soils surrounding Kettara are contaminated by heavy metals (Fe, Cu, etc.) as well as by secondary minerals. To achieve our objectives, several soil samples representing different MIOP classes were collected and located using accurate GPS (≤ ±30 cm). Then, endmember spectra were acquired over each sample using an Analytical Spectral Device (ASD) covering the spectral domain from 350 to 2500 nm.
Considering each soil sample separately, the average of forty spectra was resampled and convolved using Gaussian response profiles to match the bandwidths and band centers of the Hyperion sensor. Moreover, the MIOP content in each sample was estimated by geochemical analyses in the laboratory, and a ground truth map was generated using simple kriging in a GIS environment for validation purposes. The Hyperion data were corrected for the spatial shift between the VNIR and SWIR detectors, striping, dead columns, noise, and gain and offset errors. They were then atmospherically corrected using the MODTRAN 4.2 radiative transfer code and transformed to surface reflectance, corrected for sensor smile (a 1-3 nm shift in the VNIR and SWIR), and post-processed to remove residual errors. Finally, geometric distortions and relief displacement effects were corrected using a digital elevation model. The MIOP fraction map was extracted using CLSMA considering the entire spectral range (427-2355 nm) and validated against the ground truth map generated by kriging. The obtained results show the promising potential of the proposed methodology for the detection and mapping of mine iron oxide pollution in the soil.
Keywords: hyperion eo-1, hyperspectral, mine iron oxide pollution, environmental impact, unmixing
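The constrained unmixing step can be illustrated with a minimal sketch. For two endmembers, the fully constrained least-squares solution (fractions non-negative and summing to one) has a closed form. The four-band spectra below are invented for illustration; real Hyperion pixels span 427-2355 nm over ~196 usable bands.

```python
# Illustrative sketch of constrained linear spectral mixture analysis (CLSMA)
# for two endmembers (e.g., an iron-oxide-polluted vs. a clean soil spectrum).
# Fractions are constrained to be non-negative and to sum to one.

def clsma_two_endmembers(pixel, em1, em2):
    """Return the fraction f of endmember em1 (and 1-f of em2) that best
    reconstructs the pixel spectrum in a least-squares sense, with
    0 <= f <= 1 enforced (the 'constrained' part of CLSMA)."""
    num = sum((p - b) * (a - b) for p, a, b in zip(pixel, em1, em2))
    den = sum((a - b) ** 2 for a, b in zip(em1, em2))
    return max(0.0, min(1.0, num / den))  # clip to the feasible simplex

# Hypothetical 4-band reflectance spectra
miop_soil = [0.30, 0.45, 0.50, 0.40]   # polluted endmember (assumed values)
clean_soil = [0.10, 0.15, 0.20, 0.25]  # clean endmember (assumed values)

# A pixel that is a 70/30 mixture should unmix to a fraction of ~0.7
mixed = [0.7 * a + 0.3 * b for a, b in zip(miop_soil, clean_soil)]
frac = clsma_two_endmembers(mixed, miop_soil, clean_soil)
```

With more endmembers, the same idea becomes a constrained quadratic program per pixel, which is what CLSMA solves over the full spectral range.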
Procedia PDF Downloads 23126865 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
Modern experiments in high energy physics impose great demands on the reliability, the efficiency, and the data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to be able to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments; it provides a network-transparent inter-process communication layer. Using the high-performance, modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes in the iFDAQ during the 2016 run. From the software point of view, it may be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the debugging possibilities, online monitoring of communication among processes via a DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in detail. Moreover, an efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided.
Keywords: data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP
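The addressing model behind such a library can be sketched in a few lines. The in-process dispatcher below is only a conceptual stand-in for what DIALOG (or DIM) provides over TCP/IP: processes register under a service name, and senders address messages by name without knowing where the receiver runs. The class and service names are invented for illustration.

```python
# A deliberately tiny, in-process stand-in for service-addressed message
# passing between DAQ processes. The real DIALOG library is built on Qt's
# networking (TCP/IP); this sketch mirrors only the addressing model.

class Dispatcher:
    def __init__(self):
        self.services = {}   # service name -> handler callable

    def register(self, name, handler):
        """A process announces itself under a service name."""
        self.services[name] = handler

    def send(self, name, message):
        """Deliver a message to a named service; the location of the
        receiver is transparent from the sender's point of view."""
        if name not in self.services:
            raise KeyError(f"unknown service: {name}")
        return self.services[name](message)

# Two 'processes' of an event-builder chain, wired through the dispatcher
dispatcher = Dispatcher()
received = []
dispatcher.register("event-builder", lambda msg: received.append(msg) or "ack")

reply = dispatcher.send("event-builder", {"event_id": 42, "size_kb": 128})
```

In the real system, `send` would serialize the message and hand it to a network socket, and the registry would live in a control server rather than a local dictionary.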
Procedia PDF Downloads 32326864 Approximate-Based Estimation of Single Event Upset Effect on Static Random-Access Memory-Based Field-Programmable Gate Arrays
Authors: Mahsa Mousavi, Hamid Reza Pourshaghaghi, Mohammad Tahghighi, Henk Corporaal
Abstract:
Recently, Static Random-Access Memory-based (SRAM-based) Field-Programmable Gate Arrays (FPGAs) have been widely used in aeronautics and space systems, where high dependability is demanded and considered a mandatory requirement. Since the design's circuit is stored in configuration memory, SRAM-based FPGAs are very sensitive to Single Event Upsets (SEUs). In addition, the adverse effects of SEUs on electronics used in space are much greater than on Earth. Thus, developing fault tolerance techniques plays a crucial role in the use of SRAM-based FPGAs in space. However, fault tolerance techniques introduce additional penalties in system parameters, e.g., area, power, performance, and design time. In this paper, an accurate estimation of configuration memory vulnerability to SEUs is proposed for approximation-tolerant applications. This vulnerability estimation is highly required for compromising between the overhead introduced by fault tolerance techniques and system robustness. We study applications in which the exact final output value is not necessarily always a concern, meaning that some of the SEU-induced changes in output values are negligible. We therefore define and propose an Approximate-based Configuration Memory Vulnerability Factor (ACMVF) estimation to avoid overestimating configuration memory vulnerability to SEUs. We assess the vulnerability of configuration memory by injecting SEUs into configuration memory bits and comparing the output values of a given circuit in the presence of SEUs with the expected correct output. Unlike conventional vulnerability factor calculation methods, which count any deviation from the expected value as a failure, our proposed method considers a threshold margin that depends on the use-case application. Given this threshold margin, a failure occurs only when the difference between the erroneous output value and the expected output value is greater than the margin.
The ACMVF is subsequently calculated as the ratio of failures to the total number of SEU injections. A test bench for emulating SEUs and calculating the ACMVF was implemented on a Zynq-7000 FPGA platform. This system makes use of the Soft Error Mitigation (SEM) IP core to inject SEUs into configuration memory bits of the target design implemented in the Zynq-7000 FPGA. Experimental results for a 32-bit adder show that, when 1% to 10% deviation from the correct output is tolerated, the number of counted failures is reduced by 41% to 59% compared with the number counted by conventional vulnerability factor calculation. This means that the estimation accuracy of the configuration memory vulnerability to SEUs is improved by up to 58% in the case that 10% deviation is acceptable in the output results. Note that less than 10% deviation in an addition result is reasonably tolerable for many applications in the approximate computing domain, such as Convolutional Neural Networks (CNNs).
Keywords: fault tolerance, FPGA, single event upset, approximate computing
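The counting rule at the heart of the ACMVF can be sketched without any FPGA hardware. The 8-bit adder and single-bit-flip fault model below are illustrative stand-ins for the paper's Zynq-7000/SEM test bench; only the thresholded failure count is the point.

```python
# Minimal sketch of the ACMVF idea: an injected upset counts as a failure only
# if the erroneous output deviates from the golden output by more than a
# user-set margin. A margin of 0 reproduces the conventional vulnerability
# factor, which counts every nonzero deviation as a failure.

def adder(a, b):
    return (a + b) & 0xFF  # golden 8-bit adder (stand-in for the 32-bit one)

def inject_bit_flip(value, bit):
    return value ^ (1 << bit)  # emulate an SEU as a single bit flip

def acmvf(a, b, margin_percent):
    golden = adder(a, b)
    failures = injections = 0
    for bit in range(8):  # flip each bit of operand a in turn
        faulty = adder(inject_bit_flip(a, bit), b)
        deviation = abs(faulty - golden) / max(golden, 1) * 100
        injections += 1
        if deviation > margin_percent:
            failures += 1
    return failures / injections

strict = acmvf(100, 27, 0)    # conventional: any deviation is a failure
relaxed = acmvf(100, 27, 10)  # approximate: up to 10% deviation tolerated
```

Here the 10% margin forgives upsets in the low-order bits, so `relaxed` is smaller than `strict`, which is exactly the reduction in counted failures the paper reports.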
Procedia PDF Downloads 20026863 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest
Authors: Bharatendra Rai
Abstract:
Predictive data analysis and modeling involving machine learning techniques become challenging in the presence of too many explanatory variables or features. Too many features in machine learning are known not only to slow algorithms down but also to decrease model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios; their impact on model accuracy is captured using the coefficient of determination (r-square) and root mean square error (RMSE).
Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error
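The core idea of Boruta, shadow features, can be shown in a compact sketch. Boruta proper repeats many rounds of random-forest importance against randomly permuted copies of every feature; the single pass below uses absolute Pearson correlation as a stand-in importance measure and a cyclic shift as a deterministic stand-in for the random permutation, with toy data invented for illustration.

```python
# Hedged sketch of Boruta's shadow-feature test: each real feature competes
# against a scrambled ('shadow') copy of itself, and only features whose
# importance beats the best shadow importance are confirmed.

def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def boruta_single_pass(features, target):
    shadow_scores = []
    for xs in features.values():
        shadow = xs[1:] + xs[:1]  # cyclic shift destroys the real association
        shadow_scores.append(abs(correlation(shadow, target)))
    threshold = max(shadow_scores)  # best importance achievable by chance
    return [name for name, xs in features.items()
            if abs(correlation(xs, target)) > threshold]

# Toy 'housing' data: price tracks area exactly; zip_digit is pure noise
area      = [50, 60, 80, 100, 120, 150, 170, 200]
zip_digit = [3, 7, 1, 9, 4, 8, 2, 5]
price     = [110, 130, 170, 210, 250, 310, 350, 410]  # price = 2*area + 10

confirmed = boruta_single_pass({"area": area, "zip_digit": zip_digit}, price)
```

In the real algorithm, the permutation is random, the importance comes from a random forest, and the comparison is repeated with a statistical test before a feature is confirmed or rejected.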
Procedia PDF Downloads 32726862 Analysing the Permanent Deformation of Cohesive Subsoil Subject to Long Term Cyclic Train Loading
Authors: Natalie M. Wride, Xueyu Geng
Abstract:
Subgrade soils of railway infrastructure are subjected to a significant number of load applications over their design life. The use of slab track on existing and proposed future rail links demands a reduced maintenance and repair regime for the embankment subgrade, because access to the subgrade soils for remediation of cyclic deformation is restricted. It is, therefore, important to study the deformation behaviour of soft cohesive subsoils induced by long term cyclic loading. In this study, a series of oedometer tests and cyclic triaxial tests (10,000 cycles) have been undertaken to investigate the undrained deformation behaviour of soft kaolin. X-ray Computed Tomography (CT) scanning of the samples was performed to determine the change in porosity and soil structure density of the sample microstructure as a result of the laboratory testing regime. Combined with the examination of excess pore pressures and strains obtained from the cyclic triaxial tests, the results are compared with an existing analytical solution for long term settlement under repeated low amplitude loading. Modifications to the analytical solution are presented based on the laboratory analysis and show good agreement with further test data.
Keywords: creep, cyclic loading, deformation, long term settlement, train loading
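As a hedged illustration of the kind of analytical solution involved, permanent strain under repeated loading is often described by an empirical power law of the number of cycles, eps_p(N) = a * N**b. The parameters below are invented for the sketch and are not fitted to the paper's kaolin data.

```python
# Power-law model for permanent (plastic) strain accumulation under cyclic
# loading: strain grows quickly over the first cycles, then flattens out on a
# log scale -- the shape long-term (10,000-cycle) triaxial tests capture.
# Parameters a and b are illustrative assumptions, not measured values.

def permanent_strain(n_cycles, a=0.5, b=0.12):
    """Cumulative permanent axial strain (%) after n_cycles repetitions."""
    return a * n_cycles ** b

def settlement(layer_thickness_m, n_cycles, a=0.5, b=0.12):
    """Surface settlement (m) of a uniform soil layer of given thickness,
    assuming the permanent strain applies over the whole layer depth."""
    return layer_thickness_m * permanent_strain(n_cycles, a, b) / 100.0

eps_100 = permanent_strain(100)      # strain after 100 cycles
eps_10k = permanent_strain(10_000)   # strain after 10,000 cycles
s_10k = settlement(2.0, 10_000)      # settlement of a 2 m layer
```

Fitting a and b to cyclic triaxial data, and modifying the functional form where the fit breaks down, is the kind of adjustment the abstract's "modifications to the analytical solution" refers to.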
Procedia PDF Downloads 30126861 Expansion of Cord Blood Cells Using a Mix of Neurotrophic Factors
Authors: Francisco Dos Santos, Diogo Fonseca-Pereira, Sílvia Arroz-Madeira, Henrique Veiga-Fernandes
Abstract:
Haematopoiesis is a developmental process that generates all blood cell lineages in health and disease. It relies on quiescent haematopoietic stem cells (HSCs) that are able to differentiate, self-renew, and expand upon physiological demand. HSCs are of great interest in regenerative medicine, including for haematological malignancies, immunodeficiencies, and metabolic disorders. However, the limited yield from existing HSC sources drives the global need for reliable techniques to expand harvested HSCs at high quality and in sufficient quantities. With the extensive use of cord blood progenitors for clinical applications, there is demand for a safe and efficient expansion protocol that can overcome the limitations of cord blood as a source of HSCs. StemCell2MAX™ has developed a technology that enhances the survival, proliferation, and transplantation efficiency of HSCs, leading the way to more widespread use of HSCs for research and clinical purposes. StemCell2MAX™ MIX is a solution that improves HSC expansion up to 20-fold compared with the state of the art, while preserving stemness. In a recent study by a leading cord blood bank, StemCell2MAX MIX was shown to support a selective 100-fold expansion of CD34+ Hematopoietic Stem and Progenitor Cells (compared to a 10-fold expansion of Total Nucleated Cells) while maintaining their multipotent differentiation potential as assessed by CFU assays. The technology developed by StemCell2MAX™ opens new horizons for the use of expanded hematopoietic progenitors for both research purposes (including quality and functional assays in cord blood banks) and clinical applications.
Keywords: cord blood, expansion, hematopoietic stem cell, transplantation
Procedia PDF Downloads 27226860 Natural Fibers Design Attributes
Authors: Brayan S. Pabón, R. Ricardo Moreno, Edith Gonzalez
Abstract:
Among the wide set of Colombian natural fibers is the banana stem leaf, known as Calceta de Plátano, a material present in several regions of the country. The fiber is extracted from the pseudostem of the banana plant (Musa paradisiaca) as part of regular maintenance. Colombia produced 2.8 million tons in 2007 and 2008, corresponding to 8.2% of international production, a figure that is growing. This material was selected for study because it is not being used by farmers, who perceive it as waste from the banana harvest and a pest-propagation agent inside the planting. In addition, the Calceta has no industrial applications in Colombia, since there is not enough concrete knowledge about the properties of the material and its possible applications. Given this situation, industrial design is used as a link between the properties of the material and the need to transform it into industrial products for the market. The project therefore identifies potential design attributes that the banana stem leaf can have for product development. The methodology was divided into two main chapters. Methodology for material recognition: data collection, drawing on craftsmen's experience and the literature; knowledge in practice, with controlled experiments and validation tests; and creation of design attributes and a material profile according to the knowledge developed. Design methodology: selection of application fields, exploring the use of the attributes and their relation to product functions; evaluation of the possible fields and selection of the optimum application; and the design process, with sketching, ideation, and product development. Different protocols were elaborated to qualitatively determine some material properties of the Calceta, and whether they could be designated as design attributes.
Once the validation protocols were defined, performed, and analyzed, 25 design attributes were identified and classified into four attribute categories (environmental, functional, aesthetic, and technical), forming the material profile. Then, 15 application fields were defined based on the relation between product functions and the use of the Calceta's attributes. Those fields were evaluated to measure how extensively the functional attributes are used. After this evaluation, a final field was defined.
Keywords: banana stem leaf, Calceta de Plátano, design attributes, natural fibers, product design
Procedia PDF Downloads 26226859 Image-Based (RBG) Technique for Estimating Phosphorus Levels of Different Crops
Authors: M. M. Ali, Ahmed Al- Ani, Derek Eamus, Daniel K. Y. Tan
Abstract:
In this glasshouse study, we developed a new image-based, non-destructive technique for detecting the leaf P status of different crops, namely cotton, tomato, and lettuce. Plants were grown on nutrient media containing different P concentrations, i.e., 0%, 50%, and 100% of the recommended P concentration (P0 = no P; P1 = 2.5 mL 10 L-1 of P; and P2 = 5 mL 10 L-1 of P as NaH2PO4). After 10 weeks of growth, plants were harvested and data on leaf P contents were collected using the standard destructive laboratory method; at the same time, leaf images were collected with a handheld crop image sensor. We calculated the leaf area, leaf perimeter, and RGB (red, green, and blue) values of these images. These data were then used in linear discriminant analysis (LDA) to estimate leaf P contents, which successfully classified the plants on the basis of leaf P content. The data indicate that P deficiency in crop plants can be predicted using image and morphological data. Our proposed non-destructive imaging method is precise in estimating the P requirements of different crop species.
Keywords: image-based techniques, leaf area, leaf P contents, linear discriminant analysis
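The classification step can be sketched with a small two-class Fisher LDA on image-derived features. The green-ratio and leaf-area numbers below are made up for illustration (the real study used three P levels and more features); the sketch only shows how LDA separates classes from such measurements.

```python
# Two-class Fisher LDA on 2-D image features: w = Sw^-1 (mA - mB), with a
# midpoint bias. Feature vectors are (green_ratio, leaf_area_cm2); all
# numbers are hypothetical stand-ins for real image measurements.

def mean(vs):
    return sum(vs) / len(vs)

def lda_train(class_a, class_b):
    ma = [mean([x[0] for x in class_a]), mean([x[1] for x in class_a])]
    mb = [mean([x[0] for x in class_b]), mean([x[1] for x in class_b])]
    s = [[0.0, 0.0], [0.0, 0.0]]          # pooled within-class scatter Sw
    for cls, m in ((class_a, ma), (class_b, mb)):
        for x in cls:
            d = [x[0] - m[0], x[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det], [-s[1][0] / det, s[0][0] / det]]
    dm = [ma[0] - mb[0], ma[1] - mb[1]]
    w = [inv[0][0] * dm[0] + inv[0][1] * dm[1],
         inv[1][0] * dm[0] + inv[1][1] * dm[1]]
    bias = -(w[0] * (ma[0] + mb[0]) + w[1] * (ma[1] + mb[1])) / 2
    return w, bias

def lda_predict(w, bias, x):
    return "sufficient" if w[0] * x[0] + w[1] * x[1] + bias > 0 else "deficient"

p_sufficient = [(0.45, 30.0), (0.47, 32.0), (0.44, 29.0), (0.46, 31.0)]
p_deficient  = [(0.36, 18.0), (0.34, 17.0), (0.37, 19.0), (0.35, 16.0)]

w, bias = lda_train(p_sufficient, p_deficient)
label = lda_predict(w, bias, (0.46, 30.5))  # an unseen leaf measurement
```

With three P levels, the same machinery generalizes to multi-class LDA, projecting onto the discriminant directions that best separate the class means.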
Procedia PDF Downloads 38626858 The Use of Polar Substituent Groups for Promoting Azo Disperse Dye Solubility and Reactivity for More Economic and Environmentally Benign Applications: A Computational Study
Authors: Olaide O. Wahab, Lukman O. Olasunkanmi, Krishna K. Govender, Penny P. Govender
Abstract:
The economic and environmental challenges associated with azo disperse dye applications are due to poor aqueous solubility and a low degradation tendency that stems from low chemical reactivity. The poor aqueous solubility of this group of dyes necessitates the use of dispersing agents, which increase operational costs and release toxic chemical components into the environment, while their low degradation tendency is due to the high stability of the azo functional group (-N=N-) in their chemical structures. To address these problems, this study theoretically investigated the effects of some polar substituents on the aqueous solubility and reactivity properties of disperse yellow (DY) 119 dye, with a view to developing new azo disperse dyes with improved solubility in water and a higher degradation tendency in the environment, using the DMol³ computational code. All calculations were carried out using the Becke-Perdew version of the Vosko-Wilk-Nusair (VWN-BP) level of density functional theory in conjunction with a double numerical basis set containing polarization functions (DNP). The aqueous solubility determination was achieved with the conductor-like screening model for realistic solvation (COSMO-RS) in conjunction with a known empirical solubility model, while the reactivity was predicted using frontier molecular orbital calculations. Most of the new derivatives studied showed evidence of higher aqueous solubility and degradation tendency compared to the parent dye. We conclude that these derivatives are promising alternative dyes for more economic and environmentally benign dyeing practice and therefore recommend them for synthesis.
Keywords: aqueous solubility, azo disperse dye, degradation, disperse yellow 119, DMol³, reactivity
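The frontier-orbital reactivity screening can be illustrated with the standard conceptual-DFT descriptors computed from HOMO/LUMO energies. The orbital energies below (in eV) are invented for illustration; they are not the paper's computed values for DY 119 or its derivatives.

```python
# Reactivity descriptors from frontier orbital energies, within Koopmans-type
# approximations: I ~ -E_HOMO, A ~ -E_LUMO; a smaller HOMO-LUMO gap and
# smaller chemical hardness indicate a more reactive (easier to degrade) dye.

def reactivity_descriptors(e_homo, e_lumo):
    i = -e_homo                 # ionization potential (Koopmans)
    a = -e_lumo                 # electron affinity
    gap = e_lumo - e_homo       # HOMO-LUMO gap (eV)
    eta = (i - a) / 2           # chemical hardness
    chi = (i + a) / 2           # Mulliken electronegativity
    return {"gap": gap, "hardness": eta, "electronegativity": chi}

parent = reactivity_descriptors(e_homo=-5.8, e_lumo=-3.0)      # hypothetical
derivative = reactivity_descriptors(e_homo=-5.5, e_lumo=-3.3)  # hypothetical

# A polar substituent that narrows the gap suggests higher reactivity
more_reactive = derivative["gap"] < parent["gap"]
```

Ranking candidate derivatives by these descriptors, alongside COSMO-RS solubility estimates, is the kind of screening the study's workflow performs before recommending structures for synthesis.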
Procedia PDF Downloads 20826857 Design of Visual Repository, Constraint and Process Modeling Tool Based on Eclipse Plug-Ins
Authors: Rushiraj Heshi, Smriti Bhandari
Abstract:
Master Data Management requires the creation of a central repository, the application of constraints on the repository, and the design of processes to manage data. Designing the repository, its constraints, and the business processes is a very tedious and time-consuming task for a large enterprise. Hence, visual repository, constraint, and process (workflow) modeling is the most critical step in Master Data Management. In this paper, we realize a visual modeling tool for implementing repositories, constraints, and processes as an Eclipse plug-in using GMF/EMF, following the principles of Model Driven Engineering (MDE).
Keywords: EMF, GMF, GEF, repository, constraint, process
Procedia PDF Downloads 50126856 Major Depressive Disorder: Diagnosis based on Electroencephalogram Analysis
Authors: Wajid Mumtaz, Aamir Saeed Malik, Syed Saad Azhar Ali, Mohd Azhar Mohd Yasin
Abstract:
In this paper, a technique based on electroencephalogram (EEG) analysis is presented, aimed at diagnosing major depressive disorder (MDD) in a potential population of MDD patients and healthy controls. EEG is recognized as a clinical modality in applications such as seizure diagnosis, anesthesia indexing, and the detection of brain death or stroke. However, its usability for psychiatric illnesses such as MDD is less studied. Therefore, for the sake of diagnosis, two groups of study participants were recruited: 1) MDD patients and 2) healthy controls. EEG data acquired from both groups were analyzed for inter-hemispheric asymmetry and the composite permutation entropy index (CPEI). To automate the process, the quantities derived from the EEG were used as inputs to classifiers, namely logistic regression (LR) and a support vector machine (SVM). The learning of these classification models was tested with a test dataset, and their efficiency is reported as the accuracy of classifying MDD patients versus controls, along with the corresponding sensitivities and specificities (LR = 81.7% and SVM = 81.5% accuracy). Based on the results, it is concluded that the derived measures are indicators for diagnosing MDD against a population of normal controls. In addition, the results motivate further exploration of other measures for the same purpose.
Keywords: major depressive disorder, diagnosis based on EEG, EEG derived features, CPEI, inter-hemispheric asymmetry
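The classifier stage can be sketched as a minimal logistic regression on an EEG-derived scalar. The asymmetry values and labels below are synthetic stand-ins, and a single feature replaces the paper's full feature set; the sketch only shows the pipeline shape from derived feature to diagnostic label.

```python
# Toy stand-in for the classification stage: an EEG-derived feature (e.g., an
# inter-hemispheric asymmetry index) feeds a logistic regression separating
# MDD patients (label 1) from controls (label 0). All numbers are synthetic;
# the real study reported ~81.7% accuracy for LR on clinical recordings.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(xs, ys, lr=0.5, epochs=2000):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):        # plain stochastic gradient descent
            p = sigmoid(w * x + b)
            w -= lr * (p - y) * x
            b -= lr * (p - y)
    return w, b

def predict(w, b, x):
    return 1 if sigmoid(w * x + b) >= 0.5 else 0  # 1 = MDD, 0 = control

# Synthetic asymmetry indices: the MDD group skews higher in this toy data
asymmetry = [0.10, 0.15, 0.20, 0.25, 0.60, 0.65, 0.70, 0.80]
labels    = [0, 0, 0, 0, 1, 1, 1, 1]

w, b = train_logistic(asymmetry, labels)
accuracy = sum(predict(w, b, x) == y for x, y in zip(asymmetry, labels)) / 8
```

In practice the features are multivariate (asymmetry plus CPEI and others), training accuracy is checked against a held-out test set, and sensitivity/specificity are reported alongside accuracy.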
Procedia PDF Downloads 54926855 Strategies Used by the Saffron Producers of Taliouine (Morocco) to Adapt to Climate Change
Authors: Aziz Larbi, Widad Sadok
Abstract:
In Morocco, the mountainous regions extend over about 26% of the national territory, and 30% of the total population lives there. They offer opportunities for agriculture, forestry, pastureland, and mining. The production systems in these zones are characterised by crop diversification. However, these areas have become vulnerable to the effects of climate change. To understand these effects in relation to the population living in these areas, a study was carried out in the zone of Taliouine, in the Anti-Atlas. The vulnerability of crop production to climate change was analysed, and the different ways farmers adapt were identified. The work focused on saffron, the most profitable crop in the target area even though it requires much water. Our results show that the majority of the farmers surveyed had noticed variations in the climate of the region: irregular precipitation, leading to a decrease in quantity and an uneven distribution throughout the year; rising temperatures; a shorter cold period; and less snow. These variations had impacts on the saffron cropping system and its productivity. To cope with these effects, the farmers adopted various strategies: better management and use of water; diversification of agricultural activities; an increase in the contribution of non-agricultural activities to their gross income; and seasonal migration.
Keywords: climate change, Taliouine, saffron, perceptions, adaptation strategies
Procedia PDF Downloads 65