Search results for: seismic data processing
26713 Water Demand Modelling Using Artificial Neural Network in Ramallah
Authors: F. Massri, M. Shkarneh, B. Almassri
Abstract:
Water scarcity and increasing water demand, especially for residential use, are major challenges facing Palestine. The ability to accurately forecast water consumption is useful for the planning and management of this natural resource. The main objectives of this paper are to (i) study the major factors influencing water consumption in Palestine, (ii) understand the general pattern of household water consumption, (iii) assess possible changes in household water consumption and suggest appropriate remedies, and (iv) develop a prediction model for water consumption in Palestinian cities based on an Artificial Neural Network. The paper is organized in four parts. The first part is a literature review of household water consumption studies. The second part concerns the data collection methodology, the conceptual framework for the household water consumption surveys, survey descriptions, and data processing methods. The third part presents descriptive statistics, multiple regression, and an analysis of water consumption in the two Palestinian cities. The final part develops the use of an Artificial Neural Network for modelling water consumption in Palestinian cities.
Keywords: water management, demand forecasting, consumption, ANN, Ramallah
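The abstract does not give the network itself; a minimal sketch of the kind of feed-forward regression model described, assuming hypothetical survey features and scikit-learn, could look like this:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical survey features: household size, monthly income, season index
X = np.array([[4, 900, 1], [6, 1200, 2], [3, 700, 3], [5, 1500, 4]], dtype=float)
y = np.array([12.5, 18.0, 9.8, 16.2])  # monthly consumption in m^3 (illustrative)

# One hidden layer; inputs are standardized before training
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                                   max_iter=5000, random_state=0))
model.fit(X, y)
print(model.predict([[4, 1000, 2]]))  # forecast for a new household
```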
Procedia PDF Downloads 219
26712 Ship Detection Requirements Analysis for Different Sea States: Validation on Real SAR Data
Authors: Jaime Martín-de-Nicolás, David Mata-Moya, Nerea del-Rey-Maestre, Pedro Gómez-del-Hoyo, María-Pilar Jarabo-Amores
Abstract:
Ship detection is nowadays an important issue in tasks related to sea traffic control, fishery management, and ship search and rescue. Although it has traditionally been carried out by patrol ships or aircraft, coverage limitations, weather conditions, and sea state can become a problem. Synthetic aperture radars can overcome these coverage limitations and work under any climatological condition. A fast CFAR ship detector based on a robust statistical modelling of sea clutter with respect to sea states in SAR images is used. In this paper, the minimum SNR required to obtain a given detection probability with a given false alarm rate for any sea state is determined. A Gaussian target model using real SAR data is considered. Results show that the required SNR does not depend heavily on the class considered. Provided there is some variation in the backscattering of targets in SAR imagery, the detection probability is limited, and a post-processing stage based on morphology would be suitable.
Keywords: SAR, generalized gamma distribution, detection curves, radar detection
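For context, a closed form often quoted for a Rayleigh-fluctuating (Swerling I) target in Gaussian noise links detection probability, false alarm probability, and SNR; the sketch below inverts it to obtain the minimum SNR. This is an illustrative assumption, not the paper's exact detector model:

```python
import math

def min_snr_db(pd: float, pfa: float) -> float:
    """Swerling-I closed form Pd = Pfa**(1/(1+SNR)), solved for SNR (in dB)."""
    snr = math.log(pfa) / math.log(pd) - 1.0
    return 10.0 * math.log10(snr)

# e.g. Pd = 0.9 at Pfa = 1e-4 requires roughly 19.4 dB
print(round(min_snr_db(0.9, 1e-4), 1))
```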
Procedia PDF Downloads 453
26711 2D-Modeling with Lego Mindstorms
Authors: Miroslav Popelka, Jakub Nozicka
Abstract:
This work is based on the possibility of using Lego Mindstorms robotics systems to reduce costs. Lego Mindstorms consists of a wide variety of hardware components necessary to simulate, program, and test robotics systems in practice. The algorithm that maps the space using the ultrasonic sensor was programmed in the development environment supplied with the kit. Matlab was used to render the values after they were measured by the ultrasonic sensor. The algorithm created for this paper uses theoretical knowledge from the area of signal processing. The data processed by the algorithm are collected by an ultrasonic sensor that scans the 2D space in front of it. The sensor is placed on a moving arm of the robot, which provides its horizontal movement; vertical movement of the sensor is provided by the wheel drive. The robot follows a map in order to position the measured data correctly. Based on these findings, Lego Mindstorms can be considered a low-cost and capable kit for real-time modelling.
Keywords: LEGO Mindstorms, ultrasonic sensor, real-time modeling, 2D object, low-cost robotics systems, sensors, Matlab, EV3 Home Edition Software
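A minimal sketch of the kind of 2D map reconstruction described, accumulating ultrasonic distance readings over a horizontal/vertical scan grid, with hypothetical sample values:

```python
import numpy as np

# Hypothetical scan: (arm position cm, wheel position cm, measured distance cm)
readings = [(0, 0, 35.0), (5, 0, 34.2), (10, 0, 60.1),
            (0, 5, 33.8), (5, 5, 33.5), (10, 5, 59.7)]

xs = sorted({r[0] for r in readings})
ys = sorted({r[1] for r in readings})
grid = np.full((len(ys), len(xs)), np.nan)

for x, y, d in readings:
    grid[ys.index(y), xs.index(x)] = d  # distance map of the scanned space

print(grid)  # in Matlab, such a grid would be rendered with surf() or imagesc()
```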
Procedia PDF Downloads 473
26710 Constructing a Semi-Supervised Model for Network Intrusion Detection
Authors: Tigabu Dagne Akal
Abstract:
While advances in computer and communications technology have made the network ubiquitous, they have also rendered networked systems vulnerable to malicious attacks devised from a distance. These attacks or intrusions start with attackers infiltrating a network through a vulnerable host and then launching further attacks on the local network or intranet. Nowadays, system administrators and network professionals can attempt to prevent such attacks by developing intrusion detection tools and systems using data mining technology. In this study, the experiments were conducted following the Knowledge Discovery in Databases process model, which starts from the selection of datasets. The dataset used in this study was taken from the Massachusetts Institute of Technology Lincoln Laboratory. After acquisition, the data were pre-processed. The major pre-processing activities for this study were filling in missing values, removing outliers, resolving inconsistencies, integrating labelled and unlabelled datasets, dimensionality reduction, size reduction, and data transformation activities such as discretization. A total of 21,533 intrusion records were used for training the models, and a separate set of 3,397 records was used for validating the performance of the selected model. For building a predictive model for intrusion detection, the J48 decision tree and Naïve Bayes algorithms were tested as classification approaches, both with and without feature selection. The model created using 10-fold cross-validation with the J48 decision tree algorithm and default parameter values showed the best classification accuracy, with a prediction accuracy of 96.11% on the training dataset and 93.2% on the test dataset for classifying new instances into the normal, DOS, U2R, R2L, and probe classes. The findings of this study show that data mining methods generate interesting rules that are crucial for intrusion detection and prevention in the networking industry. Future research directions toward an applicable system in the area of the study are suggested.
Keywords: intrusion detection, data mining, computer science
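The best-performing setup (J48 with 10-fold cross-validation) can be approximated with open tooling; the sketch below uses scikit-learn's CART tree as a stand-in for J48 (C4.5) and a synthetic five-class dataset in place of the Lincoln Laboratory records:

```python
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_classification

# Synthetic stand-in for the preprocessed intrusion records (21,533 in the study)
X, y = make_classification(n_samples=2000, n_features=20, n_classes=5,
                           n_informative=10, random_state=0)

# CART tree with default parameters, evaluated by 10-fold cross-validation,
# mirroring the J48 setup reported in the abstract
clf = DecisionTreeClassifier(random_state=0)
scores = cross_val_score(clf, X, y, cv=10)
print(f"mean accuracy: {scores.mean():.3f}")
```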
Procedia PDF Downloads 297
26709 Harnessing the Benefits and Mitigating the Challenges of Neurosensitivity for Learners: A Mixed Methods Study
Authors: Kaaryn Cater
Abstract:
People vary in how they perceive, process, and react to internal, external, social, and emotional environmental factors; some are more sensitive than others. Highly sensitive people have a highly reactive nervous system and are more impacted by positive and negative environmental conditions (Differential Susceptibility). Further, some sensitive individuals are disproportionately able to benefit from positive and supportive environments without necessarily suffering negative impacts in less supportive environments (Vantage Sensitivity). Environmental sensitivity is underpinned by physiological, genetic, and personality/temperamental factors, and the phenotypic expression of high sensitivity is Sensory Processing Sensitivity. The hallmarks of Sensory Processing Sensitivity are deep cognitive processing, emotional reactivity, high levels of empathy, noticing environmental subtleties, a tendency to pause and observe in new and novel situations, and a propensity to become overwhelmed when over-stimulated. Educational advantages associated with high sensitivity include creativity, enhanced memory, divergent thinking, giftedness, and metacognitive monitoring. High sensitivity can also lead to some educational challenges, particularly managing multiple conflicting demands and negotiating low sensory thresholds. A mixed methods study was undertaken. In the first, quantitative, study, participants completed the Perceived Success in Study Survey (PSISS) and the Highly Sensitive Person Scale (HSPS-12). The inclusion criterion was current or previous post-secondary education experience. The survey was presented on social media, and snowball recruitment was employed (n=365). The spreadsheets were uploaded to the Statistical Package for the Social Sciences (SPSS) 26, and descriptive statistics found normal distribution. T-tests and analysis of variance (ANOVA) calculations found no difference in the responses of demographic groups, and Principal Components Analysis and post-hoc Tukey calculations identified positive associations between high sensitivity and three of the five PSISS factors. Further ANOVA calculations found positive associations between the PSISS and two of the three sensitivity subscales. The study included a response field to register interest in further research, and respondents who scored in the 70th percentile on the HSPS-12 were invited to participate in a semi-structured interview. Thirteen interviews were conducted remotely (12 female). Reflexive inductive thematic analysis was employed to analyse the data, and a descriptive approach was employed to present data reflective of participant experience. The results of this study found that highly sensitive students prioritize work-life balance; employ a range of practical metacognitive study and self-care strategies; value independent learning; connect with learning that is meaningful; and are bothered by aspects of the physical learning environment, including lighting, noise, and indoor environmental pollutants. There is a dearth of research investigating sensitivity in the educational context, and these studies highlight the need to promote widespread education-sector awareness of environmental sensitivity, and the need to include sensitivity in sector and institutional diversity and inclusion initiatives.
Keywords: differential susceptibility, highly sensitive person, learning, neurosensitivity, sensory processing sensitivity, vantage sensitivity
Procedia PDF Downloads 66
26708 Algorithms Used in Spatial Data Mining GIS
Authors: Vahid Bairami Rad
Abstract:
Extracting knowledge from spatial data such as GIS data is important for reducing the data and extracting information. The development of new techniques and tools that support the human in transforming data into useful knowledge has therefore been the focus of the relatively new and interdisciplinary research area of knowledge discovery in databases. Thus, we introduce a set of database primitives, or basic operations, for spatial data mining which are sufficient to express most of the spatial data mining algorithms from the literature. This approach has several advantages. Similar to the relational standard language SQL, the use of standard primitives will speed up the development of new data mining algorithms and will also make them more portable. We introduce a database-oriented framework for spatial data mining which is based on the concepts of neighborhood graphs and paths. A small set of basic operations on these graphs and paths is defined as database primitives for spatial data mining. Furthermore, techniques to efficiently support the database primitives in a commercial DBMS are presented.
Keywords: spatial database, knowledge discovery in databases, data mining, spatial relationship, predictive data mining
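A minimal sketch of the neighborhood-graph primitives described, building a graph whose edges link spatial objects satisfying a distance-based neighborhood relation, on assumed toy data:

```python
from math import dist

# Hypothetical spatial objects: id -> (x, y)
objects = {"a": (0, 0), "b": (1, 1), "c": (5, 5), "d": (1.5, 0.5)}

def neighborhood_graph(objs, max_d):
    """Database primitive: edge (p, q) iff the neighborhood relation holds."""
    return {(p, q) for p in objs for q in objs
            if p != q and dist(objs[p], objs[q]) <= max_d}

def neighbors(graph, node):
    """Database primitive: all objects directly related to `node`."""
    return {q for p, q in graph if p == node}

g = neighborhood_graph(objects, max_d=2.0)
print(neighbors(g, "a"))  # {'b', 'd'}
```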
Procedia PDF Downloads 462
26707 A Cognitive Training Program in Learning Disability: A Program Evaluation and Follow-Up Study
Authors: Krisztina Bohacs, Klaudia Markus
Abstract:
To the authors' best knowledge, there is an absence of studies on cognitive program evaluation, and we are certainly short of programs proven to have high effect sizes with strong retention results. The purpose of our study was to investigate the effectiveness of a comprehensive cognitive training program, namely BrainRx. This cognitive rehabilitation program targets and remediates seven core cognitive skills and related systems of sub-skills through repeated engagement in game-like mental procedures delivered one-on-one by a clinician, supplemented by digital training. A large sample of children with learning disabilities was given pre-test and post-test cognitive assessments. The experimental group completed a twenty-week cognitive training program in a BrainRx center. A matched control group received another twenty-week intervention with Feuerstein's Instrumental Enrichment programs, and a second matched control group did not receive training. For the pre- and post-tests, we used a general intelligence test to assess IQ and a computer-based test battery for assessing cognition across the lifespan. Multiple regression analyses indicated that the experimental BrainRx treatment group had statistically significantly higher outcomes in attention, working memory, processing speed, logic and reasoning, auditory processing, visual processing, and long-term memory compared to the non-treatment control group, with very large effect sizes. With the exception of logic and reasoning, the BrainRx treatment group realized significantly greater gains in six of the seven cognitive measures compared to the Feuerstein control group. Our one-year retention measures showed that retention of the cognitive training gains was above ninety percent, with the greatest retention in visual processing, auditory processing, and logic and reasoning. The BrainRx program may be an effective tool to establish long-term cognitive changes in students with learning disabilities. Recommendations are made for treatment centers and special education institutions on the cognitive training of students with special needs. The importance of our study is that a targeted, systematic, progressively loaded, and intensive brain training approach may significantly change learning disabilities.
Keywords: cognitive rehabilitation training, cognitive skills, learning disability, permanent structural cognitive changes
Procedia PDF Downloads 202
26706 A Decadal Flood Assessment Using Time-Series Satellite Data in Cambodia
Authors: Nguyen-Thanh Son
Abstract:
Floods are among the most frequent and costliest natural hazards. Flood disasters especially affect poor people in rural areas, who are heavily dependent on agriculture and have lower incomes. Cambodia is identified as one of the most climate-vulnerable countries in the world, ranked 13th out of 181 countries most affected by the impacts of climate change. Flood monitoring is thus a strategic priority at national and regional levels because policymakers need reliable spatial and temporal information on flood-prone areas to form successful monitoring programs to reduce possible impacts on the country's economy and people's livelihoods. This study aims to develop methods for flood mapping and assessment from MODIS data in Cambodia. We processed the data for the period from 2000 to 2017, following three main steps: (1) data pre-processing to construct smooth time series of vegetation and water surface indices, (2) delineation of flood-prone areas, and (3) accuracy assessment. The results of flood mapping were verified against ground reference data, indicating an overall accuracy of 88.7% and a Kappa coefficient of 0.77. These results were reaffirmed by the close agreement between the flood-mapped area and the ground reference data, with a coefficient of determination (R²) of 0.94. The seasonally flooded areas observed for 2010, 2015, and 2016 were remarkably smaller than in other years, mainly attributed to the El Niño weather phenomenon exacerbated by the impacts of climate change. Although several sources potentially lowered the mapping accuracy of flood-prone areas, including image cloud contamination, mixed-pixel issues, and low-resolution bias between the mapping results and the ground reference data, our methods produced satisfactory results for delineating the spatiotemporal evolution of floods. The results, in the form of quantitative information on spatiotemporal flood distributions, could be beneficial to policymakers in evaluating their management strategies for mitigating the negative effects of floods on agriculture and people's livelihoods in the country.
Keywords: MODIS, flood, mapping, Cambodia
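The accuracy-assessment step can be made concrete; a sketch computing overall accuracy and the Kappa coefficient from a flood/non-flood confusion matrix, using hypothetical counts:

```python
import numpy as np

# Hypothetical confusion matrix: rows = mapped class, cols = reference class
# classes: [flooded, non-flooded]
cm = np.array([[420, 55],
               [48, 390]], dtype=float)

n = cm.sum()
overall_accuracy = np.trace(cm) / n
expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # chance agreement
kappa = (overall_accuracy - expected) / (1 - expected)

print(f"OA = {overall_accuracy:.3f}, Kappa = {kappa:.3f}")
```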
Procedia PDF Downloads 128
26705 Physical Properties of Nine Nigerian Staple Food Flours Related to Bulk Handling and Processing
Authors: Ogunsina Babatunde, Aregbesola Omotayo, Adebayo Adewale, Odunlami Johnson
Abstract:
The physical properties of nine Nigerian staple food flours related to bulk handling and processing were investigated following standard procedures. The results showed that the moisture content, bulk density, angle of repose, water absorption capacity, swelling index, dispersibility, pH, and wettability of the flours ranged from 9.95 to 11.98%, 0.44 to 0.66 g/cm³, 31.43 to 39.65°, 198.3 to 291.7 g of water/100 g of sample, 5.53 to 7.63, 60.3 to 73.8%, 4.43 to 6.70, and 11 to 150 s, respectively. The particle size analysis of the flour samples indicated significant differences (p<0.05). The least gelation concentration of the flour samples ranged from 6 to 14%. The colour of the flours fell between light and saturated, with the exception of the cassava, millet, and maize flours, which appeared dark and dull. The properties of food flours depend largely on the inherent properties of the food material and may influence their functional behaviour as food materials.
Keywords: properties, flours, staple food, bulk handling
Procedia PDF Downloads 482
26704 CRM Cloud Computing: An Efficient and Cost Effective Tool to Improve Customer Interactions
Authors: Gaurangi Saxena, Ravindra Saxena
Abstract:
Lately, cloud computing has been used to enhance the ability to attain corporate goals more effectively and efficiently at lower cost. This computing paradigm has emerged as a powerful tool for optimum utilization of resources and for gaining competitiveness through cost reduction, achieving business goals with greater flexibility. Realizing the importance of this technique, most of the well-known companies in the computer industry, such as Microsoft, IBM, Google, and Apple, are spending millions of dollars researching cloud computing and investigating the possibility of producing interface hardware for cloud computing systems. It is believed that, by using the right middleware, a cloud computing system can execute all the programs a normal computer could run. Potentially, everything from the simplest generic word processing software to highly specialized and customized programs designed for a specific company could work successfully on a cloud computing system. A cloud is a pool of virtualized computer resources. Clouds are not limited to grid environments, but also support interactive user-facing applications such as web applications and three-tier architectures. Cloud computing is not a fundamentally new paradigm; it draws on existing technologies and approaches, such as utility computing, software-as-a-service, distributed computing, and centralized data centers. Some companies rent physical space to store servers and databases because they do not have it available on site. Cloud computing gives these companies the option of storing data on someone else's hardware, removing the need for physical space on the front end. Prominent service providers such as Amazon, Google, SUN, IBM, Oracle, and Salesforce are extending computing infrastructures and platforms as a core for providing top-level services for computation, storage, databases, and applications. Application services may include email, office applications, finance, video, audio, and data processing. By using a cloud computing system, a company can improve its customer relationship management. A CRM cloud computing system may be highly useful in delivering a sales team a blend of unique functionalities to improve agent/customer interactions. This paper first defines cloud computing as a tool for running business activities more effectively and efficiently at a lower cost, and then distinguishes cloud computing from grid computing. Based on an exhaustive literature review, the authors discuss the application of cloud computing in different disciplines of management, especially in the field of marketing, with special reference to the use of cloud computing in CRM. The study concludes that a CRM cloud computing platform helps a company track any data, such as orders, discounts, references, competitors, and many more. By using CRM cloud computing, companies can improve their customer interactions and, by serving customers more efficiently at a lower cost, gain competitive advantage.
Keywords: cloud computing, competitive advantage, customer relationship management, grid computing
Procedia PDF Downloads 312
26703 Data Stream Association Rule Mining with Cloud Computing
Authors: B. Suraj Aravind, M. H. M. Krishna Prasad
Abstract:
There are emerging applications of data streams that require association rule mining, such as network traffic monitoring, web click stream analysis, sensor data, and data from satellites. Data streams typically arrive continuously at high speed, in huge volumes, and with changing data distributions. This raises new issues that need to be considered when developing association rule mining techniques for stream data. This paper proposes an improved data stream association rule mining algorithm that eliminates the limitation of resources by using the concept of cloud computing. Its inclusion may lead to additional, as yet unknown, problems that need further research.
Keywords: data stream, association rule mining, cloud computing, frequent itemsets
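The improved algorithm itself is not given in the abstract; a minimal sketch of sliding-window frequent-itemset counting, the basic building block of stream association rule mining, on assumed transactions:

```python
from collections import Counter, deque
from itertools import combinations

WINDOW, MIN_SUP = 4, 2
window, counts = deque(), Counter()

def add_transaction(tx):
    """Slide the window: count the new tx's itemsets, evict the oldest tx."""
    window.append(tx)
    for r in (1, 2):
        counts.update(combinations(sorted(tx), r))
    if len(window) > WINDOW:
        old = window.popleft()
        for r in (1, 2):
            counts.subtract(combinations(sorted(old), r))

for tx in [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b", "c"}, {"a", "b"}]:
    add_transaction(tx)

print({k: v for k, v in counts.items() if v >= MIN_SUP})  # frequent itemsets
```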
Procedia PDF Downloads 503
26702 Development of Mobile Application for Internship Program Management Using the Concept of Model View Controller (MVC) Pattern
Authors: Shutchapol Chopvitayakun
Abstract:
Over the last five years especially, mobile devices, mobile applications, and mobile users have grown significantly bigger and stronger through the deployment of wireless communication and cellular phone networks, and they are being integrated with each other to serve multiple purposes and pervasive deployments in every business and non-business sector, such as education, medicine, travel, finance, and real estate. The objective of this study was to develop a mobile application for seniors, or final-year students, who enroll in the internship program at a tertiary (undergraduate) school and do onsite practice at real field sites, real organizations, and real workspaces. During the internship session, all students, as interns, are required to practice and train onsite at specific locations on specific tasks, possibly with assignments from their supervisors. Their workplaces include both private and government corporations and enterprises. This mobile application is developed on the schema of a transactional processing system that enables users to keep a daily work or practice log, monitor true working locations, and follow the daily tasks of each trainee. Moreover, it provides useful guidance from each intern's advisor in case of emergency. Finally, it can summarize all transactional data and then calculate each intern's cumulative internship hours from the field practice session.
Keywords: internship, mobile application, Android OS, smart phone devices, mobile transactional processing system, guidance and monitoring, tertiary education, senior students, model view controller (MVC)
Procedia PDF Downloads 315
26701 Potential of Safflower (Carthamus tinctorius L.) for Phytoremediation of Soils Contaminated with Heavy Metals
Authors: Violina R. Angelova, Vanja I. Akova, Stefan V. Krustev, Krasimir I. Ivanov
Abstract:
A field study was conducted to evaluate the efficacy of the safflower plant for phytoremediation of contaminated soils. The experiment was performed on agricultural fields contaminated by the Non-Ferrous-Metal Works near Plovdiv, Bulgaria. The concentrations of Pb, Zn, and Cd in safflower (roots, stems, leaves, and seeds), safflower oil, and meal were determined. A correlation was found between the quantity of the mobile forms and the uptake of Pb, Zn, and Cd by the safflower seeds. Safflower is a plant that is tolerant to heavy metals and can be grown on contaminated soils; it can be classified among the hyperaccumulators of cadmium and the accumulators of lead and zinc, and can be successfully used in the phytoremediation of heavy metal contaminated soils. Processing the seeds to oil and using the obtained oil for nutritional purposes will greatly reduce the cost of phytoremediation, and the possibility of further industrial processing will make safflower an economically interesting crop for farmers applying phytoremediation technology.
Keywords: heavy metals, phytoremediation, polluted soils, safflower
Procedia PDF Downloads 320
26700 Strategies of Risk Management for Smallholder Farmers in South Africa: A Case Study on Pigeonpea (Cajanus cajan) Production
Authors: Sanari Chalin Moriri, Kwabena Kingsley Ayisi, Alina Mofokeng
Abstract:
Dryland smallholder farmers in South Africa are vulnerable to all kinds of risks, which negatively affect crop productivity and profit. Pigeonpea is a leguminous, multipurpose crop that provides food, fodder, and wood for smallholder farmers. The majority of these farmers are still growing pigeonpea from traditional, unimproved seeds, which comprise a mixture of genotypes. The objectives of the study were to identify the key risk factors that affect pigeonpea productivity and to develop management strategies for alleviating the risk factors in pigeonpea production. The study was conducted in two provinces (Limpopo and Mpumalanga) of South Africa in six municipalities during the 2020/2021 growing season. A non-probability sampling method using purposive and snowball sampling techniques was used to collect data from the farmers through a structured questionnaire. A total of 114 pigeonpea producers were interviewed individually. Key stakeholders in each municipality were also identified, invited, and interviewed to verify the information given by the farmers. The data collected were analysed using SPSS statistical software version 25. The findings were that the majority of farmers affected by risk factors were women, subsistence farmers, and elderly farmers, resulting in low food production. Drought, unavailability of improved pigeonpea seeds for planting, access to information, and processing equipment were found to be the main risk factors contributing to low crop productivity in farmers' fields. Over 80% of farmers lack knowledge on improvement of the crop and on the processing techniques needed to secure high prices during the crop off-season. Market availability, pricing, and the incidence of pests and diseases were found to be minor risk factors, triggered by the major risk factors; the minor risk factors can be corrected only if the major risk factors are first given the necessary attention. About 10% of the farmers were found to use the crop as a mulch to reduce soil temperatures and improve soil fertility. The study revealed that most of the farmers were unaware of its utilisation as fodder, mulch, and medicine, for nitrogen fixation, and more. The risk of frequent drought in dry areas of South Africa, where farmers depend solely on rainfall, poses a serious threat to crop productivity. The majority of these risk factors are caused by climate change: erratic, low rainfall with extreme temperatures threatens food security, water, and the environment. The use of drought-tolerant, multipurpose legume crops such as pigeonpea, access to new information, provision of processing equipment, and support from all stakeholders will help in addressing food security for smallholder farmers. Policies should be revisited to address the prevailing risk factors faced by farmers, and farmers should be involved in addressing them. Awareness should be prioritized in promoting the crop to improve its production and commercialization in the dryland farming system of South Africa.
Keywords: management strategies, pigeonpea, risk factors, smallholder farmers
Procedia PDF Downloads 213
26699 Novel Coprocessor for DNA Sequence Alignment in Resequencing Applications
Authors: Atef Ibrahim, Hamed Elsimary, Abdullah Aljumah, Fayez Gebali
Abstract:
This paper presents a novel semi-systolic array architecture for an optimized parallel sequence alignment algorithm. This architecture has the advantage that it can be modified and reused for multiple-pass processing, in order to increase the number of processing elements that can be packed into a single FPGA and the number of sequences that can be aligned in parallel in a single FPGA. This resolves the potential problem of many FPGA resources being left unused, for designs that have large values of short read length, when using the previously published conventional hardware design. FPGA implementation results show that, for large values of short read length (M>128), the proposed design has a slightly higher speedup and FPGA utilization than the conventional one.
Keywords: bioinformatics, genome sequence alignment, re-sequencing applications, systolic array
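The authors' exact alignment scheme is not reproduced in the abstract; as an illustration of what each processing element in such an array evaluates, a sketch of the standard Smith-Waterman cell recurrence (cells on one anti-diagonal are independent, which is what the systolic parallelism exploits):

```python
def sw_cell(h_diag, h_up, h_left, a, b, match=2, mismatch=-1, gap=-1):
    """Score of one DP cell; in a systolic array each PE computes one such
    cell per cycle, with all cells on an anti-diagonal done in parallel."""
    sub = match if a == b else mismatch
    return max(0, h_diag + sub, h_up + gap, h_left + gap)

def smith_waterman(s, t):
    H = [[0] * (len(t) + 1) for _ in range(len(s) + 1)]
    for i in range(1, len(s) + 1):
        for j in range(1, len(t) + 1):
            H[i][j] = sw_cell(H[i-1][j-1], H[i-1][j], H[i][j-1], s[i-1], t[j-1])
    return max(map(max, H))

print(smith_waterman("ACACACTA", "AGCACACA"))  # best local alignment score
```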
Procedia PDF Downloads 532
26698 Shock and Particle Velocity Determination from Microwave Interrogation
Authors: Benoit Rougier, Alexandre Lefrancois, Herve Aubert
Abstract:
Microwave interrogation in the range 10-100 GHz is identified as an advanced technique for simultaneous shock and particle velocity measurements. However, it requires an understanding of electromagnetic wave propagation in multi-layered moving media. Existing models limit their approach to waveguides or evaluate the velocities with a fitting method, therefore restricting the domain of validity and the precision of the results. Moreover, few permittivity data on high explosives at these frequencies under dynamic compression have been reported. In this paper, shock and particle velocities are computed concurrently for steady and unsteady shocks in various inert and reactive materials, via a propagation model based on Doppler shifts and signal amplitude. The refractive index of the material under compression is also calculated. It is demonstrated from experimental data processing that the Hugoniot curve can be evaluated. The comparison with published results proves the accuracy of the proposed method. This microwave interrogation technique seems promising for studies of shock and detonation waves.
Keywords: electromagnetic propagation, experimental setup, Hugoniot measurement, shock propagation
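The velocity extraction rests on the Doppler relation for a reflector moving inside a dielectric; stated here as background (a simplification of the paper's multi-layer propagation model):

```latex
% Doppler shift of a reflecting front (shock or particle interface) moving at
% velocity v inside a medium of refractive index n = \sqrt{\varepsilon_r},
% probed at carrier frequency f_0:
f_d = \frac{2\, n\, v\, f_0}{c}
\quad\Longrightarrow\quad
v = \frac{c\, f_d}{2\, n\, f_0}
```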
Procedia PDF Downloads 213
26697 Quranic Recitation Listening Relates to Memory Processing, Language Selectivity and Attentional Process
Authors: Samhani Ismail, Tahamina Begum, Faruque Reza, Zamzuri Idris, Hafizan Juahir, Jafri Malin Abdullah
Abstract:
The Holy Quran, a rhymed prose scripture, has a complete literary structure that exemplifies the peak of literary beauty. Memorizing its verses could enhance one's memory capacity and cognition, and for those listening to its recitation, it is also believed that the Holy Quran alters brainwaves, producing neuronal excitation engaged with cognitive processes. Twenty-eight normal healthy subjects (male = 14, female = 14) were recruited, and EEG recording was done using a 128-electrode sensor net (Electrical Geodesics, Inc.) with an impedance of ≤ 50 kΩ. They listened to Sura Fatiha recited by Sheikh Qari Abdul Basit bin Abdus Samad. Arabic news and no sound were chosen as the positive and negative controls, respectively. The waveform was analysed by Fast Fourier Transform (FFT) to obtain the power in frequency bands. The bilateral frontal (F7, F8) and temporal (T7, T8) regions showed significantly decreased power in the alpha band in respondents stimulated by the Sura Fatiha recitation, reflecting acoustic attention processing. The decrease in alpha power during selective attention to memorized language, and to familiar but not memorized language, reveals memory processing in long-term memory. In conclusion, Quranic recitation engages the cognitive elements of both memory and language in its listeners and memorizers.
Keywords: auditory stimulation, cognition, EEG, linguistic, memory, Quranic recitation
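The FFT band-power analysis described can be sketched with standard tooling, assuming a synthetic signal and SciPy's Welch estimator in place of the authors' exact pipeline:

```python
import numpy as np
from scipy.signal import welch

fs = 250  # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
# Synthetic one-channel EEG: a 10 Hz alpha component plus noise
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

f, psd = welch(eeg, fs=fs, nperseg=fs * 2)
alpha = (f >= 8) & (f <= 13)
alpha_power = np.trapz(psd[alpha], f[alpha])  # integrate PSD over 8-13 Hz
print(f"alpha band power: {alpha_power:.3f}")
```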
Procedia PDF Downloads 342
26696 A Comprehensive Survey and Improvement to Existing Privacy Preserving Data Mining Techniques
Authors: Tosin Ige
Abstract:
Ethics must be a condition of the world, like logic (Ludwig Wittgenstein, 1889-1951). As important as data mining is, it poses a significant threat to ethics, privacy, and legality, since data mining makes it difficult for an individual or consumer (in the case of a company) to control the accessibility and usage of his or her data. This research focuses on current issues and the latest research and development in privacy-preserving data mining methods as of 2022. It discusses advances in those techniques while highlighting and providing a new technique as a solution to the shortcomings of an existing privacy-preserving data mining method. This paper also bridges the wide gap between data mining and the Web Application Programming Interface (web API), where research is urgently needed for an added layer of security in data mining, while at the same time introducing a seamless and more efficient way of data mining.
Keywords: data, privacy, data mining, association rule, privacy preserving, mining technique
Procedia PDF Downloads 173
26695 Evaluation of the Boiling Liquid Expanding Vapor Explosion Thermal Effects in Hassi R'Mel Gas Processing Plant Using Fire Dynamics Simulator
Authors: Brady Manescau, Ilyas Sellami, Khaled Chetehouna, Charles De Izarra, Rachid Nait-Said, Fati Zidani
Abstract:
During a fire in an oil and gas refinery, several thermal accidents can occur and cause serious damage to people and the environment. Among these accidents, the BLEVE (Boiling Liquid Expanding Vapor Explosion) is the most observed and remains a major concern for risk decision-makers. It corresponds to a violent vaporization of explosive nature following the rupture of a vessel containing a liquid at a temperature significantly higher than its normal boiling point at atmospheric pressure. Its effects on the environment generally appear in three ways: blast overpressure, radiation from the fireball if the liquid involved is flammable, and fragment hazards. In order to estimate the potential damage that would be caused by such an explosion, risk decision-makers often use quantitative risk analysis (QRA). This analysis is a rigorous and advanced approach that requires reliable data in order to obtain a good estimate and control of risks. However, in most cases, the data used in QRA are obtained from empirical correlations. These empirical correlations generally overestimate BLEVE effects because they are based on simplifications and do not take into account real parameters such as the geometry effect. Considering that these risk analyses are based on an assessment of BLEVE effects on human life and plant equipment, more precise and reliable data should be provided. From this point of view, CFD modeling of BLEVE effects appears as a solution to the limitations of the empirical laws. In this context, the main objective is to develop a numerical tool to predict BLEVE thermal effects using the CFD code FDS version 6. Simulations are carried out with a mesh size of 1 m. The fireball source is modeled as a vertical release of hot fuel over a short time. The modeling of fireball dynamics is based on single-step combustion using an EDC model coupled with the default LES turbulence model. Fireball characteristics (diameter, height, heat flux, and lifetime) taken from the large-scale BAM experiment are used to demonstrate the ability of FDS to simulate the various stages of the BLEVE phenomenon, from ignition up to total burnout. The influence of release parameters such as the injection rate and the radiative fraction on the fireball heat flux is also presented. Predictions are very encouraging and show good agreement with the BAM experiment data. In addition, a numerical study is carried out on an operational propane accumulator in an Algerian gas processing plant of the SONATRACH company located in the Hassi R'Mel gas field (the largest gas field in Algeria).
Keywords: BLEVE effects, CFD, FDS, fireball, LES, QRA
Procedia PDF Downloads 186
26694 Health Burden of Disease Assessment for Minimizing Aflatoxin Exposure in Peanuts
Authors: Min-Pei Ling
Abstract:
Aflatoxin is a fungal secondary metabolite with high toxicity that is capable of contaminating various types of food crops. It has been identified as a Group 1 human carcinogen by the International Agency for Research on Cancer. Chronic aflatoxin exposure has caused a worldwide public food safety concern. Peanuts and peanut products are the major sources of aflatoxin exposure, and some reduction interventions have therefore been developed to minimize contamination through the peanut production chain. The purpose of this study is to estimate the efficacy of interventions in reducing the health impact of hepatocellular carcinoma caused by aflatoxin contamination in peanuts. The total disability-adjusted life-years (DALYs) were estimated using the FDA-iRISK online software. Six aflatoxin reduction strategies were evaluated: good agricultural practice (GAP), biocontrol, Purdue Improved Crop Storage packaging, basic processing, ozonolysis, and ultraviolet irradiation. The results indicated that basic processing could prevent a huge public health loss of 4,079.7–21,833 total DALYs per year, which accounted for 39.6% of all averted total DALYs. GAP and biocontrol were both effective strategies in the farm field, while the other three interventions were of limited value in reducing total DALYs. In conclusion, this study could help farmers, processing plants, and government policymakers to alleviate aflatoxin contamination issues in the peanut production chain.
Keywords: aflatoxin, health burden, disability-adjusted life-years, peanuts
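The DALY bookkeeping behind such totals is simple arithmetic; a sketch with hypothetical numbers (the study itself used the FDA-iRISK software):

```python
def dalys(deaths, years_lost, cases, disability_weight, duration):
    """DALY = YLL + YLD: years of life lost plus years lived with disability."""
    yll = deaths * years_lost
    yld = cases * disability_weight * duration
    return yll + yld

# Hypothetical annual hepatocellular-carcinoma burden for one exposure scenario
print(dalys(deaths=250, years_lost=15, cases=300,
            disability_weight=0.5, duration=2.0))  # 4050.0 total DALYs
```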
Procedia PDF Downloads 135
26693 Mechanism of Action of New Sustainable Flame Retardant Additives in Polyamide 6,6
Authors: I. Belyamani, M. K. Hassan, J. U. Otaigbe, W. R. Fielding, K. A. Mauritz, J. S. Wiggins, W. L. Jarrett
Abstract:
We have investigated the flame-retardant efficiency of special new phosphate glass (P-glass) compositions having different glass transition temperatures (Tg), their effect on the processing conditions of polyamide 6,6 (PA6,6), and the final hybrid flame retardancy (FR). We have shown that the low-Tg P-glass composition (i.e., ILT 1) is a promising flame retardant for PA6,6 at concentrations of up to 15 wt. %, compared to the intermediate- (IIT 3) and high-Tg (IHT 1) P-glasses. Cone calorimetry data showed that ILT 1 decreased both the peak heat release rate and the total heat released from the PA6,6/ILT 1 hybrids, resulting in the efficient formation of a glassy char layer. These intriguing findings prompted us to address several questions concerning the mechanism of action of the different P-glasses studied. The general mechanism of action of phosphorus-based FR additives occurs during the combustion stage, by enhancing the morphology of the char and the thermal shielding effect. However, the present work shows that P-glass-based FR additives act during melt processing of the PA6,6/P-glass hybrids. Dynamic mechanical analysis (DMA) revealed that the Tg of PA6,6/ILT 1 was significantly shifted to a lower Tg (~65 °C) and another transition appeared at high temperature (~166 °C), indicating a strong interaction between PA6,6 and ILT 1. This was supported by a drop in the melting point and crystallinity of the PA6,6/ILT 1 hybrid material, as detected by differential scanning calorimetry (DSC). The dielectric spectroscopic investigation of the networks' molecular-level structural variations (i.e., hybrid chain motion, Tg and sub-Tg relaxations) agreed very well with the DMA and DSC findings; it was found that the three different P-glass compositions did not show any effect on the PA6,6 sub-Tg relaxations (related to the NH2 and OH chain-end group motions). Nevertheless, contrary to the IIT 3 and IHT 1 based hybrids, the PA6,6/ILT 1 hybrid material showed evidence of splitting of the PA6,6 Tg relaxations into two peaks. Finally, the CPMAS 31P-NMR data confirmed the miscibility between ILT 1 and PA6,6 at the molecular level, as a much larger enhancement in cross-polarization for the PA6,6/15% ILT 1 hybrids was observed. It can be concluded that compounding low-Tg P-glass (i.e., ILT 1) with PA6,6 facilitates hydrolytic chain scission of the PA6,6 macromolecules through a potential chemical interaction between phosphate and the alpha-carbon of the amide bonds of the PA6,6, leading to better flame-retardant properties.
Keywords: broadband dielectric spectroscopy, composites, flame retardant, polyamide, phosphate glass, sustainable
Procedia PDF Downloads 239
26692 Construction of Graph Signal Modulations via Graph Fourier Transform and Its Applications
Authors: Xianwei Zheng, Yuan Yan Tang
Abstract:
The classical windowed Fourier transform has been widely used in signal processing, image processing, machine learning, and pattern recognition, and the related Gabor transform is powerful enough to capture the texture information of any given dataset. Recently, in the emerging field of graph signal processing, researchers have been developing a theory to handle so-called graph signals, and within it a windowed graph Fourier transform has been constructed to establish a time-frequency analysis framework for graph signals. The windowed graph Fourier transform is defined using translation and modulation operators of graph signals, following calculations similar to those of the classical windowed Fourier transform. Specifically, the translation and modulation operators of graph signals are defined using the Laplacian eigenvectors: whereas the classical translation operator can be defined using the Fourier atoms, graph signal translation is defined analogously using the Laplacian eigenvectors, and the modulation of a graph signal can also be established using the Laplacian eigenvectors. The windowed graph Fourier transform based on these two operators has been applied to obtain time-frequency representations of graph signals. Fundamentally, the existing modulation operator is defined, in analogy to classical modulation, by multiplying a graph signal with the entries of a Laplacian eigenvector. However, a single Laplacian eigenvector entry cannot play the same role as a Fourier atom, and this definition ignores the relationship between the translation and modulation operators. In this paper, a new definition of the modulation operator is proposed, and thus another time-frequency framework for graph signals is constructed. The relationship between the translation and modulation operations can be established through the Fourier transform: for any signal, the Fourier transform of its translation is the modulation of its Fourier transform, so the modulation of any signal can be defined as the inverse Fourier transform of the translation of its Fourier transform. Analogously, the graph modulation of any graph signal can be defined as the inverse graph Fourier transform of the translation of its graph Fourier transform. This novel definition of the graph modulation operator establishes a relationship between the translation and modulation operations. The new modulation operation and the original translation operation are applied to construct a new framework for graph signal time-frequency analysis, and a windowed graph Fourier frame theory is developed. Necessary and sufficient conditions for constructing windowed graph Fourier frames, tight frames, and dual frames are presented. The novel time-frequency analysis framework is applied to signals defined on well-known graphs, e.g., the Minnesota road graph and random graphs. Experimental results show that the novel framework captures new features of graph signals.
Keywords: graph signals, windowed graph Fourier transform, windowed graph Fourier frames, vertex frequency analysis
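The proposed construction follows directly from the Laplacian eigendecomposition; a numpy sketch of the graph Fourier transform, the generalized translation, and the modulation defined as the inverse GFT of the translated spectrum, on an assumed small path graph:

```python
import numpy as np

# Small path graph: Laplacian L = D - A and its eigendecomposition
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)  # 5-node path
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)  # columns of U are the Laplacian eigenvectors
N = L.shape[0]

gft = lambda f: U.T @ f   # graph Fourier transform
igft = lambda c: U @ c    # inverse graph Fourier transform

def translate(f, i):
    """Generalized translation to vertex i via the Laplacian eigenvectors."""
    return np.sqrt(N) * igft(gft(f) * U[i, :])

def modulate(f, k):
    """Proposed modulation: inverse GFT of the translated GFT of f."""
    return igft(translate(gft(f), k))

f = np.random.rand(N)
print(modulate(f, 2))
```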
Procedia PDF Downloads 342
26691 Construction of Finite Woven Frames through Bounded Linear Operators
Authors: A. Bhandari, S. Mukherjee
Abstract:
Two frames in a Hilbert space are called woven or weaving if all possible merge combinations between them generate frames of the Hilbert space with uniform frame bounds. Weaving frames are powerful tools in wireless sensor networks that require distributed data processing. Considering the practical applications, this article deals with finite woven frames. We provide methods of constructing finite woven frames; in particular, bounded linear operators are used to construct woven frames from a given frame. Several examples are discussed. We also introduce the notion of woven frame sequences and characterize them through the concepts of gaps and angles between spaces.
Keywords: frames, woven frames, gap, angle
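For reference, the weaving condition can be stated explicitly; the standard definition, with universal bounds holding over every partition:

```latex
% Frames \{\varphi_i\}_{i\in I} and \{\psi_i\}_{i\in I} of a Hilbert space H
% are woven if there exist universal bounds 0 < A \le B such that for every
% subset \sigma \subseteq I and all f \in H:
A\,\|f\|^2 \;\le\; \sum_{i\in\sigma} |\langle f,\varphi_i\rangle|^2
              + \sum_{i\in\sigma^{c}} |\langle f,\psi_i\rangle|^2
          \;\le\; B\,\|f\|^2 .
```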
Procedia PDF Downloads 195
26690 Multi-scale Spatial and Unified Temporal Feature-fusion Network for Multivariate Time Series Anomaly Detection
Authors: Hang Yang, Jichao Li, Kewei Yang, Tianyang Lei
Abstract:
Multivariate time series anomaly detection is a significant research topic in the field of data mining, with a wide range of applications across industrial sectors such as road traffic, financial logistics, and corporate production. The inherent spatial dependencies and temporal characteristics of multivariate time series make the anomaly detection task challenging. Previous studies have typically been based on the assumption that all variables belong to the same spatial hierarchy, neglecting multi-level spatial relationships. To address this challenge, this paper proposes a multi-scale spatial and unified temporal feature-fusion network, denoted MSUT-Net, for multivariate time series anomaly detection. The proposed model employs a multi-level modeling approach, incorporating both temporal and spatial modules. The spatial module is designed to capture the spatial characteristics of multivariate time series data, utilizing an adaptive graph structure learning model to identify the multi-level spatial relationships between data variables and their attributes. The temporal module consists of a unified temporal processing module tasked with capturing the temporal features of multivariate time series and capable of simultaneously identifying temporal dependencies among different variables. Extensive testing on multiple publicly available datasets confirms that MSUT-Net achieves superior performance on the majority of them. Our method is able to model and accurately detect system data with multi-level spatial relationships from a spatial-temporal perspective, providing a novel perspective for anomaly detection analysis.
Keywords: data mining, industrial system, multivariate time series, anomaly detection
Procedia PDF Downloads 17
26689 Design and Implementation of Collaborative Editing System Based on Physical Simulation Engine Running State
Authors: Zhang Songning, Guan Zheng, Ci Yan, Ding Gangyi
Abstract:
The application of physical simulation engines in collaborative editing systems has an important background and role. First, physical simulation engines can provide real-world physical simulations, enabling users to interact and collaborate in real time in virtual environments. This provides a more intuitive and immersive experience for collaborative editing systems, allowing users to more accurately perceive and understand the various elements and operations in collaborative editing. Second, through physical simulation engines, different users can share a virtual space and perform real-time collaborative editing within it. This real-time sharing and collaborative editing method helps to synchronize information among team members and improve the efficiency of collaborative work. In experiments, the average single-user model transmission speed in the collaborative editing system increased by 141.91%, the average single-user model processing speed increased by 134.2%, the average single-user processing flow rate increased by 175.19%, and the overall single-user efficiency improved by 150.43%. As the number of users increases, the overall efficiency remains stable, and the physical-simulation-engine-based collaborative editing system also exhibits horizontal scalability. It is not difficult to see that the design and implementation of a collaborative editing system based on physical simulation engines not only enriches the user experience but also optimizes the effectiveness of team collaboration, providing new possibilities for collaborative work.
Keywords: physics engine, simulation technology, collaborative editing, system design, data transmission
Procedia PDF Downloads 88
26688 Possibility of Creating Polygon Layers from Raster Layers Obtained by Using Classic Image Processing Software: Case of Geological Map of Rwanda
Authors: Louis Nahimana
Abstract:
Most maps are in raster or pdf format, and it is not easy to obtain vector layers of published maps. Faced with producing a simplified geological map of the northern Lake Tanganyika countries without geological information in vector format, I tried a method of obtaining vector layers from raster layers created from geological maps of Rwanda and DR Congo in pdf and jpg format. The procedure was as follows. The original raster maps were georeferenced using ArcGIS 10.2. In Adobe Photoshop, map areas with the same color, corresponding to a lithostratigraphic unit, were selected all over the map and saved in a specific raster layer. Using the same image processing software, each RGB raster layer was converted to grayscale and improved before importation into ArcGIS 10. After georeferencing, each lithostratigraphic raster layer was transformed into a multitude of polygons with the tool "Raster to Polygon (Conversion)". Thereafter, the tool "Aggregate Polygons (Cartography)" allowed a single polygon layer to be obtained. Repeating the same steps for each color corresponding to a homogeneous rock unit, it was possible to reconstruct the simplified geological constitution of Rwanda and the Democratic Republic of Congo in vector format. Using the tool "Append (Management)", the vector layers obtained were combined with those from Burundi to achieve vector layers of the geology of the northern Lake Tanganyika countries.
Keywords: creating raster layer under image processing software, raster to polygon, aggregate polygons, adobe photoshop
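The named ArcGIS tools map onto the arcpy scripting interface; a sketch with hypothetical paths and layer names (parameters abbreviated, and exact signatures may vary across ArcGIS versions):

```python
import arcpy

arcpy.env.workspace = r"C:\geology\rwanda.gdb"  # hypothetical workspace

# "Raster to Polygon (Conversion)": one polygon per contiguous raster zone
arcpy.conversion.RasterToPolygon("unit_granite.tif", "granite_raw",
                                 "NO_SIMPLIFY", "VALUE")

# "Aggregate Polygons (Cartography)": merge the multitude into a single layer
arcpy.cartography.AggregatePolygons("granite_raw", "granite_unit", "50 Meters")

# "Append (Management)": combine with the layers obtained for Burundi
arcpy.management.Append(["granite_unit"], "tanganyika_geology", "NO_TEST")
```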
Procedia PDF Downloads 444
26687 Semantic Data Schema Recognition
Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia
Abstract:
The subject covered in this paper aims at assisting the user in a data quality approach. The goal is to better extract, mix, interpret, and reuse data. The paper deals with the semantic schema recognition of a data source, which enables the extraction of data semantics from all the available information, including the data and the metadata. Firstly, it consists of categorizing the data by assigning it to a category, and possibly a sub-category; secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and alternatives for correcting data. This approach allows the automatic detection of a large number of syntactic and semantic anomalies.
Keywords: schema recognition, semantic data profiling, meta-categorisation, semantic dependencies inter columns
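A minimal sketch of the first step described, assigning each column a category from its values, using assumed regular-expression detectors:

```python
import re

# Assumed detectors: category -> value pattern
PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[a-z]{2,}$", re.I),
    "phone": re.compile(r"^\+?[\d\s\-]{7,15}$"),
    "date":  re.compile(r"^\d{4}-\d{2}-\d{2}$"),
}

def categorize(column_values):
    """Assign the category whose pattern matches the most values."""
    best, score = "unknown", 0
    for cat, pat in PATTERNS.items():
        hits = sum(bool(pat.match(v)) for v in column_values)
        if hits > score:
            best, score = cat, hits
    return best

print(categorize(["a@b.com", "c@d.org", "not-an-email"]))  # email
```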
Procedia PDF Downloads 418
26686 Knowledge Management to Develop the Graduate Study Programs
Authors: Chuen-arom Janthimachai-amorn, Chirawadee Harnrittha
Abstract:
This study aims to identify the factors facilitating knowledge management in developing the graduate study programs to achieve success, and to identify approaches to developing the graduate study programs at Rajbhat Suansunantha University. The 10 respondents were the administrators, the faculty, and the personnel of its Graduate School. The research methodology was based on the Pla-too Model of the Knowledge Management Institute (KMI), covering knowledge indicators, knowledge creation and search, knowledge systematization, knowledge processing and filtering, knowledge access, knowledge sharing and exchange, and learning. The results revealed that the major success factors were knowledge indicators, evident knowledge management planning, knowledge exchange, strong solidarity of the team, and systematic and tenacious access to knowledge. The approaches allowing the researchers to critically develop the graduate study programs were analyses of environmental data, local needs, and general situations; data analyses of the previous programs; cost analyses of the resources; and the identification of the structure and the purposes for developing the new programs.
Keywords: program development, knowledge management, graduate study programs, Rajbhat Suansunantha University
Procedia PDF Downloads 308
26685 Parallel Version of Reinhard's Color Transfer Algorithm
Authors: Abhishek Bhardwaj, Manish Kumar Bajpai
Abstract:
An image, with its content and scheme of colors, presents an effective mode of information sharing and processing. By changing its color scheme, different visions and prospects are discovered by users. This phenomenon of color transfer is used by social media and other entertainment channels. Reinhard et al.'s algorithm was the first to solve this problem of color transfer. In this paper, we make the algorithm efficient by introducing domain parallelism among different processors. We also comment on the factors that affect the speedup of this problem. Finally, by analyzing the experimental data, we propose a novel and efficient parallel Reinhard algorithm.
Keywords: Reinhard et al.'s algorithm, color transferring, parallelism, speedup
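Reinhard et al.'s core step is statistics matching: each channel of the target image is shifted and scaled so its mean and standard deviation match the source's. The sketch below simplifies to per-channel RGB (the original operates in the decorrelated lαβ space); the per-pixel independence of the operation is what makes the domain parallelism of this paper natural:

```python
import numpy as np

def reinhard_transfer(source, target):
    """Match target's per-channel mean/std to source's (simplified: RGB
    instead of the l-alpha-beta space used by the original algorithm)."""
    out = np.empty_like(target, dtype=float)
    for c in range(3):  # each channel (and each pixel) is independent,
        s, t = source[..., c], target[..., c]  # hence easy to parallelize
        out[..., c] = (t - t.mean()) / (t.std() + 1e-8) * s.std() + s.mean()
    return np.clip(out, 0, 255).astype(np.uint8)

src = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
tgt = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
print(reinhard_transfer(src, tgt).shape)  # (64, 64, 3)
```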
Procedia PDF Downloads 614
26684 Process Optimization for Albanian Crude Oil Characterization
Authors: Xhaklina Cani, Ilirjan Malollari, Ismet Beqiraj, Lorina Lici
Abstract:
Oil characterization is an essential step in the design, simulation, and optimization of refining facilities. To make optimal crude selection and processing decisions, a refiner must have exact information regarding crude oil quality. This includes the crude oil TBP curve, the main data for the correct operation of refinery crude oil atmospheric distillation plants. Crude oil is typically characterized based on a distillation assay. This procedure is reasonably well defined and is based on representing the mixture of actual components that boil within a boiling point interval by hypothetical components that boil at the average boiling temperature of the interval. The crude oil assay typically includes TBP distillation according to ASTM D-2892, which can characterize the part of the oil that boils up to a 400 °C atmospheric equivalent boiling point. Modeling the yield curves obtained by physical distillation requires comparing the differences between the modelled and the experimental data. Most commercial simulators use a different number of components and pseudo-components to represent crude oil. Laboratory tests include distillations, vapor pressures, flash points, pour points, cetane numbers, octane numbers, densities, and viscosities. The aim of the study is to draw true boiling point curves for different crude oil resources in Albania and to compare the differences between the modelled and the experimental data for optimal characterization of crude oil.
Keywords: TBP distillation curves, crude oil, optimization, simulation
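A minimal sketch of turning assay cut-point data into a continuous TBP curve by interpolation, with hypothetical ASTM D-2892 cut data:

```python
import numpy as np

# Hypothetical ASTM D-2892 assay: cumulative volume % distilled vs. temperature
vol_pct = np.array([0, 10, 30, 50, 70, 90])      # cumulative yield, vol %
temp_c = np.array([36, 95, 175, 260, 340, 398])  # true boiling point, deg C

def tbp(v):
    """Interpolated TBP temperature at cumulative yield v (vol %)."""
    return np.interp(v, vol_pct, temp_c)

# Mid-boiling points of 20 vol % pseudo-component cuts for simulation input
cuts = np.arange(10, 90, 20)
print({f"{c} vol %": round(float(tbp(c)), 1) for c in cuts})
```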
Procedia PDF Downloads 305