Search results for: heterogeneous massive data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25138

25048 Performing Diagnosis in Building with Partially Valid Heterogeneous Tests

Authors: Houda Najeh, Mahendra Pratap Singh, Stéphane Ploix, Antoine Caucheteux, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

Building systems are highly vulnerable to many kinds of faults and human misbehaviour. Abnormalities in building operation directly affect energy efficiency and user comfort. Available fault diagnosis tools and methodologies rely mainly on rules or pure model-based approaches, and they assume that a model- or rule-based test can be applied to any situation without taking the actual testing context into account. Contextual tests with a validity domain could greatly simplify the design of detection tests. The main objective of this paper is to take test validity into account when applying the test model, considering non-modeled events such as occupancy, weather conditions, and door and window openings, and integrating expert knowledge of the state of the system. The concept of heterogeneous tests is combined with test validity to generate fault diagnoses. A combination of rule-, range- and model-based tests, known as heterogeneous tests, is proposed to reduce the modeling complexity. The calculation of logical diagnoses, drawing on artificial intelligence, provides a global explanation consistent with the test results. An application example, an office setting at the Grenoble Institute of Technology, shows the efficiency of the proposed technique.

Keywords: heterogeneous tests, validity, building system, sensor grids, sensor fault, diagnosis, fault detection and isolation

Procedia PDF Downloads 268
25047 Effective Infection Control Measures to Prevent Transmission of Multi-Drug Resistant Organisms from Burn Transfer Cases in a Regional Burn Centre

Authors: Si Jack Chong, Chew Theng Yap, Wan Loong James Mok

Abstract:

Introduction: Regional burn centres face the spectre of multi-drug resistant organisms (MDRO) introduced by transfer patients resident in MDRO-endemic countries. MDRO can cause severe nosocomial infection, which in massive burn patients leads to greater morbidity and mortality and strains the institution financially. We aim to highlight 4 key measures that have effectively prevented transmission of imported MDRO. Methods: A case of Candida auris (C. auris) in a massive burn patient transferred from an MDRO-endemic country is used to illustrate the measures. C. auris is a globally emerging multi-drug resistant fungal pathogen causing nosocomial transmission. Results: The infection control measures used to mitigate the risk of an outbreak from transfer cases are: (1) a multidisciplinary team approach involving Infection Control and Infectious Disease specialists early to ensure appropriate antibiotic use and implementation of barrier measures, (2) aseptic procedures for dressing changes with strict isolation and donning of personal protective equipment in the ward, (3) early screening of massive burn patients from MDRO-endemic regions, and (4) hydrogen peroxide vaporization terminal cleaning for operating theatres and rooms. Conclusion: Given the prevalence of air travel and international transfers to regional burn centres, effective infection control measures are needed to reduce the risk of transmission from imported massive burn patients. In our centre, we have effectively implemented 4 measures which have reduced the risk of local contamination. We share a recent case report to illustrate the successful management of a potential MDRO outbreak resulting from the transfer of a massive burn patient resident in an MDRO-endemic area.

Keywords: burns, burn unit, cross infection, infection control

Procedia PDF Downloads 128
25046 The Potential Impact of Big Data Analytics on Pharmaceutical Supply Chain Management

Authors: Maryam Ziaee, Himanshu Shee, Amrik Sohal

Abstract:

Big Data Analytics (BDA) in supply chain management has recently drawn the attention of academics and practitioners. Big data refers to a massive amount of data from different sources, in different formats, generated at high speed through transactions in business environments and supply chain networks. Traditional statistical tools and techniques struggle to analyse such massive data, whereas BDA can assist organisations to capture, store, and analyse it, specifically in the field of supply chain. Currently, there is a paucity of research on BDA in the pharmaceutical supply chain context. In this research, the Australian pharmaceutical supply chain was selected as the case study. This industry is highly significant, since the right medicine must reach the right patients, at the right time, in the right quantity, in good condition, and at the right price to save lives. However, drug shortages remain a substantial problem for hospitals across Australia, with implications for patient care, staff resourcing, and expenditure. Furthermore, a massive volume and variety of data is generated at high speed from multiple sources in the pharmaceutical supply chain, and it needs to be captured and analysed to benefit operational decisions at every stage of the supply chain. As the pharmaceutical industry lags behind other industries in using BDA, this raises the question of whether BDA can improve transparency across the pharmaceutical supply chain by enabling partners to make informed decisions in their operational activities. This presentation explores the impacts of BDA on supply chain management. An exploratory qualitative approach was adopted to analyse data collected through interviews. The study also explores the potential of BDA across the whole pharmaceutical supply chain rather than focusing on a single entity.
Twenty semi-structured interviews were undertaken with top managers in fifteen organisations (five pharmaceutical manufacturers, five wholesalers/distributors, and five public hospital pharmacies) to investigate their views on the use of BDA. The findings revealed that BDA can give pharmaceutical entities improved visibility over the whole supply chain and the market; it enables entities, especially manufacturers, to monitor consumption and demand rates in real time and make accurate demand forecasts, which reduces drug shortages. Timely and precise decision-making allows entities to source and manage their stocks more effectively, address drug demand at hospitals, and respond to unanticipated issues such as drug shortages. Earlier studies explored BDA in the context of clinical healthcare; this presentation instead investigates the benefits of BDA in the Australian pharmaceutical supply chain. Furthermore, this research enhances managers’ insight into the potential of BDA at every stage of the supply chain and helps to improve decision-making in their supply chain operations. The findings will turn the rhetoric of data-driven decision-making into a reality in which managers opt for analytics for improved decision-making in supply chain processes.

Keywords: big data analytics, data-driven decision, pharmaceutical industry, supply chain management

Procedia PDF Downloads 85
25045 Innovating and Disrupting Higher Education: The Evolution of Massive Open Online Courses

Authors: Nabil Sultan

Abstract:

A great deal has been written on Massive Open Online Courses (MOOCs) since 2012 (considered by some to be the year of the MOOC). The emergence of MOOCs generated great interest among academics and technology experts as well as ordinary people. Some of the authors who wrote on MOOCs perceived them as the next big thing that would disrupt education; others saw them as another fad that would go away once it had run its course (as most fads do). But MOOCs did not turn out to be a fad, and they are still around. Most importantly, they have evolved into something that is beginning to look like a viable business model. This paper explores this phenomenon within the theoretical frameworks of disruptive innovation and jobs to be done, as developed by Clayton Christensen and his colleagues, and its implications for the future of higher education (HE).

Keywords: MOOCs, disruptive innovations, higher education, jobs theory

Procedia PDF Downloads 245
25044 Heterogeneous Photocatalytic Degradation of Methylene Blue by Montmorillonite/CuxCd1-xS Nanomaterials

Authors: Horiya Boukhatem, Lila Djouadi, Hussein Khalaf, Rufino Manuel Navarro Yerga, Fernando Vaquero Gonzalez

Abstract:

Heterogeneous photocatalysis is an alternative method for the removal of organic pollutants from water. The photoexcitation of a semiconductor under ultraviolet (UV) irradiation produces hydroxyl radicals, among the most oxidative chemical species. The objective of this study is the synthesis of nanomaterials based on montmorillonite and CuxCd1-xS with different Cu concentrations (0.3 < x < 0.7) and their application to the photocatalysis of a cationic dye, methylene blue. The synthesized nanomaterials and the montmorillonite were characterized by Fourier transform infrared spectroscopy (FTIR). Photocatalysis tests on methylene blue under UV-visible irradiation show that the photoactivity of the montmorillonite/CuxCd1-xS nanomaterials increases with increasing Cu concentration and is significantly higher than that of sodium montmorillonite alone. Applying the Langmuir-Hinshelwood (L-H) kinetic model to the photocatalytic test results showed that the reaction rate obeys first-order kinetics.
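
At low substrate concentration, the Langmuir-Hinshelwood model reduces to apparent first-order kinetics, ln(C0/C) = k_app·t, so the rate constant can be read off as the slope of a linear fit. A minimal sketch of that fit, using synthetic (hypothetical) concentration data rather than the authors' measurements:

```python
import numpy as np

# Hypothetical dye-concentration data C(t) generated with k = 0.05 1/min;
# the fit should recover that apparent first-order rate constant.
t = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 60.0])   # irradiation time, min
C = 10.0 * np.exp(-0.05 * t)                        # dye concentration, mg/L

y = np.log(C[0] / C)                                # ln(C0/C)
k_app, intercept = np.polyfit(t, y, 1)              # slope = k_app (1/min)
print(round(k_app, 3))                              # → 0.05
```

A straight line through ln(C0/C) versus t (intercept near zero) is the usual check that the first-order reduction of the L-H model holds.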

Keywords: heterogeneous photocatalysis, methylene blue, montmorillonite, nanomaterials

Procedia PDF Downloads 319
25043 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of a high volume and broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods to develop their business. Data management environments have therefore become increasingly decentralized, relying on a distributed computing paradigm. While data are stored in highly distributed systems, implementing distributed data-mining techniques is a challenge: the aim of such techniques is to gather knowledge from every domain and from all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 403
25042 Algorithms for Run-Time Task Mapping in NoC-Based Heterogeneous MPSoCs

Authors: M. K. Benhaoua, A. K. Singh, A. E. Benyamina, P. Boulet

Abstract:

Mapping the parallelized tasks of applications onto multiprocessor systems-on-chip (MPSoCs) can be done either at design time (static) or at run-time (dynamic). Static mapping strategies find the best placement of tasks at design time; hence, they are not suitable for dynamic workloads and are incapable of run-time resource management. The number of tasks or applications executing on an MPSoC platform can exceed the available resources, requiring efficient run-time mapping strategies to meet these constraints. This paper describes a new spiral dynamic task mapping heuristic for mapping applications onto NoC-based heterogeneous MPSoCs. The heuristic is based on a packing strategy and a routing algorithm, both also proposed in this paper. It tries to map the tasks of an application into a clustered region to reduce the communication overhead between communicating tasks: the tasks that are most related to each other are mapped in a spiral manner, and the best possible path load is sought to minimize the communication overhead. We have built a simulation environment for experimental evaluation, mapping applications with varying numbers of tasks onto an 8x8 NoC-based heterogeneous MPSoC platform. We demonstrate that the new mapping heuristic, together with the proposed modified Dijkstra routing algorithm, reduces the total execution time and energy consumption of applications when compared to state-of-the-art run-time mapping heuristics reported in the literature.
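
One plausible reading of "spiral" placement is to enumerate mesh tiles in rings of increasing distance around a starting tile, so that tightly communicating tasks land on nearby tiles. The sketch below illustrates that idea only; it is not the authors' exact heuristic, and the ring ordering is an assumption:

```python
def spiral_order(start, size=8):
    """Enumerate (x, y) tiles of a size x size NoC mesh in rings of
    increasing Chebyshev distance around `start` -- a simple stand-in
    for spiral task placement (not the paper's exact algorithm)."""
    sx, sy = start
    tiles = [(x, y) for x in range(size) for y in range(size)]
    # Sort by ring (Chebyshev distance from the start tile), then by
    # coordinates to fix an order within each ring.
    return sorted(tiles, key=lambda t: (max(abs(t[0] - sx), abs(t[1] - sy)),
                                        t[0], t[1]))

# Tasks taken in communication order would be assigned to these tiles:
order = spiral_order((3, 3))
print(order[0])    # → (3, 3): the first task occupies the start tile
```

Placing the most related tasks earliest in this order keeps their hop distance, and hence communication overhead, small.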

Keywords: multiprocessor system on chip, MPSoC, network on chip, NoC, heterogeneous architectures, run-time mapping heuristics, routing algorithm

Procedia PDF Downloads 466
25041 Finding Bicluster on Gene Expression Data of Lymphoma Based on Singular Value Decomposition and Hierarchical Clustering

Authors: Alhadi Bustaman, Soeganda Formalidin, Titin Siswantining

Abstract:

DNA microarray technology is used to analyze thousands of gene expression values simultaneously and is a very important tool for drug development and testing, function annotation, and cancer diagnosis. Various clustering methods have been used for analyzing gene expression data. However, when analyzing very large and heterogeneous collections of gene expression data, conventional clustering methods often cannot produce a satisfactory solution. Biclustering algorithms have been used as an alternative approach to identifying structures in gene expression data. In this paper, we introduce a transform technique based on singular value decomposition to obtain a normalized matrix of gene expression data, followed by the Mixed-Clustering algorithm and the Lift algorithm, inspired by the node-deletion and node-addition phases proposed by Cheng and Church, based on Agglomerative Hierarchical Clustering (AHC). An experimental study on standard datasets demonstrated the effectiveness of the algorithm on gene expression data.
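
A minimal sketch of the SVD step: project a gene-expression matrix onto its top-k singular vectors, yielding a low-rank "normalized" matrix to feed into later clustering. The data and the choice k = 3 are hypothetical; the paper's actual transform may differ:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 10))    # toy matrix: 20 genes x 10 conditions

# Singular value decomposition of the expression matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Keep only the k strongest singular components as the normalized matrix.
k = 3
X_norm = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

print(np.linalg.matrix_rank(X_norm))    # → 3
```

The rank-k reconstruction suppresses weak components, so downstream biclustering works on the dominant expression patterns rather than noise.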

Keywords: agglomerative hierarchical clustering (AHC), biclustering, gene expression data, lymphoma, singular value decomposition (SVD)

Procedia PDF Downloads 258
25040 Tracing Digital Traces of Phatic Communion in #Mooc

Authors: Judith Enriquez-Gibson

Abstract:

This paper meddles with the notion of phatic communion, introduced 90 years ago by Malinowski, a Polish-born British anthropologist. It explores the phatic in Twitter within the contents of tweets related to moocs (massive open online courses) as a topic or trend. It is not about moocs, though. It is about practices that could easily be hidden or neglected if we let big or massive topics take the lead, or if we simply follow the computational or secret codes behind Twitter itself and third-party software analytics. It draws on media and cultural studies. Though at first it appears data-driven, as I submitted data collection and analytics into the hands of a third-party tool, Twitonomy, the aim is to follow how phatic communion might be practised on a social media site such as Twitter. Lurking becomes its research method for analysing mooc-related tweets. A total of 3,000 tweets were collected on 11 October 2013 (UK timezone). The emphasis of lurking is to engage with Twitter as a system of connectivity. One interesting finding is that a click is in fact a phatic practice. A click breaks the silence. A click on one of the mooc websites is actually a tweet: a tweet posted on behalf of a user who simply chose to click, without formulating the text and perhaps without knowing that it contains #mooc. Surely, this mechanism is not about reciprocity. To break the silence, users did not use words; they just clicked the ‘tweet button’ on a mooc website. A click performs and maintains connectivity, with Twitter as the medium in attendance in our everyday, available when needed to be of service. In conclusion, the phatic culture of breaking silence on Twitter does not have to submit to the power of code and analytics. It is a matter of human code.

Keywords: click, Twitter, phatic communion, social media data, mooc

Procedia PDF Downloads 389
25039 Optimization of Biodiesel Production from Palm Oil over Mg-Al Modified K-10 Clay Catalyst

Authors: Muhammad Ayoub, Abrar Inayat, Bhajan Lal, Sintayehu Mekuria Hailegiorgis

Abstract:

Biodiesel, which comes from purely renewable resources, provides an alternative fuel option for the future, given limited fossil fuel resources as well as environmental concerns. The transesterification of vegetable oils for biodiesel production is a promising process for overcoming this future energy crisis. The use of heterogeneous catalysts greatly simplifies the technological process by facilitating the separation of the post-reaction mixture. The purpose of the present work was to examine a heterogeneous catalyst, in particular Mg-Al modified K-10 clay, to produce methyl esters of palm oil. The prepared catalyst was well characterized by several modern techniques. In this study, the transesterification of palm oil with methanol was studied in a heterogeneous system in the presence of Mg-Al modified K-10 clay as a solid base catalyst, and the results were then optimized with the help of Design of Experiments software. The results showed that methanol is the best alcohol for these reaction conditions. The maximum conversion of triglyceride (88%) was achieved after 8 h of reaction at 60 °C, with a 6:1 molar ratio of methanol to palm oil and 3 wt% of the prepared catalyst.

Keywords: palm oil, transesterification, clay, biodiesel, mesoporous clay, K-10

Procedia PDF Downloads 370
25038 Nazca: A Context-Based Matching Method for Searching Heterogeneous Structures

Authors: Karine B. de Oliveira, Carina F. Dorneles

Abstract:

Structure-level matching is the problem of combining the elements of a structure, which can be represented as entities, classes, XML elements, web forms, and so on. This is challenging due to the large number of distinct representations of semantically similar structures. This paper describes a structure-based matching method applied to searching for different representations in data sources, considering the similarity between the elements of two structures and the data source context. Using real data sources, we conducted an experimental study comparing our approach with our baseline implementation and with another important schema matching approach. We demonstrate that our proposal reaches higher precision than the baseline.
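
Element-level similarity is one building block of such a matcher. As a hypothetical stand-in (the paper does not specify its similarity function), the sketch below pairs elements of two structures by token-set Jaccard similarity on their labels; the function names, labels, and threshold are all invented for illustration:

```python
def jaccard(a, b):
    """Token-set Jaccard similarity between two element labels."""
    ta, tb = set(a.lower().split('_')), set(b.lower().split('_'))
    return len(ta & tb) / len(ta | tb)

def match_elements(struct_a, struct_b, threshold=0.5):
    """Greedily pair each element of struct_a with its most similar
    element of struct_b, keeping only pairs above the threshold."""
    pairs = []
    for ea in struct_a:
        best = max(struct_b, key=lambda eb: jaccard(ea, eb))
        if jaccard(ea, best) >= threshold:
            pairs.append((ea, best))
    return pairs

matches = match_elements(["customer_name", "zip_code"],
                         ["name", "postal_code"])
print(matches)    # → [('customer_name', 'name')]
```

Note how "zip_code" and "postal_code" are semantically similar yet fall below a purely lexical threshold; this is exactly the gap that the paper's contextual information is meant to close.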

Keywords: context, data source, index, matching, search, similarity, structure

Procedia PDF Downloads 338
25037 Multi-Perspective Learning in a Real Production Plant Using Experiential Learning in Heterogeneous Groups to Develop System Competencies for Production System Improvements

Authors: Marlies Achenbach

Abstract:

System competencies play a key role in ensuring effective and efficient improvement of production systems. Accordingly, there is an increasing demand for developing system competencies in industry as well as in engineering education. System competencies consist of two main abilities: evaluating the current state of a production system and developing a target state. The innovative course ‘multi-perspective learning in a real production plant (multi real)’ was developed to create a learning setting that supports the development of these system competencies. The setting combines two innovative aspects: first, the learning takes place in heterogeneous groups formed by students as well as professionals and managers from industry; second, the learning takes place in a real production plant. This paper presents the innovative didactic concept of ‘multi real’ in detail, which will initially be implemented in October/November 2016 in the industrial engineering, logistics and mechanical master’s program at TU Dortmund University.

Keywords: experiential learning, heterogeneous groups, improving production systems, system competencies

Procedia PDF Downloads 400
25036 Machine Learning-Enabled Classification of Climbing Using Small Data

Authors: Nicholas Milburn, Yu Liang, Dalei Wu

Abstract:

Athlete performance scoring within the climbing domain presents interesting challenges, as the sport does not have an objective way to assign skill. Assessing skill levels within any sport is valuable, as it can be used to mark progress while training and can help an athlete choose appropriate climbs to attempt. Machine learning-based methods are popular for complex problems like this. The available dataset was composed of dynamic force data recorded during climbing; however, it came with challenges such as data scarcity and imbalance, and it was temporally heterogeneous. The solutions investigated for these challenges include data augmentation, temporal normalization, conversion of the time series to the spectral domain, and cross-validation strategies. The solutions investigated for the classification problem itself included the lightweight classifiers KNN and SVM as well as deep learning with a CNN. The best performing model had 80% accuracy. In conclusion, there seems to be enough information within climbing force data to accurately categorize climbers by skill.
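
The spectral-domain conversion plus nearest-neighbour classification can be sketched as follows. The signals and "skill" labels are synthetic stand-ins for climbing force data, and the 1-NN rule is a simplification of the KNN classifier mentioned above:

```python
import numpy as np

def fft_features(x):
    """Convert a time series to spectral-domain features (FFT magnitudes)."""
    return np.abs(np.fft.rfft(x))

t = np.linspace(0, 1, 128, endpoint=False)
train = [np.sin(2 * np.pi * 3 * t),      # low-frequency force pattern
         np.sin(2 * np.pi * 12 * t)]     # high-frequency force pattern
labels = ["novice", "expert"]            # hypothetical class labels

# A noisy query signal resembling the second training pattern.
query = np.sin(2 * np.pi * 12 * t) \
        + 0.1 * np.random.default_rng(1).normal(size=128)

# 1-nearest-neighbour in the spectral feature space.
dists = [np.linalg.norm(fft_features(query) - fft_features(s)) for s in train]
predicted = labels[int(np.argmin(dists))]
print(predicted)    # → expert
```

Working in the spectral domain makes the comparison insensitive to where in time the force pattern occurs, which is one common motivation for this conversion on temporally heterogeneous data.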

Keywords: classification, climbing, data imbalance, data scarcity, machine learning, time sequence

Procedia PDF Downloads 124
25035 The Effect of Subsurface Dam on Saltwater Intrusion in Heterogeneous Coastal Aquifers

Authors: Antoifi Abdoulhalik, Ashraf Ahmed

Abstract:

Saltwater intrusion (SWI) in coastal aquifers has become a growing threat for many countries around the world. While various control measures have been suggested to mitigate SWI, the construction of subsurface physical barriers remains one of the most effective solutions to this problem. In this work, we used laboratory experiments and numerical simulations to investigate the effectiveness of subsurface dams in heterogeneous layered coastal aquifers with different layering patterns. Four cases were investigated: a homogeneous aquifer (case H) and three heterogeneous cases in which a low-permeability (K) layer was set in the top part of the system (case LH), the middle part (case HLH), or the bottom part (case HL). An automated image analysis technique was implemented to quantify the main SWI parameters at high spatial and temporal resolution. The method also provides transient salt concentration maps, allowing for the first time clear visualization of the spillage of saline water over the dam (advancing wedge condition) as well as the flushing of residual saline water from the freshwater area (receding wedge condition). The SEAWAT code was adopted for the numerical simulations. The results show that the presence of an overlying layer of low permeability enhanced the ability of the dam to retain the saline water. In such conditions, the rate of saline water spillage and inland extension may be considerably reduced. Conversely, the presence of an underlying low-K layer led to a faster increase of saltwater volume on the seaward side of the wall, thereby considerably facilitating the spillage. The results showed that a complete removal of the residual saline water eventually occurred in all the investigated scenarios, with a rate of removal strongly affected by the hydraulic conductivity of the lower part of the aquifer.
The data showed that the addition of the underlying low-K layer in case HL caused the complete flushing to take almost twice as long as in the homogeneous scenario.

Keywords: heterogeneous coastal aquifers, laboratory experiments, physical barriers, seawater intrusion control

Procedia PDF Downloads 224
25034 Generation of Automated Alarms for Plantwide Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

Early detection of incipient abnormal operation is quite necessary for plant-wide process management in order to improve product quality and process safety, and generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, and multivariate statistical approaches. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Unfortunately, such big data include measurement noise and unwanted variations unrelated to true process behavior. The elimination of these unnecessary patterns is therefore executed in the data processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that the detection speed and performance improved significantly, irrespective of the size and location of abnormal events.
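
The abstract does not specify how the unwanted variation is removed; as a hypothetical illustration of the data-processing step, the sketch below strips measurement noise from a simulated process variable with a simple moving average before any alarm logic is applied:

```python
import numpy as np

def denoise(signal, window=5):
    """Smooth a measurement series with a centered moving average,
    a generic stand-in for the paper's (unspecified) noise-removal step."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode='same')

rng = np.random.default_rng(42)
clean = np.linspace(0.0, 1.0, 200)           # slowly drifting true variable
noisy = clean + 0.1 * rng.normal(size=200)   # plus measurement noise

smoothed = denoise(noisy)

# Away from the edges, the smoothed series tracks the true signal better.
err_raw = np.mean((noisy[10:-10] - clean[10:-10]) ** 2)
err_smooth = np.mean((smoothed[10:-10] - clean[10:-10]) ** 2)
print(err_smooth < err_raw)    # → True
```

Alarm thresholds applied to the smoothed series then react to true process behaviour rather than sensor jitter, which is the rationale the abstract gives for the pre-processing step.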

Keywords: detection, monitoring, process data, noise

Procedia PDF Downloads 228
25033 Unified Structured Process for Health Analytics

Authors: Supunmali Ahangama, Danny Chiang Choon Poo

Abstract:

Health analytics (HA) is used in healthcare systems for effective decision-making, management, and planning of healthcare and related activities. However, user resistance, the unique nature of medical data content and structure (including heterogeneous and unstructured data), and impromptu HA projects have held up progress in HA applications. Notably, the accuracy of outcomes depends on the skills and domain knowledge of the data analyst working on the healthcare data. The success of HA depends on having a sound process model, effective project management, and the availability of supporting tools. Thus, to overcome these challenges through an effective process model, we propose an HA process model with features from the rational unified process (RUP) model and agile methodology.

Keywords: agile methodology, health analytics, unified process model, UML

Procedia PDF Downloads 484
25032 Joint Replenishment and Heterogeneous Vehicle Routing Problem with Cyclical Schedule

Authors: Ming-Jong Yao, Chin-Sum Shui, Chih-Han Wang

Abstract:

This paper is developed from a real-world decision scenario in which an industrial gas company that applies the Vendor Managed Inventory model supplies liquid oxygen, with a self-operated heterogeneous vehicle fleet, to hospitals in nearby cities. We name the problem the Joint Replenishment and Heterogeneous Vehicle Routing Problem with Cyclical Schedule and formulate it as a mixed-integer non-linear programming problem which simultaneously determines the length of the planning cycle (PC), the length of the replenishment cycle, the dates of replenishment for each customer, and the vehicle routes of each day within the PC, such that the average daily operating cost within the PC, including inventory holding cost, setup cost, transportation cost, and overtime labor cost, is minimized. A solution method based on a genetic algorithm, embedded with an encoding and decoding mechanism and local search operators, is then proposed, and a hash function is adopted to avoid repetitive fitness evaluation of identical solutions. Numerical experiments demonstrate that the proposed solution method can effectively solve the problem for different lengths of PC and numbers of customers. The method is also shown to be effective in determining whether the company should expand the storage capacity of a customer whose demand increases. Sensitivity analysis of the vehicle fleet composition shows that deploying a mixed fleet can reduce the daily operating cost.
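
The hash idea can be sketched as a cache keyed by the chromosome, so identical solutions encountered across generations are never re-evaluated. The fitness function below is a toy stand-in, not the paper's routing-cost model:

```python
evaluations = 0    # counts how many times the expensive fitness runs

def fitness(chromosome):
    """Toy objective standing in for the costly route/inventory evaluation."""
    global evaluations
    evaluations += 1
    return sum(chromosome)

cache = {}

def cached_fitness(chromosome):
    """Look up the chromosome's hash key before evaluating."""
    key = tuple(chromosome)          # hashable representation of the solution
    if key not in cache:
        cache[key] = fitness(chromosome)
    return cache[key]

# The third population member duplicates the first, so only 2 evaluations run.
pop = [[1, 2, 3], [4, 5, 6], [1, 2, 3]]
scores = [cached_fitness(c) for c in pop]
print(scores, evaluations)    # → [6, 15, 6] 2
```

Since genetic algorithms routinely regenerate previously seen solutions via crossover and mutation, this memoization can save a large share of the (expensive) route evaluations.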

Keywords: cyclic inventory routing problem, joint replenishment, heterogeneous vehicle, genetic algorithm

Procedia PDF Downloads 58
25031 Lithium Ion Supported on TiO2 Mixed Metal Oxides as a Heterogeneous Catalyst for Biodiesel Production from Canola Oil

Authors: Mariam Alsharifi, Hussein Znad, Ming Ang

Abstract:

Considering environmental issues and the shortage of conventional fossil fuel sources, biodiesel has emerged as a promising way to shift away from fossil-based fuel towards sustainable and renewable energy. It is synthesized by the transesterification of vegetable oils or animal fats with an alcohol (methanol or ethanol) in the presence of a catalyst. This study focuses on synthesizing a highly efficient Li/TiO2 heterogeneous catalyst for biodiesel production from canola oil. In this work, lithium was immobilized onto TiO2 by a simple impregnation method. The catalyst was evaluated by the transesterification reaction in a batch reactor under moderate reaction conditions. To study the effect of Li concentration, a series of LiNO3 concentrations (20, 30, 40 wt%) at different calcination temperatures (450, 600, 750 °C) were evaluated. The Li/TiO2 catalysts were characterized by several spectroscopic and analytical techniques such as XRD, FT-IR, BET, TG-DSC, and FESEM. The optimum values of lithium nitrate loading on TiO2 and calcination temperature were 30 wt% and 600 °C, respectively, giving a high conversion of 98%. The XRD study revealed that the insertion of Li improved the catalyst efficiency without any alteration in the structure of TiO2. The best performance of the catalyst was achieved when using a methanol-to-oil ratio of 24:1 and 5 wt% of catalyst loading at a 65 °C reaction temperature for 3 hours of reaction time. Moreover, the experimental kinetic data were compatible with the pseudo-first-order model, and the activation energy was 39.366 kJ/mol. The synthesized Li/TiO2 catalyst was also applied to transesterify used cooking oil and exhibited a 91.73% conversion. The prepared catalyst has shown high catalytic activity for producing biodiesel from fresh and used oil under mild reaction conditions.
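
An activation energy like the one reported is typically extracted from pseudo-first-order rate constants at two or more temperatures via the Arrhenius equation, Ea = R·ln(k2/k1)/(1/T1 - 1/T2). A worked sketch with hypothetical rate constants (not the paper's data):

```python
import math

# Hypothetical pseudo-first-order rate constants at two temperatures,
# chosen only to illustrate the two-point Arrhenius calculation.
R = 8.314                       # gas constant, J/(mol K)
T1, T2 = 308.15, 338.15         # 35 °C and 65 °C in kelvin
k1, k2 = 0.010, 0.040           # rate constants, 1/min (assumed values)

# Two-point Arrhenius form: ln(k2/k1) = -(Ea/R) * (1/T2 - 1/T1)
Ea = R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)   # J/mol

print(round(Ea / 1000, 1), "kJ/mol")
```

With more than two temperatures, the same quantity is usually obtained as the slope of ln(k) versus 1/T, which also gives an error estimate on Ea.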

Keywords: biodiesel, canola oil, environment, heterogeneous catalyst, impregnation method, renewable energy, transesterification

Procedia PDF Downloads 154
25030 Feasibility of Weakly Interacting Massive Particles as Dark Matter Candidates: Exploratory Study on The Possible Reasons for Lack of WIMP Detection

Authors: Sloka Bhushan

Abstract:

Dark matter constitutes a majority of the matter in the universe, yet very little is known about it due to its extreme lack of interaction with regular matter through the fundamental forces. Weakly Interacting Massive Particles, or WIMPs, have been proposed as one of the strongest candidates for dark matter due to their promising theoretical properties. However, various endeavors to detect these elusive particles have failed. This paper explores the various particles which may be WIMPs and the detection techniques being employed to find them (such as underground detectors, LHC experiments, and so on). There is a special focus on the reasons for the lack of detection of WIMPs so far, and on the possibility that limits in detection are a reason for the lack of physical evidence of the existence of WIMPs. This paper also explores possible inconsistencies within WIMP particle theory as a reason for the lack of physical detection, with a brief review of possible solutions and alternatives to these inconsistencies. Additionally, this paper reviews supersymmetry theory and the possibility of the supersymmetric neutralino (a possible WIMP particle) being detectable. Lastly, a review of alternative candidates for dark matter, such as axions and MACHOs, has been conducted. The explorative study in this paper is conducted through a series of literature reviews.

Keywords: dark matter, particle detection, supersymmetry, weakly interacting massive particles

Procedia PDF Downloads 112
25029 Effectiveness of Catalysis in Ozonation for the Removal of Herbicide 2,4-Dichlorophenoxyacetic Acid from Contaminated Water

Authors: S. Shanthi

Abstract:

Catalyzed oxidation processes show extraordinary promise for application in many areas of wastewater treatment. Advanced oxidation processes are an emerging technology that may be utilized for particular objectives in wastewater treatment. This research work provides a solution for the removal of 2,4-dichlorophenoxyacetic acid, a refractory organic compound and a common water pollutant. All studies were done in batch mode in a constantly stirred reactor. Alternative ozonation processes catalysed by transition metals or granular activated carbon were investigated for the degradation of organics. The catalytic ozonation processes under study are, first, homogeneous catalytic ozonation, which is based on ozone activation by transition metal ions present in aqueous solution, and, second, heterogeneous catalytic ozonation in the presence of granular activated carbon (GAC). The present studies reveal that heterogeneous catalytic ozonation using GAC favours the ozonation of 2,4-dichlorophenoxyacetic acid by increasing the rate of ozonation, and a much higher degradation of the substrate was obtained in a given time. By contrast, Fe2+ and Fe3+ ions decreased the rate of degradation of 2,4-dichlorophenoxyacetic acid, indicating that they act as negative catalysts. In the case of heterogeneous catalytic ozonation using the GAC catalyst, it was found that the solution concentration decreased significantly during the initial 5 minutes of contact as the pollutants were adsorbed; thereafter, the substrate started getting oxidized and ozonation became the dominant treatment process. The exhausted GAC was found to be regenerated in situ. The maximum percentage reduction of the substrate was achieved in the minimum time when the GAC catalyst was employed.

Keywords: ozonation, homogeneous catalysis, heterogeneous catalysis, granular activated carbon

Procedia PDF Downloads 229
25028 A Model Architecture Transformation with Approach by Modeling: From UML to Multidimensional Schemas of Data Warehouses

Authors: Ouzayr Rabhi, Ibtissam Arrassen

Abstract:

To provide a complete analysis of the organization and to help decision-making, leaders need relevant data; Data Warehouses (DW) are designed to meet such needs. However, designing a DW is not trivial, and there is no formal method to derive a multidimensional schema from heterogeneous databases. In this article, we present a Model-Driven approach to the design of data warehouses. We describe a multidimensional meta-model and specify a set of transformations starting from a Unified Modeling Language (UML) metamodel. In this approach, the UML metamodel and the multidimensional one are both considered platform-independent models (PIM). The first meta-model is mapped into the second through transformation rules carried out in the Query/View/Transformation (QVT) language. This proposal is validated through the application of our approach to generating a multidimensional schema for a Balanced Scorecard (BSC) DW. We are interested in the BSC perspectives, which are highly linked to the vision and the strategies of an organization.
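The paper's actual transformation rules are written in QVT; the toy Python sketch below only illustrates the kind of mapping such rules formalize, under the made-up heuristic that mostly-numeric UML classes become fact tables and the classes they reference become dimensions. All class names and attributes are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class UmlClass:
    name: str
    attributes: dict                                  # attribute -> type name
    associations: list = field(default_factory=list)  # names of related classes

def to_multidimensional(classes):
    """Map UmlClass objects to a {facts, dimensions} multidimensional schema."""
    schema = {"facts": {}, "dimensions": {}}
    by_name = {c.name: c for c in classes}
    for c in classes:
        numeric = [a for a, t in c.attributes.items() if t in ("int", "float")]
        # Heuristic stand-in for a QVT rule: mostly-numeric classes are facts.
        if numeric and len(numeric) >= len(c.attributes) / 2:
            schema["facts"][c.name] = {"measures": numeric,
                                       "dimensions": c.associations}
            for d in c.associations:
                schema["dimensions"][d] = list(by_name[d].attributes)
    return schema

sale = UmlClass("Sale", {"amount": "float", "quantity": "int", "date": "str"},
                ["Customer", "Product"])
customer = UmlClass("Customer", {"name": "str", "city": "str"})
product = UmlClass("Product", {"label": "str", "category": "str"})
print(to_multidimensional([sale, customer, product]))
```

Here "Sale" becomes a fact with measures "amount" and "quantity", while "Customer" and "Product" become its dimensions.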

Keywords: data warehouse, meta-model, model-driven architecture, transformation, UML

Procedia PDF Downloads 132
25027 A Review Paper on Data Security in Precision Agriculture Using Internet of Things

Authors: Tonderai Muchenje, Xolani Mkhwanazi

Abstract:

Precision agriculture uses a number of technologies, devices, protocols, and computing paradigms to optimize agricultural processes. Big data, artificial intelligence, cloud computing, and edge computing are all used to handle the huge amounts of data generated by precision agriculture. However, precision agriculture is still emerging and offers few security features. Furthermore, future solutions will demand data availability and accuracy as key points to help farmers, and security is important for building robust and efficient systems. Since precision agriculture comprises a wide variety and quantity of resources, its security must address issues such as compatibility, constrained resources, and massive data. Moreover, conventional protection schemes used on the traditional internet may not be useful for agricultural systems, creating extra demands and opportunities. Therefore, this paper aims at reviewing the state of the art of precision agriculture security, particularly in open-field agriculture, discussing its architecture, describing security issues, and presenting the major challenges and future directions.

Keywords: precision agriculture, security, IoT, EIDE

Procedia PDF Downloads 70
25026 Arthroscopic Superior Capsular Reconstruction Using the Long Head of the Biceps Tendon (LHBT)

Authors: Ho Sy Nam, Tang Ha Nam Anh

Abstract:

Background: Rotator cuff tears are a common problem in the aging population. The prevalence of massive rotator cuff tears varies across studies from 10% to 40%. Of irreparable rotator cuff tears (IRCTs), which are mostly associated with massive tear size, 79% are estimated to have recurrent tears after surgical repair. Recent studies have shown that superior capsule reconstruction (SCR) for massive rotator cuff tears can be an efficient technique, with optimistic clinical scores and preservation of stable glenohumeral stability. Superior capsule reconstruction techniques most commonly use either a fascia lata autograft or a dermal allograft, both of which have their own benefits and drawbacks (such as the potential for donor-site issues, allergic reactions, and high cost). We propose a simple technique for superior capsule reconstruction that uses the long head of the biceps tendon as a local autograft; therefore, the comorbidities related to graft harvesting are eliminated. The proximal portion of the long head of the biceps tendon is relocated to the footprint and secured as the SCR, serving both to stabilize the glenohumeral joint and to maintain vascular supply to aid healing. Objective: The purpose of this study is to assess the clinical outcomes of patients with large to massive RCTs treated by SCR using the LHBT. Materials and methods: A study was performed of consecutive patients with large to massive RCTs who were treated by SCR using the LHBT between January 2022 and December 2022. We used one double-loaded suture anchor to secure the long head of the biceps to the middle of the footprint. Two more anchors were used to repair the rotator cuff with a single-row technique, placed anteriorly and posteriorly on the lateral side of the previously transposed LHBT. Results: The 3 men and 5 women had an average age of 61.25 years (range, 48 to 76 years) at the time of surgery. The average follow-up was 8.2 months (6 to 10 months) after surgery. The average preoperative ASES score was 45.8, and the average postoperative ASES score was 85.83. The average postoperative UCLA score was 29.12. The VAS score improved from 5.9 to 1.12. The mean preoperative ROM of forward flexion and external rotation of the shoulder was 72° ± 16° and 28° ± 8°, respectively. The mean postoperative ROM of forward flexion and external rotation was 131° ± 22° and 63° ± 6°, respectively. There were no cases of progression of osteoarthritis or rotator cuff muscle atrophy. Conclusion: SCR using the LHBT is a treatment option for patients with large or massive RC tears. It can restore superior glenohumeral stability and shoulder joint function and can be an effective procedure for selected patients, helping to avoid progression to cuff tear arthropathy.

Keywords: superior capsule reconstruction, large or massive rotator cuff tears, the long head of the biceps, stabilize the glenohumeral joint

Procedia PDF Downloads 58
25025 New Two-Way Map-Reduce Join Algorithm: Hash Semi Join

Authors: Marwa Hussein Mohamed, Mohamed Helmy Khafagy, Samah Ahmed Senbel

Abstract:

MapReduce is a programming model used to handle and process massive data sets. With the rapid growth in data size, analyzing big data is one of the most important issues today. MapReduce analyzes data and extracts useful information through two simple functions, map and reduce, which are the only parts written by the programmer, while the framework provides load balancing, fault tolerance, and high scalability. One of the most important operations in data analysis is the join, but MapReduce does not support joins directly. This paper explains two two-way map-reduce join algorithms, semi-join and per-split semi-join, and proposes a new algorithm, hash semi-join, which uses a hash table to increase performance by eliminating unused records as early as possible and by performing the join through hash-table lookups rather than matching join keys in the map function of the second phase. Using hash tables does not materially affect memory size, because only the matched records from the second table are kept. Our experimental results show that the hash semi-join algorithm achieves higher performance than the two other algorithms as the data size grows from 10 million to 500 million records, with running time increasing according to the number of joined records between the two tables.
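The core idea of the hash semi-join described above can be sketched in a few lines: build a hash table of join keys from the smaller table, then stream the larger table and keep only matching records, so unused records are eliminated as early as possible. This is a single-machine illustration of the idea, not the paper's Hadoop implementation, and the table layouts are hypothetical.

```python
def hash_semi_join(small, large, key_small, key_large):
    """Return the records of `large` whose join key appears in `small`."""
    keys = {row[key_small] for row in small}   # hash table of keys only
    return [row for row in large if row[key_large] in keys]

customers = [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Bob"}]
orders = [{"cust_id": 1, "total": 30}, {"cust_id": 2, "total": 15},
          {"cust_id": 9, "total": 99}]

# Only orders whose customer exists survive; order cust_id=9 is dropped early.
print(hash_semi_join(customers, orders, "id", "cust_id"))
```

Because only the join keys of the small table are hashed (not whole records), the memory footprint stays small, which mirrors the paper's claim about memory size.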

Keywords: map reduce, hadoop, semi join, two way join

Procedia PDF Downloads 493
25024 Fuzzy Optimization Multi-Objective Clustering Ensemble Model for Multi-Source Data Analysis

Authors: C. B. Le, V. N. Pham

Abstract:

In modern data analysis, multi-source data appears more and more in real applications. Multi-source data clustering has emerged as an important issue in the data mining and machine learning community. Different data sources provide information about different aspects of the data; therefore, linking multi-source data is essential to improve clustering performance. However, in practice, multi-source data is often heterogeneous, uncertain, and large, which is considered a major challenge of multi-source data. Ensemble learning is a versatile machine learning model in which learning techniques can work in parallel on big data. Clustering ensembles have been shown to outperform any standard clustering algorithm in terms of accuracy and robustness. However, most traditional clustering ensemble approaches are based on a single-objective function and single-source data. This paper proposes a new clustering ensemble method for multi-source data analysis: the fuzzy optimized multi-objective clustering ensemble method, called FOMOCE. Firstly, a clustering ensemble mathematical model is introduced, based on the structure of the multi-objective clustering function, multi-source data, and dark knowledge. Then, rules for extracting dark knowledge from the input data, clustering algorithms, and base clusterings are designed and applied. Finally, a clustering ensemble algorithm is proposed for multi-source data analysis. The experiments were performed on standard sample data sets. The experimental results demonstrate the superior performance of the FOMOCE method compared to existing clustering ensemble methods and multi-source clustering methods.
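FOMOCE itself is not reproduced here; the sketch below only illustrates the generic clustering-ensemble idea it builds on: run several base clusterings, accumulate a co-association matrix (how often two points share a cluster), and derive a consensus partition from it. The base clusterer, data, and the simple connected-components consensus step are all stand-in assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two well-separated 2-D blobs stand in for one data source.
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
n = len(X)

def two_means(X, rng, n_iter=10):
    """A minimal 2-means base clusterer (random init + Lloyd updates)."""
    centers = X[rng.choice(len(X), 2, replace=False)]
    for _ in range(n_iter):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for k in range(2):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(0)
    return labels

# Ensemble step: accumulate a co-association matrix over base clusterings.
n_runs = 15
co = np.zeros((n, n))
for _ in range(n_runs):
    labels = two_means(X, rng)
    co += labels[:, None] == labels[None, :]
co /= n_runs  # co[i, j] = fraction of runs in which i and j co-cluster

# Consensus: connected components of the "co-clustered in a majority of
# runs" graph -- a simple stand-in for FOMOCE's optimization step.
consensus = -np.ones(n, dtype=int)
cluster_id = 0
for i in range(n):
    if consensus[i] == -1:
        stack = [i]
        while stack:
            j = stack.pop()
            if consensus[j] == -1:
                consensus[j] = cluster_id
                stack.extend(np.nonzero(co[j] > 0.5)[0].tolist())
        cluster_id += 1
print(cluster_id)  # number of consensus clusters found
```

Averaging co-memberships over many runs is what makes the consensus more robust than any single base clustering, which is the property the abstract appeals to.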

Keywords: clustering ensemble, multi-source, multi-objective, fuzzy clustering

Procedia PDF Downloads 159
25023 From Two-Way to Multi-Way: A Comparative Study of Map-Reduce Join Algorithms

Authors: Marwa Hussien Mohamed, Mohamed Helmy Khafagy

Abstract:

Map-Reduce is a programming model widely used to extract valuable information from enormous volumes of data, and it is designed to support heterogeneous datasets. Apache Hadoop MapReduce is used extensively to uncover hidden patterns in applications such as data mining and SQL processing. The most important operation for data analysis is the join, but the map-reduce framework does not directly support a join algorithm. This paper explains and compares two-way and multi-way map-reduce join algorithms; we also implement the MR join algorithms and report the performance of each phase. Our experimental results show that, among the two-way join algorithms, map-side join and map-merge join take the longest time, owing to the preprocessing step that sorts the data, while among the multi-way join algorithms, the reduce-side cascade join takes the longest time.
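For contrast with the map-side variants compared above, the snippet below is a toy single-machine simulation (no Hadoop) of a reduce-side two-way join: the map phase tags each record with its source table and join key, the shuffle groups by key, and the reduce phase pairs the two sides. Table contents are hypothetical.

```python
from collections import defaultdict

def reduce_side_join(left, right, key_left, key_right):
    # Map phase: emit (join key, (table tag, record)) pairs.
    mapped = [(r[key_left], ("L", r)) for r in left]
    mapped += [(r[key_right], ("R", r)) for r in right]

    # Shuffle phase: group the tagged records by join key.
    groups = defaultdict(list)
    for k, v in mapped:
        groups[k].append(v)

    # Reduce phase: cross-pair left and right records sharing a key.
    out = []
    for values in groups.values():
        lefts = [r for tag, r in values if tag == "L"]
        rights = [r for tag, r in values if tag == "R"]
        out += [{**l, **r} for l in lefts for r in rights]
    return out

emp = [{"dept": 1, "name": "Ada"}, {"dept": 2, "name": "Bob"}]
dept = [{"dept": 1, "dname": "R&D"}]
print(reduce_side_join(emp, dept, "dept", "dept"))
```

Note that every record of both tables is shuffled to the reducers regardless of whether it will match, which is exactly the overhead that semi-join-style algorithms try to avoid.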

Keywords: Hadoop, MapReduce, multi-way join, two-way join, Ubuntu

Procedia PDF Downloads 461
25022 Weight Regulation Mechanism on Bridges

Authors: S. Siddharth, Saravana Kumar

Abstract:

Metros across the world tend to have a large number of bridges, and there have been concerns about the safety of these bridges. As the traffic in most cities in India is heterogeneous, trucks and heavy vehicles traverse our roads on an everyday basis, which leads to structural damage in the long run. All bridges are designed with a maximum load limit, and this limit is seldom checked. We have hence come up with an idea to check the load of every vehicle entering a bridge and to block the bridge with barricades if the vehicle exceeds the maximum load, in order to catch hold of the perpetrators. By doing this, we can avoid further structural damage and also provide an effective way to enforce the law. If our solution is put in place, structural damage and accidents would be reduced a great deal, and it would also make law enforcement's job easier.

Keywords: heterogeneous, structural, load, law, heavy, vehicles

Procedia PDF Downloads 419
25021 Continual Learning Using Data Generation for Hyperspectral Remote Sensing Scene Classification

Authors: Samiah Alammari, Nassim Ammour

Abstract:

When a massive number of tasks is provided successively to a deep learning process, good model performance requires preserving the data of previous tasks so the model can be retrained for each upcoming classification; otherwise, the model performs poorly due to the catastrophic forgetting phenomenon. To overcome this shortcoming, we developed a successful continual learning deep model for remote sensing hyperspectral image region classification. The proposed neural network architecture encapsulates two trainable subnetworks. The first module adapts its weights by minimizing the discrimination error between the land-cover classes during new task learning, and the second module learns to replicate the data of the previous tasks by discovering the latent data structure of the new task's dataset. We conduct experiments on the Indian Pines HSI dataset. The results confirm the capability of the proposed method.
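The replay scheme the abstract relies on can be illustrated without any deep learning machinery. In the deliberately simple, framework-free sketch below, a tiny generative summary of past tasks (per-class Gaussian statistics) replaces the stored data, and generated pseudo-samples of old classes are mixed into the new task's training set; the paper's actual method uses a trainable subnetwork as the generator, so everything here is a stand-in for the scheme, not the model.

```python
import numpy as np

rng = np.random.default_rng(1)

class GaussianReplay:
    """Tiny generative summary of past tasks: per-class mean and std."""
    def __init__(self):
        self.stats = {}  # class label -> (mean vector, std vector)

    def fit_task(self, X, y):
        for c in np.unique(y):
            Xc = X[y == c]
            self.stats[int(c)] = (Xc.mean(axis=0), Xc.std(axis=0) + 1e-6)

    def generate(self, n_per_class):
        Xs, ys = [], []
        for c, (mu, sd) in self.stats.items():
            Xs.append(rng.normal(mu, sd, size=(n_per_class, mu.size)))
            ys.append(np.full(n_per_class, c))
        return np.vstack(Xs), np.concatenate(ys)

replay = GaussianReplay()

# Task 1: classes 0 and 1 are observed and summarized.
X1 = np.vstack([rng.normal(0.0, 0.2, (30, 4)), rng.normal(2.0, 0.2, (30, 4))])
y1 = np.array([0] * 30 + [1] * 30)
replay.fit_task(X1, y1)

# Task 2: class 2 arrives and X1/y1 are no longer available, so generated
# pseudo-samples of the old classes are mixed into the new training set.
X2, y2 = rng.normal(4.0, 0.2, (30, 4)), np.full(30, 2)
Xg, yg = replay.generate(30)
X_train, y_train = np.vstack([X2, Xg]), np.concatenate([y2, yg])
print(sorted(int(v) for v in set(y_train)))  # all three classes represented
```

Because the retraining set always contains (pseudo-)samples of every class seen so far, the classifier is never trained on the new task alone, which is what prevents catastrophic forgetting in this scheme.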

Keywords: continual learning, data reconstruction, remote sensing, hyperspectral image segmentation

Procedia PDF Downloads 222
25020 Computational Aided Approach for Strut and Tie Model for Non-Flexural Elements

Authors: Mihaja Razafimbelo, Guillaume Herve-Secourgeon, Fabrice Gatuingt, Marina Bottoni, Tulio Honorio-De-Faria

Abstract:

The challenge of this research is to provide engineers with a robust, semi-automatic method for calculating optimal reinforcement for massive structural elements. In the absence of such a digital post-processing tool, design-office engineers make intensive use of plate modelling, for which automatic post-processing is available. Plate models of massive areas, however, produce conservative results. In addition, the theoretical foundations of automatic post-processing tools for reinforcement are those of reinforced concrete beam sections. As long as there is no suitable alternative for automatic post-processing of plates, optimal modelling and a significant improvement in the constructability of massive areas cannot be expected. The strut-and-tie method is commonly used in civil engineering, but its result remains highly dependent on the judgment of the design engineer. The tool developed here will support engineers in their choice of structure. The method implemented consists of defining a ground structure built from the principal stresses resulting from an elastic analysis of the structure and then optimizing this structure according to the fully stressed design method. The first results yield a coherent initial network of struts and ties, consistent with the cases encountered in the literature. Future evolutions of the tool will make it possible to adapt the obtained latticework to the cracking states resulting from the loads applied during the life of the structure, whether cyclic or dynamic. In addition, to satisfy the constructability constraint, a final reinforcement layout with an orthogonal arrangement and regulated spacing will be implemented in the tool.
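The fully-stressed-design iteration at the heart of the method above can be sketched in a few lines: each member's area is rescaled so that its stress approaches the allowable value, sigma = N/A, hence A_new = |N| / sigma_allow. In the real tool the member forces come from a structural analysis of the ground structure; the two-member load-sharing function below is a made-up stand-in for that analysis, and all numbers are illustrative.

```python
def fully_stressed_design(areas, forces_of, sigma_allow, n_iter=50):
    """Iterate the fully-stressed-design update A_i <- |N_i| / sigma_allow."""
    for _ in range(n_iter):
        N = forces_of(areas)            # member axial forces from analysis
        areas = [max(abs(n) / sigma_allow, 1e-9) for n in N]
    return areas

def forces_of(areas):
    # Hypothetical redistribution: two parallel members share a 100 kN load
    # in proportion to their areas (i.e. their axial stiffness).
    total = sum(areas)
    return [100.0 * a / total for a in areas]

areas = fully_stressed_design([1.0, 3.0], forces_of, sigma_allow=200.0)
print(areas)  # -> [0.125, 0.375]: every member ends up at the allowable stress
```

At convergence every remaining member carries exactly the allowable stress, which is the criterion the tool uses to prune the ground structure down to a strut-and-tie network.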

Keywords: strut and tie, optimization, reinforcement, massive structure

Procedia PDF Downloads 125
25019 Ultrasonic Degradation of Acephate: Effects of Operating Parameters

Authors: Naina Deshmukh

Abstract:

With the wide production, consumption, and disposal of pesticides in the world, concerns over their human and environmental health impacts are rapidly growing. Among developing treatment technologies, ultrasonication, as an emerging and promising technology for the removal of pesticides from the aqueous environment, has attracted the attention of many researchers in recent years. The degradation of acephate in aqueous solutions was investigated under ultrasound irradiation (20 kHz) in the presence of the heterogeneous catalysts titanium dioxide (TiO2) and zinc oxide (ZnO). The influence of various factors such as the amount of catalyst (0.25, 0.5, 0.75, 1.0, 1.25 g/l), initial acephate concentration (100, 200, 300, 400 mg/l), and pH (3, 5, 7, 9, 11) was studied. The optimum catalyst dose was found to be 1 g/l for TiO2 and 1.25 g/l for ZnO at 100 mg/l acephate. The maximum percentage degradation of acephate was observed at pH 11 for both the TiO2 and ZnO catalysts.

Keywords: ultrasonic degradation, acephate, TiO2, ZnO, heterogeneous catalyst

Procedia PDF Downloads 33