Search results for: heterogeneous massive data
25869 End-to-End Monitoring in Oracle Fusion Middleware for Data Verification
Authors: Syed Kashif Ali, Usman Javaid, Abdullah Chohan
Abstract:
In large enterprises, multiple departments use different sorts of information systems and databases according to their needs. These systems are independent and heterogeneous in nature, and sharing information/data between them is not an easy task. The usage of middleware technologies has made data sharing between systems very easy. However, monitoring the exchange of data/information for verification purposes between target and source systems is often complex or impossible for the maintenance department due to security/access privileges on target and source systems. In this paper, we intend to present our experience of an end-to-end data monitoring approach at the middleware level, implemented in Oracle BPEL, for data verification without the help of any monitoring tool.
Keywords: service level agreement, SOA, BPEL, oracle fusion middleware, web service monitoring
Procedia PDF Downloads 480
25868 Innovating and Disrupting Higher Education: The Evolution of Massive Open Online Courses
Authors: Nabil Sultan
Abstract:
A great deal has been written on Massive Open Online Courses (MOOCs) since 2012 (considered by some as the year of the MOOC). The emergence of MOOCs generated a great deal of interest amongst academics and technology experts as well as ordinary people. Some of the authors who wrote on MOOCs perceived them as the next big thing that would disrupt education. Other authors saw them as another fad that would go away once it had run its course (as most fads often do). But MOOCs did not turn out to be a fad, and they are still around. Most importantly, they evolved into something that is beginning to look like a viable business model. This paper explores this phenomenon within the theoretical frameworks of disruptive innovations and jobs to be done as developed by Clayton Christensen and his colleagues, and its implications for the future of higher education (HE).
Keywords: MOOCs, disruptive innovations, higher education, jobs theory
Procedia PDF Downloads 270
25867 Prediction of Marine Ecosystem Changes Based on the Integrated Analysis of Multivariate Data Sets
Authors: Prozorkevitch D., Mishurov A., Sokolov K., Karsakov L., Pestrikova L.
Abstract:
The current body of knowledge about the marine environment and the dynamics of marine ecosystems includes a huge amount of heterogeneous data collected over decades. It generally includes a wide range of hydrological, biological, and fishery data. Marine researchers collect these data and analyze how and why the ecosystem changes from past to present. Based on these historical records and the linkages between the processes, it is possible to predict future changes. Multivariate analysis of trends and their interconnection in the marine ecosystem may be used as an instrument for predicting further ecosystem evolution. More than 50 years of information about the components of the marine ecosystem needs to be examined to investigate how these data arrays can help to predict the future.
Keywords: Barents Sea ecosystem, abiotic, biotic, data sets, trends, prediction
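A minimal sketch of the kind of multivariate trend analysis the abstract describes: fit a linear trend to each ecosystem variable, extrapolate it, and inspect the interconnection between components through the correlation of detrended residuals. The variable names and series below are hypothetical stand-ins, not the authors' data.

```python
import numpy as np

# Hypothetical annual series (50 years) for a few ecosystem components.
years = np.arange(1970, 2020)
rng = np.random.default_rng(0)
temperature = 4.0 + 0.02 * (years - 1970) + rng.normal(0, 0.3, years.size)
zooplankton = 8.0 - 0.01 * (years - 1970) + rng.normal(0, 0.5, years.size)
cod_biomass = 2.0 + 0.005 * (years - 1970) + rng.normal(0, 0.2, years.size)
data = np.column_stack([temperature, zooplankton, cod_biomass])

# Linear trend per variable, extrapolated ten years ahead.
future = np.arange(2020, 2030)
for name, series in zip(["temperature", "zooplankton", "cod_biomass"], data.T):
    slope, intercept = np.polyfit(years, series, deg=1)
    forecast = slope * future + intercept
    print(f"{name}: trend {slope:+.4f}/yr, 2029 forecast {forecast[-1]:.2f}")

# Interconnection between components: correlation of detrended residuals.
residuals = data - np.column_stack(
    [np.polyval(np.polyfit(years, s, 1), years) for s in data.T])
print(np.corrcoef(residuals.T).round(2))
```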
Procedia PDF Downloads 116
25866 Heterogeneous Photocatalytic Degradation of Methylene Blue by Montmorillonite/CuxCd1-xS Nanomaterials
Authors: Horiya Boukhatem, Lila Djouadi, Hussein Khalaf, Rufino Manuel Navarro Yerga, Fernando Vaquero Gonzalez
Abstract:
Heterogeneous photocatalysis is an alternative method for the removal of organic pollutants in water. The photoexcitation of a semiconductor under ultraviolet (UV) irradiation entails the production of hydroxyl radicals, one of the most oxidative chemical species. The objective of this study is the synthesis of nanomaterials based on montmorillonite and CuxCd1-xS with different Cu concentrations (0.3 < x < 0.7) and their application in the photocatalysis of a cationic dye: methylene blue. The synthesized nanomaterials and montmorillonite were characterized by Fourier transform infrared (FTIR) spectroscopy. Test results for the photocatalysis of methylene blue under UV-visible irradiation show that the photoactivity of the montmorillonite/CuxCd1-xS nanomaterials increases with increasing Cu concentration and is significantly higher compared to that of sodium montmorillonite alone. The application of the Langmuir-Hinshelwood (L-H) kinetic model to the photocatalytic test results showed that the reaction rate obeys the first-order kinetic model.
Keywords: heterogeneous photocatalysis, methylene blue, montmorillonite, nanomaterial
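For reference, the standard Langmuir-Hinshelwood rate expression and its reduction to the pseudo-first-order form observed here, with k_r the surface rate constant, K the adsorption equilibrium constant, and C the dye concentration:

```latex
r = -\frac{dC}{dt} = \frac{k_r K C}{1 + K C}
\;\xrightarrow[\;KC \,\ll\, 1\;]{}\;
-\frac{dC}{dt} \approx k_{app} C
\quad\Longrightarrow\quad
\ln\frac{C_0}{C} = k_{app}\, t
```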
Procedia PDF Downloads 339
25865 Algorithms for Run-Time Task Mapping in NoC-Based Heterogeneous MPSoCs
Authors: M. K. Benhaoua, A. K. Singh, A. E. Benyamina, P. Boulet
Abstract:
Mapping parallelized tasks of applications onto NoC-based MPSoCs can be done either at design time (static) or at run-time (dynamic). Static mapping strategies find the best placement of tasks at design time; hence, they are not suitable for dynamic workloads and are incapable of run-time resource management. The number of tasks or applications executing on an MPSoC platform can exceed the available resources, requiring efficient run-time mapping strategies to meet these constraints. This paper describes a new Spiral Dynamic Task Mapping heuristic for mapping applications onto NoC-based heterogeneous MPSoCs. This heuristic is based on a packing strategy and a routing algorithm, both also proposed in this paper. The heuristic tries to map the tasks of an application into a clustered region to reduce the communication overhead between the communicating tasks. It attempts to map the tasks of an application that are most related to each other in a spiral manner and to find the best possible path load that minimizes the communication overhead. In this context, we have realized a simulation environment for experimental evaluations to map applications with varying numbers of tasks onto an 8x8 NoC-based heterogeneous MPSoC platform. We demonstrate that the new mapping heuristic, together with the modified Dijkstra routing algorithm proposed, is capable of reducing the total execution time and energy consumption of applications when compared to state-of-the-art run-time mapping heuristics reported in the literature.
Keywords: multiprocessor system on chip, MPSoC, network on chip, NoC, heterogeneous architectures, run-time mapping heuristics, routing algorithm
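A toy sketch of the spiral placement idea, not the authors' exact heuristic: walk the mesh in a spiral around a chosen start node and place the most communication-heavy tasks on the nearest free processing elements, so communicating tasks stay clustered. Grid size, task weights, and the start node are hypothetical.

```python
from itertools import cycle

GRID = 8  # 8x8 NoC mesh

def spiral_order(start):
    """Yield mesh coordinates in a spiral around `start` (growing ring
    by ring), skipping positions that fall outside the grid."""
    x, y = start
    yield start
    steps = cycle([(1, 0), (0, 1), (-1, 0), (0, -1)])  # E, S, W, N
    run = 1
    while run < 2 * GRID:
        for _ in range(2):                 # two legs per run length
            dx, dy = next(steps)
            for _ in range(run):
                x, y = x + dx, y + dy
                if 0 <= x < GRID and 0 <= y < GRID:
                    yield (x, y)
        run += 1

def map_tasks(tasks, start, free):
    """Place tasks (heaviest communicators first) on the first free PEs
    found along the spiral, keeping communicating tasks clustered."""
    placement, walk = {}, spiral_order(start)
    for task in sorted(tasks, key=lambda t: -t["comm"]):
        for node in walk:
            if node in free:
                free.remove(node)
                placement[task["id"]] = node
                break
    return placement

tasks = [{"id": i, "comm": c} for i, c in enumerate([5, 9, 2, 7])]
free = {(x, y) for x in range(GRID) for y in range(GRID)}
print(map_tasks(tasks, start=(3, 3), free=free))
```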
Procedia PDF Downloads 489
25864 Bioinformatics High Performance Computation and Big Data
Authors: Javed Mohammed
Abstract:
Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists for the first time to gain a profound understanding of the deepest biological functions. Solving biological problems may require High-Performance Computing (HPC) due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even the simulation of an entire human body. This research paper emphasizes computational biology's growing need for high-performance computing and Big Data. It illustrates the indispensability of HPC in meeting the scientific and engineering challenges of the twenty-first century, and how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC that provides sufficient capability for evaluating or solving more limited but meaningful instances. This article also indicates solutions to optimization problems and the benefits of Big Data for computational biology, and it illustrates the current state of the art and the future generation of HPC computing with Big Data.
Keywords: high performance, big data, parallel computation, molecular data, computational biology
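The all-to-all comparison pattern mentioned above parallelizes naturally, since each pair can be scored independently. A minimal sketch with Python's multiprocessing, using toy sequences and a naive per-position identity score; both the data and the scoring are hypothetical placeholders, not a real bioinformatics pipeline:

```python
from itertools import combinations
from multiprocessing import Pool

# Toy equal-length sequences standing in for genomic reads (hypothetical).
SEQS = {"s1": "ACGTACGTAC", "s2": "ACGTTCGTAC", "s3": "TTGTACGAAC", "s4": "ACGAACGTTC"}

def identity(pair):
    """Fraction of matching positions between two equal-length sequences."""
    (na, a), (nb, b) = pair
    return na, nb, sum(x == y for x, y in zip(a, b)) / len(a)

if __name__ == "__main__":
    pairs = list(combinations(SEQS.items(), 2))   # n*(n-1)/2 comparisons
    with Pool() as pool:                          # spread pairs over cores
        for na, nb, score in pool.map(identity, pairs):
            print(f"{na} vs {nb}: {score:.2f}")
```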
Procedia PDF Downloads 363
25863 Interpreting Privacy Harms from a Non-Economic Perspective
Authors: Christopher Muhawe, Masooda Bashir
Abstract:
With increased Internet Communication Technology (ICT), the virtual world has become the new normal. At the same time, there is an unprecedented collection of massive amounts of data by both private and public entities. Unfortunately, this increase in data collection has been in tandem with an increase in data misuse and data breaches. Regrettably, the majority of data breach and data misuse claims have been unsuccessful in the United States courts for failure to prove direct injury to physical or economic interests. The requirement to express data privacy harms from an economic or physical stance negates the fact that not all data harms are physical or economic in nature. The challenge is compounded by the fact that data breach harms and risks do not attach immediately. This research will use a descriptive and normative approach to show that not all data harms can be expressed in economic or physical terms. Expressing privacy harms purely from an economic or physical harm perspective negates the fact that data insecurity may result in harms which run counter to the functions of privacy in our lives: the promotion of liberty, selfhood, autonomy, and human social relations, and the furtherance of a free society. There is no economic value that can be placed on these functions of privacy. The proposed approach addresses data harms from a psychological and social perspective.
Keywords: data breach and misuse, economic harms, privacy harms, psychological harms
Procedia PDF Downloads 195
25862 Optimization of Biodiesel Production from Palm Oil over Mg-Al Modified K-10 Clay Catalyst
Authors: Muhammad Ayoub, Abrar Inayat, Bhajan Lal, Sintayehu Mekuria Hailegiorgis
Abstract:
Biodiesel, which comes from purely renewable resources, provides an alternative fuel option for the future because of limited fossil fuel resources as well as environmental concerns. The transesterification of vegetable oils for biodiesel production is a promising process to overcome this future energy crisis. The use of heterogeneous catalysts greatly simplifies the technological process by facilitating the separation of the post-reaction mixture. The purpose of the present work was to examine a heterogeneous catalyst, in particular Mg-Al modified K-10 clay, to produce methyl esters of palm oil. The prepared catalyst was well characterized by different state-of-the-art techniques. In this study, the transesterification of palm oil with methanol was studied in a heterogeneous system in the presence of Mg-Al modified K-10 clay as a solid base catalyst, and the results were then optimized with the help of Design of Experiments software. The results showed that methanol is the best alcohol for these reaction conditions. The maximum conversion of triglyceride (88%) was noted after 8 h of reaction at 60 °C, with a 6:1 molar ratio of methanol to palm oil and 3 wt.% of the prepared catalyst.
Keywords: palm oil, transesterification, clay, biodiesel, mesoporous clay, K-10
Procedia PDF Downloads 395
25861 The Potential Impact of Big Data Analytics on Pharmaceutical Supply Chain Management
Authors: Maryam Ziaee, Himanshu Shee, Amrik Sohal
Abstract:
Big Data Analytics (BDA) in supply chain management has recently drawn the attention of academics and practitioners. Big data refers to a massive amount of data from different sources, in different formats, generated at high speed through transactions in business environments and supply chain networks. Traditional statistical tools and techniques find it difficult to analyse this massive data. BDA can assist organisations in capturing, storing, and analysing data, specifically in the field of supply chain. Currently, there is a paucity of research on BDA in the pharmaceutical supply chain context. In this research, the Australian pharmaceutical supply chain was selected as the case study. This industry is highly significant since the right medicine must reach the right patients, at the right time, in the right quantity, in good condition, and at the right price to save lives. However, drug shortages remain a substantial problem for hospitals across Australia, with implications for patient care, staff resourcing, and expenditure. Furthermore, a massive volume and variety of data is generated at high speed from multiple sources in the pharmaceutical supply chain, which needs to be captured and analysed to benefit operational decisions at every stage of the supply chain processes. As the pharmaceutical industry lags behind other industries in using BDA, this raises the question of whether the use of BDA can improve transparency across the pharmaceutical supply chain by enabling the partners to make informed decisions across their operational activities. This presentation explores the impacts of BDA on supply chain management. An exploratory qualitative approach was adopted to analyse data collected through interviews. This study also explores the BDA potential in the whole pharmaceutical supply chain rather than focusing on a single entity. Twenty semi-structured interviews were undertaken with top managers in fifteen organisations (five pharmaceutical manufacturers, five wholesalers/distributors, and five public hospital pharmacies) to investigate their views on the use of BDA. The findings revealed that BDA can enable pharmaceutical entities to have improved visibility over the whole supply chain and also the market; it enables entities, especially manufacturers, to monitor consumption and the demand rate in real time and make accurate demand forecasts, which reduces drug shortages. Timely and precise decision-making can allow the entities to source and manage their stocks more effectively. This can likely address the drug demand at hospitals and respond to unanticipated issues such as drug shortages. Earlier studies explore BDA in the context of clinical healthcare; however, this presentation investigates the benefits of BDA in the Australian pharmaceutical supply chain. Furthermore, this research enhances managers' insight into the potential of BDA at every stage of the supply chain processes and helps to improve decision-making in their supply chain operations. The findings will turn the rhetoric of data-driven decision-making into a reality where managers may opt for analytics for improved decision-making in the supply chain processes.
Keywords: big data analytics, data-driven decision, pharmaceutical industry, supply chain management
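As a small illustration of the real-time demand forecasting the interviewees point to, here is a minimal exponential-smoothing sketch; the drug demand figures, lead time, and safety stock are all hypothetical, not drawn from the study's data:

```python
def exp_smooth_forecast(history, alpha=0.3):
    """Single exponential smoothing: the level tracks the consumption
    rate; the next-period forecast is the last smoothed level."""
    level = history[0]
    for demand in history[1:]:
        level = alpha * demand + (1 - alpha) * level
    return level

# Hypothetical weekly hospital consumption of one drug (units).
weekly_demand = [120, 135, 128, 150, 160, 155, 170]
forecast = exp_smooth_forecast(weekly_demand)
reorder_point = forecast * 2 + 40   # e.g. 2-week lead time plus safety stock
print(f"next-week forecast: {forecast:.0f}, reorder point: {reorder_point:.0f}")
```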
Procedia PDF Downloads 106
25860 Multi-Perspective Learning in a Real Production Plant Using Experiential Learning in Heterogeneous Groups to Develop System Competencies for Production System Improvements
Authors: Marlies Achenbach
Abstract:
System competencies play a key role in ensuring an effective and efficient improvement of production systems. Thus, an increasing demand for developing system competencies can be observed in industry as well as in engineering education. System competencies consist of the following two main abilities: evaluating the current state of a production system and developing a target state. The innovative course 'multi-perspective learning in a real production plant (multi real)' was developed to create a learning setting that supports the development of these system competencies. The setting therefore combines two innovative aspects: first, the learning takes place in heterogeneous groups formed by students as well as professionals and managers from industry; second, the learning takes place in a real production plant. This paper presents the innovative didactic concept of 'multi real' in detail, which will initially be implemented in October/November 2016 in the industrial engineering, logistics and mechanical master's program at TU Dortmund University.
Keywords: experiential learning, heterogeneous groups, improving production systems, system competencies
Procedia PDF Downloads 426
25859 Joint Replenishment and Heterogeneous Vehicle Routing Problem with Cyclical Schedule
Authors: Ming-Jong Yao, Chin-Sum Shui, Chih-Han Wang
Abstract:
This paper is developed based on a real-world decision scenario in which an industrial gas company applies the Vendor Managed Inventory model and supplies liquid oxygen with a self-operated heterogeneous vehicle fleet to hospitals in nearby cities. We name it the Joint Replenishment and Heterogeneous Vehicle Routing Problem with Cyclical Schedule and formulate it as a non-linear mixed-integer programming problem which simultaneously determines the length of the planning cycle (PC), the length of the replenishment cycle, the dates of replenishment for each customer, and the vehicle routes of each day within the PC, such that the average daily operating cost within the PC, including inventory holding cost, setup cost, transportation cost, and overtime labor cost, is minimized. A solution method based on a genetic algorithm, embedded with an encoding and decoding mechanism and local search operators, is then proposed, and a hash function is adopted to avoid repetitive fitness evaluation for identical solutions. Numerical experiments demonstrate that the proposed solution method can effectively solve the problem under different lengths of PC and numbers of customers. The method is also shown to be effective in determining whether the company should expand the storage capacity of a customer whose demand increases. Sensitivity analysis of the vehicle fleet composition shows that deploying a mixed fleet can reduce the daily operating cost.
Keywords: cyclic inventory routing problem, joint replenishment, heterogeneous vehicle, genetic algorithm
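A compact sketch of the hash-based fitness caching the abstract mentions, wrapped in a bare-bones genetic algorithm. The chromosome encoding and the cost function are placeholders; the authors' real decoder would map genes to replenishment cycles and vehicle routes:

```python
import random

random.seed(1)
CACHE = {}  # hash of chromosome -> fitness, to avoid re-evaluating duplicates

def fitness(chromosome):
    """Placeholder daily-cost evaluation, memoized by chromosome hash."""
    key = hash(tuple(chromosome))
    if key not in CACHE:
        CACHE[key] = sum(g * (i + 1) for i, g in enumerate(chromosome))
    return CACHE[key]

def evolve(pop, generations=50):
    for _ in range(generations):
        pop.sort(key=fitness)                       # minimise daily cost
        parents = pop[: len(pop) // 2]
        children = []
        for _ in range(len(pop) - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]               # one-point crossover
            i = random.randrange(len(child))        # one-gene mutation
            child[i] = random.randint(0, 6)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

population = [[random.randint(0, 6) for _ in range(10)] for _ in range(20)]
best = evolve(population)
print(best, fitness(best), f"cache size: {len(CACHE)}")
```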
Procedia PDF Downloads 87
25858 Tracing Digital Traces of Phatic Communion in #Mooc
Authors: Judith Enriquez-Gibson
Abstract:
This paper meddles with the notion of phatic communion introduced 90 years ago by Malinowski, a Polish-born British anthropologist. It explores the phatic in Twitter within the contents of tweets related to moocs (massive open online courses) as a topic or trend. It is not about moocs, though. It is about practices that could easily be hidden or neglected if we let big or massive topics take the lead, or if we simply follow the computational or secret codes behind Twitter itself and third-party software analytics. It draws from media and cultural studies. Though at first it appears data-driven, as I submitted data collection and analytics into the hands of a third-party software tool, Twitonomy, the aim is to follow how phatic communion might be practised in a social media site such as Twitter. Lurking becomes its research method to analyse mooc-related tweets. A total of 3,000 tweets were collected on 11 October 2013 (UK time zone). The emphasis of lurking is to engage with Twitter as a system of connectivity. One interesting finding is that a click is in fact a phatic practice. A click breaks the silence. A click on one of the mooc websites is actually a tweet. A tweet was posted on behalf of a user who simply chose to click, without formulating the text and perhaps without knowing that it contains #mooc. Surely, this mechanism is not about reciprocity. To break the silence, users did not use words. They just clicked the 'tweet button' on a mooc website. A click performs and maintains connectivity, with Twitter as the medium in attendance in our everyday lives, available when needed to be of service. In conclusion, the phatic culture of breaking silence in Twitter does not have to submit to the power of code and analytics. It is a matter of human code.
Keywords: click, Twitter, phatic communion, social media data, mooc
Procedia PDF Downloads 412
25857 Feasibility of Weakly Interacting Massive Particles as Dark Matter Candidates: Exploratory Study on the Possible Reasons for Lack of WIMP Detection
Authors: Sloka Bhushan
Abstract:
Dark matter constitutes a majority of the matter in the universe, yet very little is known about it due to its extreme lack of interaction with regular matter and the fundamental forces. Weakly Interacting Massive Particles, or WIMPs, have been considered among the strongest candidates for dark matter due to their promising theoretical properties. However, various endeavors to detect these elusive particles have failed. This paper explores the various particles which may be WIMPs and the detection techniques being employed to detect WIMPs (such as underground detectors, LHC experiments, and so on). There is a special focus on the reasons for the lack of detection of WIMPs so far, and the possibility that detection limits are a reason for the lack of physical evidence of the existence of WIMPs. This paper also explores possible inconsistencies within the WIMP particle theory as a reason for the lack of physical detection, with a brief review of the possible solutions and alternatives to these inconsistencies. Additionally, this paper reviews the supersymmetry theory and the possibility of the supersymmetric neutralino (a possible WIMP particle) being detectable. Lastly, a review of alternative candidates for dark matter, such as axions and MACHOs, has been conducted. The exploratory study in this paper is conducted through a series of literature reviews.
Keywords: dark matter, particle detection, supersymmetry, weakly interacting massive particles
Procedia PDF Downloads 142
25856 Effectiveness of Catalysis in Ozonation for the Removal of Herbicide 2,4-Dichlorophenoxyacetic Acid from Contaminated Water
Authors: S. Shanthi
Abstract:
Catalyzed oxidation processes show extraordinary promise for application in numerous wastewater treatment areas. Advanced oxidation processes are an emerging innovation that may be utilized for particular objectives in wastewater treatment. This research work provides a solution for the removal of a refractory organic compound, 2,4-dichlorophenoxyacetic acid, a common water pollutant. All studies were done in batch mode in a constantly stirred reactor. Alternative ozonation processes catalysed by transition metals or granular activated carbon have been investigated for the degradation of organics. The catalytic ozonation processes under study are, first, homogeneous catalytic ozonation, which is based on ozone activation by transition metal ions present in aqueous solution, and second, heterogeneous catalytic ozonation in the presence of granular activated carbon (GAC). The present studies reveal that heterogeneous catalytic ozonation using GAC favours the ozonation of 2,4-dichlorophenoxyacetic acid by increasing the rate of ozonation, and a much higher degradation of the substrate was obtained in a given time. However, Fe2+ and Fe3+ ions decreased the rate of degradation of 2,4-dichlorophenoxyacetic acid, indicating that they act as negative catalysts. In the case of heterogeneous catalytic ozonation using the GAC catalyst, it was found that during the initial 5 minutes of contact the solution concentration decreased significantly as the pollutants were adsorbed; thereafter the substrate started getting oxidized, and ozonation became the dominant treatment process. The exhausted GAC was found to be regenerated in situ. The maximum percentage reduction of the substrate was achieved in the minimum possible time when the GAC catalyst was employed.
Keywords: ozonation, homogeneous catalysis, heterogeneous catalysis, granular activated carbon
Procedia PDF Downloads 250
25855 Assessment of the Tsunami Impact with Tectonic Sources in the Southern Mainland of the Haitian Republic Using Two Numerical Models
Authors: Delva Richard, Zahibo Narcisse, Yalciner Ahmet
Abstract:
The Republic of Haiti is one of the poorest countries in the world; therefore, the authorities must make choices to provide timely solutions to the many difficulties that this Caribbean country is experiencing. There is a very acute lack of scientific research studying natural phenomena in depth. A working group was established under the aegis of the World Bank, UNESCO, and the authorities to study the level of exposure of Hispaniola. The devastating earthquake of August 2021 killed about 2,100 people and caused massive material damage, and that of 12 January 2010 killed more than 250,000 people and caused massive material damage, the evidence of which is still visible 11 years later. In this paper, we want to contribute to the assessment of the tsunami risk on the southern peninsula of the Republic of Haiti. For this work, we have bathymetric and topographic data of very good quality from private measurement campaigns, which we combined with GEBCO data for the inundation grids. We use two numerical models, MOST and NAMI DANCE, for the calculation of the parameters required in any tsunami risk assessment.
Keywords: numerical modelling, long ocean waves, bathymetry, risk assessment, tsunamis
Procedia PDF Downloads 6
25854 The Effect of Subsurface Dam on Saltwater Intrusion in Heterogeneous Coastal Aquifers
Authors: Antoifi Abdoulhalik, Ashraf Ahmed
Abstract:
Saltwater intrusion (SWI) in coastal aquifers has become a growing threat for many countries around the world. While various control measures have been suggested to mitigate SWI, the construction of subsurface physical barriers remains one of the most effective solutions for this problem. In this work, we used laboratory experiments and numerical simulations to investigate the effectiveness of subsurface dams in heterogeneous layered coastal aquifers with different layering patterns. Four different cases were investigated, including a homogeneous case (case H) and three heterogeneous cases in which a low-permeability (K) layer was set in the top part of the system (case LH), in the middle part of the system (case HLH), and in the bottom part of the system (case HL). An automated image analysis technique was implemented to quantify the main SWI parameters at high spatial and temporal resolution. The method also provides transient salt concentration maps, allowing for the first time clear visualization of the spillage of saline water over the dam (advancing wedge condition) as well as the flushing of residual saline water from the freshwater area (receding wedge condition). The SEAWAT code was adopted for the numerical simulations. The results show that the presence of an overlying layer of low permeability enhanced the ability of the dam to retain the saline water. In such conditions, the rate of saline water spillage and inland extension may be considerably reduced. Conversely, the presence of an underlying low-K layer led to a faster increase of the saltwater volume on the seaward side of the wall, therefore considerably facilitating the spillage. The results showed that a complete removal of the residual saline water eventually occurred in all the investigated scenarios, with a rate of removal strongly affected by the hydraulic conductivity of the lower part of the aquifer. The data showed that the addition of the underlying low-K layer in case HL caused the complete flushing to take almost twice as long as in the homogeneous scenario.
Keywords: heterogeneous coastal aquifers, laboratory experiments, physical barriers, seawater intrusion control
Procedia PDF Downloads 250
25853 Massive Intrapartum Hemorrhage Following Inner Myometrial Laceration during a Vaginal Delivery: A Rare Case Report
Authors: Bahareh Khakifirooz, Arian Shojaei, Amirhossein Hajialigol, Bahare Abdolahi
Abstract:
Laceration of the inner layer of the myometrium can cause massive bleeding during and after childbirth, which can lead to the death of the mother if it is not diagnosed in time. We studied a rare case of massive intrapartum bleeding following myometrial laceration that was diagnosed correctly, and the patient survived thanks to timely treatment. The patient was a 26-year-old woman who was under observation for a term pregnancy with a complaint of rupture of membranes (ROM) and vaginal bleeding. During the spontaneous course of labor, without receiving oxytocin, she had an estimated total blood loss of 750 mL; despite the normal fetal heart rate, given the mother's indication for cesarean section, she was transferred to the operating room and underwent a cesarean section. During the cesarean section, the amniotic fluid was clear; after the removal of the placenta, severe, clear bleeding was flowing from the posterior wall of the uterus, caused by a laceration of the inner layer of the myometrium in the posterior wall of the lower segment of the uterus. The myometrial laceration was repaired with absorbable continuous locked sutures, and hemostasis was established. The patient then received uterotonic drugs and, after monitoring, was discharged from the hospital in good condition.
Keywords: intrapartum hemorrhage, inner myometrial laceration, labor, increased intrauterine pressure
Procedia PDF Downloads 25
25852 Lithium Ion Supported on TiO2 Mixed Metal Oxides as a Heterogeneous Catalyst for Biodiesel Production from Canola Oil
Authors: Mariam Alsharifi, Hussein Znad, Ming Ang
Abstract:
Considering the environmental issues and the shortage of conventional fossil fuel sources, biodiesel has emerged as a promising way to shift away from fossil-based fuel as a sustainable and renewable energy source. It is synthesized by transesterification of vegetable oils or animal fats with an alcohol (methanol or ethanol) in the presence of a catalyst. This study focuses on synthesizing a highly efficient Li/TiO2 heterogeneous catalyst for biodiesel production from canola oil. In this work, lithium was immobilized onto TiO2 by the simple impregnation method. The catalyst was evaluated in the transesterification reaction in a batch reactor under moderate reaction conditions. To study the effect of Li concentration, a series of LiNO3 concentrations (20, 30, 40 wt.%) at different calcination temperatures (450, 600, 750 °C) were evaluated. The Li/TiO2 catalysts were characterized by several spectroscopic and analytical techniques, such as XRD, FT-IR, BET, TG-DSC, and FESEM. The optimum values of lithium nitrate impregnated on TiO2 and calcination temperature are 30 wt.% and 600 °C, respectively, along with a high conversion of 98%. The XRD study revealed that the insertion of Li improved the catalyst efficiency without any alteration in the structure of TiO2. The best performance of the catalyst was achieved when using a methanol-to-oil ratio of 24:1 and 5 wt.% of catalyst loading at a 65 °C reaction temperature for 3 hours of reaction time. Moreover, the experimental kinetic data were compatible with the pseudo-first-order model, and the activation energy was 39.366 kJ/mol. The synthesized Li/TiO2 catalyst was applied to transesterify used cooking oil and exhibited a 91.73% conversion. The prepared catalyst has shown a high catalytic activity to produce biodiesel from fresh and used oil under mild reaction conditions.
Keywords: biodiesel, canola oil, environment, heterogeneous catalyst, impregnation method, renewable energy, transesterification
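The kinetic treatment the abstract reports can be written compactly: a pseudo-first-order rate law for the triglyceride concentration C_TG, with the rate constant's temperature dependence following Arrhenius behavior and the reported activation energy:

```latex
\ln\frac{C_{TG,0}}{C_{TG}} = k\,t, \qquad
k = A\,\exp\!\left(-\frac{E_a}{RT}\right), \qquad
E_a = 39.366\ \mathrm{kJ\ mol^{-1}}
```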
Procedia PDF Downloads 176
25851 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System
Authors: Karima Qayumi, Alex Norta
Abstract:
The rapid generation of a high volume and a broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purposes of developing their business. Therefore, the recently decentralized data-management environment relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions for managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)
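A toy sketch of the distributed data-mining pattern described here, assuming a simple frequent-item task: each miner agent builds a local model of its own partition, and only the models travel to a coordinator, never the raw records. The agent names and transactions are hypothetical, and this is not the authors' AOM artifact:

```python
from collections import Counter

class MinerAgent:
    """Mines one local dataset and reports item frequencies only,
    so raw records never leave their source."""
    def __init__(self, name, transactions):
        self.name, self.transactions = name, transactions

    def local_model(self):
        return Counter(item for t in self.transactions for item in t)

def coordinator(agents, min_support):
    """Merge the local models and keep globally frequent items."""
    total_tx = sum(len(a.transactions) for a in agents)
    merged = sum((a.local_model() for a in agents), Counter())
    return {i: c for i, c in merged.items() if c / total_tx >= min_support}

agents = [
    MinerAgent("siteA", [["milk", "bread"], ["milk"], ["bread", "eggs"]]),
    MinerAgent("siteB", [["milk", "eggs"], ["milk", "bread"]]),
]
print(coordinator(agents, min_support=0.5))
```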
Procedia PDF Downloads 432
25850 Arthroscopic Superior Capsular Reconstruction Using the Long Head of the Biceps Tendon (LHBT)
Authors: Ho Sy Nam, Tang Ha Nam Anh
Abstract:
Background: Rotator cuff tears are a common problem in the aging population. The prevalence of massive rotator cuff tears varies in some studies from 10% to 40%. Of irreparable rotator cuff tears (IRCTs), which are mostly associated with massive tear size, 79% are estimated to have recurrent tears after surgical repair. Recent studies have shown that superior capsule reconstruction (SCR) in massive rotator cuff tears can be an efficient technique, with encouraging clinical scores and preservation of glenohumeral stability. Superior capsule reconstruction techniques most commonly use either fascia lata autograft or dermal allograft, both of which have their own benefits and drawbacks (such as the potential for donor site issues, allergic reactions, and high cost). We propose a simple technique for superior capsule reconstruction that involves using the long head of the biceps tendon as a local autograft; therefore, the comorbidities related to graft harvesting are eliminated. The proximal portion of the long head of the biceps tendon is relocated to the footprint and secured as the SCR, serving both to stabilize the glenohumeral joint and to maintain the vascular supply to aid healing. Objective: The purpose of this study is to assess the clinical outcomes of patients with large to massive RCTs treated by SCR using the LHBT. Materials and methods: A study was performed of consecutive patients with large to massive RCTs who were treated by SCR using the LHBT between January 2022 and December 2022. We use one double-loaded suture anchor to secure the long head of the biceps to the middle of the footprint. Two more anchors are used to repair the rotator cuff using a single-row technique, placed anteriorly and posteriorly on the lateral side of the previously transposed LHBT. Results: The 3 men and 5 women had an average age of 61.25 years (range 48 to 76 years) at the time of surgery. The average follow-up was 8.2 months (6 to 10 months) after surgery. The average preoperative ASES score was 45.8, and the average postoperative ASES score was 85.83. The average postoperative UCLA score was 29.12. The VAS score improved from 5.9 to 1.12. The mean preoperative ROM of forward flexion and external rotation of the shoulder was 72° ± 16° and 28° ± 8°, respectively. The mean postoperative ROM of forward flexion and external rotation was 131° ± 22° and 63° ± 6°, respectively. There were no cases of progression of osteoarthritis or rotator cuff muscle atrophy. Conclusion: SCR using the LHBT can be considered a treatment option for patients with large or massive RC tears. It can restore superior glenohumeral stability and function of the shoulder joint and can be an effective procedure for selected patients, helping to avoid progression to cuff tear arthropathy.
Keywords: superior capsule reconstruction, large or massive rotator cuff tears, the long head of the biceps, stabilize the glenohumeral joint
Procedia PDF Downloads 77
25849 Finding Biclusters on Gene Expression Data of Lymphoma Based on Singular Value Decomposition and Hierarchical Clustering
Authors: Alhadi Bustaman, Soeganda Formalidin, Titin Siswantining
Abstract:
DNA microarray technology is used to analyze thousands of gene expression profiles simultaneously and is a very important tool for drug development and testing, function annotation, and cancer diagnosis. Various clustering methods have been used for analyzing gene expression data. However, when analyzing very large and heterogeneous collections of gene expression data, conventional clustering methods often cannot produce a satisfactory solution. Biclustering algorithms have been used as an alternative approach to identifying structures in gene expression data. In this paper, we introduce a transform technique based on singular value decomposition to identify a normalized matrix of gene expression data, followed by the Mixed-Clustering algorithm and the Lift algorithm, inspired by the node-deletion and node-addition phases proposed by Cheng and Church, based on agglomerative hierarchical clustering (AHC). An experimental study on standard datasets demonstrated the effectiveness of the algorithm on gene expression data.
Keywords: agglomerative hierarchical clustering (AHC), biclustering, gene expression data, lymphoma, singular value decomposition (SVD)
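A generic sketch of the SVD-then-AHC pipeline (not the authors' Mixed-Clustering or Lift algorithms): project genes and conditions into a truncated SVD space, cluster each side hierarchically, and read a bicluster off as the cross product of one gene cluster and one condition cluster. The planted-bicluster data are synthetic:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(42)
# Toy expression matrix: 30 genes x 12 conditions with one planted bicluster.
X = rng.normal(0, 1, (30, 12))
X[:10, :5] += 3.0                       # genes 0-9 co-expressed in conditions 0-4

# Truncated SVD as a denoised, normalized representation of both dimensions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
gene_factors = U[:, :k] * s[:k]         # genes in the reduced space
cond_factors = Vt[:k, :].T * s[:k]      # conditions in the reduced space

# AHC on each dimension separately; a bicluster is one gene cluster
# crossed with one condition cluster.
gene_labels = fcluster(linkage(gene_factors, method="ward"), t=2, criterion="maxclust")
cond_labels = fcluster(linkage(cond_factors, method="ward"), t=2, criterion="maxclust")

genes = np.where(gene_labels == gene_labels[0])[0]
conds = np.where(cond_labels == cond_labels[0])[0]
print("recovered bicluster:", genes, conds)
```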
Procedia PDF Downloads 278
25848 Nazca: A Context-Based Matching Method for Searching Heterogeneous Structures
Authors: Karine B. de Oliveira, Carina F. Dorneles
Abstract:
Structure-level matching is the problem of combining elements of a structure, which can be represented as entities, classes, XML elements, web forms, and so on. This is a challenge due to the large number of distinct representations of semantically similar structures. This paper describes a structure-based matching method applied to search for different representations in data sources, considering the similarity between the elements of two structures and the data source context. Using real data sources, we have conducted an experimental study comparing our approach with our baseline implementation and with another important schema matching approach. We demonstrate that our proposal reaches higher precision than the baseline.
Keywords: context, data source, index, matching, search, similarity, structure
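To make the idea of context-aware element matching concrete, here is a minimal sketch, not the Nazca method itself: element-name similarity is blended with the similarity of the surrounding context, approximated here by sibling-field names; the weights and field names are hypothetical:

```python
def tokens(s):
    """Split an identifier like 'customer_name' into lowercase word tokens."""
    return set(s.lower().replace("-", "_").split("_"))

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def match_score(elem_a, elem_b, w_name=0.6, w_ctx=0.4):
    """Blend element-name similarity with the similarity of the
    surrounding context (names of sibling elements)."""
    name_sim = jaccard(tokens(elem_a["name"]), tokens(elem_b["name"]))
    ctx_a = set().union(*(tokens(c) for c in elem_a["context"]))
    ctx_b = set().union(*(tokens(c) for c in elem_b["context"]))
    return w_name * name_sim + w_ctx * jaccard(ctx_a, ctx_b)

form_a = {"name": "customer_name", "context": ["customer_address", "customer_phone"]}
form_b = {"name": "client_name", "context": ["client_address", "client_phone"]}
print(f"score: {match_score(form_a, form_b):.2f}")  # context lifts a weak name match
```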
Procedia PDF Downloads 364
25847 Weight Regulation Mechanism on Bridges
Authors: S. Siddharth, Saravana Kumar
Abstract:
Metros across the world tend to have a large number of bridges, and there have been concerns about the safety of these bridges. As traffic in most cities in India is heterogeneous, trucks and heavy vehicles traverse our roads on an everyday basis, which will lead to structural damage in the long run. All bridges are designed with a maximum load limit, and this limit is seldom checked. We have hence come up with an idea to check the load of all vehicles entering a bridge and block the bridge with barricades if a vehicle surpasses the maximum load; this is done to catch hold of the perpetrators. By doing this, we can avoid further structural damage and also provide an effective way to enforce the law. If our solution is put in place, structural damage and accidents would be reduced a great deal, and it would also make the law enforcement job easier.
Keywords: heterogeneous, structural, load, law, heavy, vehicles
Procedia PDF Downloads 452
25846 Computational Aided Approach for Strut and Tie Model for Non-Flexural Elements
Authors: Mihaja Razafimbelo, Guillaume Herve-Secourgeon, Fabrice Gatuingt, Marina Bottoni, Tulio Honorio-De-Faria
Abstract:
The challenge of the research is to provide engineers with a robust, semi-automatic method for calculating optimal reinforcement for massive structural elements. In the absence of such a digital post-processing tool, design office engineers make intensive use of plate modelling, for which automatic post-processing is available. Plate models in massive areas, on the other hand, produce conservative results. In addition, the theoretical foundations of automatic post-processing tools for reinforcement are those of reinforced concrete beam sections. As long as there is no suitable alternative for the automatic post-processing of plates, optimal modelling and a significant improvement in the constructability of massive areas cannot be expected. The strut-and-tie method is commonly used in civil engineering, but its result remains highly subjective, depending on the calculation engineer. The tool developed will facilitate the work of supporting engineers in their choice of structure. The method implemented consists of defining a ground structure built on the basis of the main stresses resulting from an elastic analysis of the structure, and then starting an optimization of this structure according to the fully stressed design method. The first results yield a coherent initial network of connecting struts and ties, consistent with the cases encountered in the literature. The evolution of the tool will then make it possible to adapt the obtained latticework to the cracking states resulting from the loads applied during the life of the structure, including cyclic or dynamic loads. In addition, under the constructability constraint, a final reinforcement layout with an orthogonal arrangement and regulated spacing will be implemented in the tool.
Keywords: strut and tie, optimization, reinforcement, massive structure
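The fully stressed design update at the heart of the optimization can be sketched in a few lines: each member's area is resized so that it carries its force exactly at the allowable stress, and members whose required area collapses toward zero are pruned from the ground structure. The forces, allowable stress, and thresholds below are hypothetical:

```python
import numpy as np

# Hypothetical ground structure: member axial forces (kN) from an elastic
# analysis; positive = tension (tie), negative = compression (strut).
forces_kN = np.array([120.0, -80.0, 45.0, -10.0, 0.4])
sigma_allow = 435.0                      # allowable stress in MPa (assumed)

def fully_stressed_areas(forces_kN, sigma_allow, a_min=1.0):
    """One fully-stressed-design update: size each member so that
    |N| / A = sigma_allow, with a small floor to avoid zero areas.
    In the full tool this update alternates with a re-analysis that
    recomputes the member forces for the resized structure."""
    return np.maximum(np.abs(forces_kN) * 1e3 / sigma_allow, a_min)  # N/MPa = mm^2

areas = fully_stressed_areas(forces_kN, sigma_allow)
keep = areas > 10.0                      # prune near-idle members (mm^2 threshold)
print("areas (mm^2):", areas.round(1))
print("members kept in the latticework:", keep)
```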
Procedia PDF Downloads 141
25845 Machine Learning-Enabled Classification of Climbing Using Small Data
Authors: Nicholas Milburn, Yu Liang, Dalei Wu
Abstract:
Athlete performance scoring within the climbing domain presents interesting challenges, as the sport does not have an objective way to assign skill. Assessing skill levels within any sport is valuable, as it can be used to mark progress while training, and it can help an athlete choose appropriate climbs to attempt. Machine learning-based methods are popular for complex problems like this. The dataset available was composed of dynamic force data recorded during climbing; however, this dataset came with challenges such as data scarcity, imbalance, and temporal heterogeneity. Investigated solutions to these challenges include data augmentation, temporal normalization, conversion of the time series to the spectral domain, and cross-validation strategies. The investigated solutions to the classification problem included the lightweight machine learning classifiers KNN and SVM as well as deep learning with a CNN. The best performing model had an 80% accuracy. In conclusion, there seems to be enough information within climbing force data to accurately categorize climbers by skill.
Keywords: classification, climbing, data imbalance, data scarcity, machine learning, time sequence
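A small sketch of the spectral-domain conversion plus a lightweight classifier; the real dataset is not available here, so the two "skill" classes are simulated as smoother versus jerkier force traces:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_climb(skill, n=256):
    """Hypothetical force trace: lower-frequency loading for the more
    skilled class, plus measurement noise."""
    t = np.linspace(0, 10, n)
    freq = 0.5 if skill else 2.0
    return np.sin(2 * np.pi * freq * t) + rng.normal(0, 0.4, n)

labels = np.array([0] * 40 + [1] * 40)
X_time = np.array([make_climb(s) for s in labels])

# Spectral-domain conversion: magnitude of the real FFT as features,
# removing sensitivity to where events fall in time.
X_spec = np.abs(np.fft.rfft(X_time, axis=1))

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X_spec, labels, cv=5)
print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```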
Procedia PDF Downloads 142
25844 Unified Structured Process for Health Analytics
Authors: Supunmali Ahangama, Danny Chiang Choon Poo
Abstract:
Health analytics (HA) is used in healthcare systems for effective decision-making, management, and planning of healthcare and related activities. However, user resistance, the unique nature of medical data content and structure (including heterogeneous and unstructured data), and impromptu HA projects have held up progress in HA applications. Notably, the accuracy of outcomes depends on the skills and domain knowledge of the data analyst working on the healthcare data. The success of HA depends on having a sound process model, effective project management, and the availability of supporting tools. Thus, to overcome these challenges through an effective process model, we propose an HA process model with features from the Rational Unified Process (RUP) model and agile methodology.
Keywords: agile methodology, health analytics, unified process model, UML
Procedia PDF Downloads 506
25843 Generation of Automated Alarms for Plantwide Process Monitoring
Authors: Hyun-Woo Cho
Abstract:
Earlier detection of incipient abnormal operations in plant-wide process management is quite necessary in order to improve product quality and process safety. Generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, multivariate statistical approaches, and so on. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Unfortunately, the big data include measurement noise and unwanted variations unrelated to true process behavior. Thus, the elimination of such unnecessary patterns from the data is executed in the data-processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that the detection speed and performance were improved significantly, irrespective of the size and location of abnormal events.
Keywords: detection, monitoring, process data, noise
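As a simplified illustration of this kind of empirical monitoring (a linear PCA stand-in, not the paper's nonlinear method): noise directions are discarded by keeping only the leading components of normal operating data, and an alarm fires when the Hotelling T² statistic of a new sample exceeds an empirical limit. All data here are simulated:

```python
import numpy as np

rng = np.random.default_rng(7)

# Normal operating data: 500 samples x 10 variables, 2 latent factors + noise.
latent = rng.normal(0, 1, (500, 2))
mixing = rng.normal(0, 1, (2, 10))
X = latent @ mixing + rng.normal(0, 0.2, (500, 10))

mean, std = X.mean(0), X.std(0)
Xs = (X - mean) / std
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
P = Vt[:2].T                        # retained loadings: true process behavior
lam = (s[:2] ** 2) / (len(Xs) - 1)  # variances of the retained components

def t2_statistic(sample):
    """Hotelling T^2 on the retained components; noise directions are
    discarded so alarms track real process variation."""
    scores = ((sample - mean) / std) @ P
    return float(np.sum(scores**2 / lam))

limit = np.percentile([t2_statistic(x) for x in X], 99)  # empirical 99% limit
fault = X[0] + 4 * mixing[0]                             # simulated process drift
print(f"T2={t2_statistic(fault):.1f}, limit={limit:.1f}, "
      f"alarm={t2_statistic(fault) > limit}")
```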
Procedia PDF Downloads 252
25842 Ultrasonic Degradation of Acephate: Effects of Operating Parameters
Authors: Naina Deshmukh
Abstract:
With the wide production, consumption, and disposal of pesticides in the world, concerns over their human and environmental health impacts are rapidly growing. Among developing treatment technologies, ultrasonication, as an emerging and promising technology for the removal of pesticides from the aqueous environment, has attracted the attention of many researchers in recent years. The degradation of acephate in aqueous solutions was investigated under ultrasound irradiation (20 kHz) in the presence of the heterogeneous catalysts titanium dioxide (TiO2) and zinc oxide (ZnO). The influence of various factors such as the amount of catalyst (0.25, 0.5, 0.75, 1.0, 1.25 g/l), initial acephate concentration (100, 200, 300, 400 mg/l), and pH (3, 5, 7, 9, 11) was studied. The optimum catalyst doses were found to be 1 g/l of TiO2 and 1.25 g/l of ZnO for acephate at 100 mg/l, respectively. The maximum percentage degradation of acephate was observed at pH 11 for both the TiO2 and ZnO catalysts.
Keywords: ultrasonic degradation, acephate, TiO2, ZnO, heterogeneous catalyst
Procedia PDF Downloads 61
25841 Heterogeneous Catalytic Hydroesterification of Soybean Oil for Biodiesel Formation
Authors: O. Mowla, E. Kennedy, M. Stockenhuber
Abstract:
Finding alternative renewable sources of energy has attracted attention as a consequence of the limitations of traditional fossil fuel resources, the increasing crude oil price, and environmental concern over greenhouse gas emissions. Biodiesel, or Fatty Acid Methyl Esters (FAME), an alternative energy source, is synthesised from renewable sources such as vegetable oils and animal fats and can be produced from waste oils. FAME can be produced via hydroesterification of oils. The process involves two stages. In the first stage, fatty acids and glycerol are obtained by hydrolysis of the feedstock oil. In the second stage, the recovered fatty acids are esterified with an alcohol to methyl esters. The presence of a catalyst accelerates the rate of the hydroesterification reaction of oils. The overarching aim of this study is to find the effect of using zeolite as a catalyst in the heterogeneous hydroesterification of soybean oil. Both stages of the catalytic hydroesterification of soybean oil were conducted under atmospheric and high-pressure conditions using a reflux glass reactor and a Parr reactor, respectively. The effect of operating parameters such as temperature and reaction time on the overall yield of biodiesel formation was also investigated.
Keywords: biodiesel, heterogeneous catalytic hydroesterification, soybean oil, zeolite
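The two-stage chemistry described above can be summarized as follows, with R denoting the fatty acid chain:

```latex
\text{Stage 1 (hydrolysis):}\quad
\text{Triglyceride} + 3\,\mathrm{H_2O} \longrightarrow \text{Glycerol} + 3\,\mathrm{RCOOH}
```

```latex
\text{Stage 2 (esterification):}\quad
\mathrm{RCOOH} + \mathrm{CH_3OH} \xrightarrow{\ \text{catalyst}\ } \mathrm{RCOOCH_3} + \mathrm{H_2O}
```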
Procedia PDF Downloads 433
25840 A Review Paper on Data Security in Precision Agriculture Using Internet of Things
Authors: Tonderai Muchenje, Xolani Mkhwanazi
Abstract:
Precision agriculture uses a number of technologies, devices, protocols, and computing paradigms to optimize agricultural processes. Big data, artificial intelligence, cloud computing, and edge computing are all used to handle the huge amounts of data generated by precision agriculture. However, precision agriculture is still emerging and has a low level of security features. Furthermore, future solutions will demand data availability and accuracy as key points to help farmers, and security is important for building robust and efficient systems. Since precision agriculture comprises a wide variety and quantity of resources, security must address issues such as compatibility, constrained resources, and massive data. Moreover, conventional protection schemes used on the traditional internet may not be useful for agricultural systems, creating extra demands and opportunities. Therefore, this paper aims to review the state of the art of precision agriculture security, particularly in open-field agriculture, discussing its architecture, describing security issues, and presenting the major challenges and future directions.
Keywords: precision agriculture, security, IoT, EIDE
Procedia PDF Downloads 90