Search results for: electronic data interchange
25595 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products
Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry
Abstract:
The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to massification. Due to this obstacle, the diffusion of eco-design and LCA methods in the manufacturing sectors could be impossible. This article addresses the research question: how can the LCA method be adapted so that it can be generalized massively and its performance improved? This paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. To answer this, we proceeded in three steps: first, an analysis of the literature to identify existing automation methods. Given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain methods and combining them with new ideas and improvements. In the second part, our development of automated construction is presented (reconciliation and implementation of data). Finally, the LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating the process of data mapping and hence product modeling. This method is also able to complete the LCA process on its own within minutes; thus, the calculations and the LCA report are automatically generated. The tool developed has shown that automation by code is a viable solution to meet LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes the complexity of LCA. Indeed, the case study demonstrated the time savings associated with this methodology and, therefore, the opportunity to increase the number of LCA reports generated and to meet regulatory requirements. Moreover, this approach also shows the potential of the proposed method for a wide range of applications.
Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively
Procedia PDF Downloads 90
25594 Identifying E-Learning Components at North-West University, Mafikeng Campus
Authors: Sylvia Tumelo Nthutang, Nehemiah Mavetera
Abstract:
Educational institutions are under pressure from their competitors. Regulators and community groups need educational institutions to adopt appropriate business and organizational practices. Globally, educational institutions are now using e-learning as the best teaching and learning approach. E-learning is becoming the center of attention for learning institutions, educational systems and software developers. North-West University (NWU) is currently using eFundi, a Learning Management System (LMS). An LMS comprises all the information systems and procedures that add value to students' learning and support the learning material in text or any multimedia files. With various e-learning tools, students would be able to access all the materials related to the course as electronic copies. The study was tasked with identifying the e-learning components at the NWU, Mafikeng campus. A quantitative research methodology was used for data collection and descriptive statistics for data analysis. Activity Theory (AT) was used as the theory guiding the study. AT outlines the limitations of e-learning at the macro-organizational level (plans, guiding principles, campus-wide solutions) and the micro-organizational level (daily functioning practice, collaborative transformation, specific adaptation). In a technological environment, AT gives people an opportunity to move from concentrating on computers as the area of concern to understanding that technology is part of human activities. The findings identified the university's current IT tools and knowledge of e-learning elements. It was recommended that the university consider buying computer resources that consume less power and practice e-learning effectively.
Keywords: e-learning, information and communication technology (ICT), teaching, virtual learning environment
Procedia PDF Downloads 279
25593 The Economic Limitations of Defining Data Ownership Rights
Authors: Kacper Tomasz Kröber-Mulawa
Abstract:
This paper will address the topic of data ownership from an economic perspective, providing examples of the economic limitations of data property rights that have been identified using the methods and approaches of economic analysis of law. To properly build a background for the economic focus, a short overview of data and data ownership in the EU's legal system will first be provided. It will include a short introduction to their political and social importance and highlight relevant viewpoints, stressing the importance of a Single Market for data but also the far-reaching regulations of data governance and privacy (including the distinction between personal and non-personal data, and between data held by public bodies and by private businesses). The main discussion of this paper will build upon this briefly outlined legal basis as well as the methods and approaches of economic analysis of law.
Keywords: antitrust, data, data ownership, digital economy, property rights
Procedia PDF Downloads 82
25592 The Aromaticity of P-Substituted O-(N-Dialkyl)Aminomethylphenols
Authors: Khodzhaberdi Allaberdiev
Abstract:
Aromaticity, one of the most important concepts in organic chemistry, has attracted considerable interest from both experimentalists and theoreticians. The geometry optimizations of p-substituted o-(N-dialkyl)aminomethylphenols (o-DEAMPHs), XC₆H₅CH₂Y (X = p-OCH₃, CH₃, H, F, Cl, Br, COCH₃, COOCH₃, CHO, CN and NO₂; Y = o-N(C₂H₅)₂), have been performed in the gas phase at the B3LYP/6-311+G(d,p) level. The aromaticities of the considered molecules were investigated using different indices, including geometrical (HOMA and Bird), electronic (FLU, PDI and SA) and magnetic (NICS(0), NICS(1) and NICS(1)zz) indices. Linear dependencies were obtained between some aromaticity indices. The best correlation is observed between the Bird and PDI indices (R² = 0.9240). However, not all types of indices, or even different indices within the same type, correlate well with each other. Surprisingly, for the studied molecules, in which the geometrical and electronic indices cannot correctly give the aromaticity of the ring, the magnetism-based index successfully predicts the aromaticity of the systems. 1H NMR spectra of the compounds were obtained at the B3LYP/6-311+G(d,p) level using the GIAO method. An excellent linear correlation (R² = 0.9996) between the experimentally obtained 1H NMR chemical shifts of the hydrogen atom and those calculated using B3LYP/6-311+G(d,p) demonstrates a good assignment of the experimental chemical shift values to the calculated structures of the o-DEAMPHs. It is found that the best linear correlation with the Hammett substituent constants is observed for the NICS(1)zz index in comparison with the other indices: NICS(1)zz = -21.5552 + 1.1070σp- (R² = 0.9394). The presence of an intramolecular hydrogen bond in the studied molecules was also revealed to change the aromatic character of the substituted o-DEAMPHs. For R = NO₂, the HOMA index predicted a reduction in the π-electron delocalization of 3.4%, about double that observed for p-nitrophenol. The influence of intramolecular H-bonding on the aromaticity of the benzene ring in the ground state (S0) is described by equations between NICS(1)zz and the H-bond energies: experimental, Eₑₓₚ; predicted IR spectroscopic, Eν; and topological, EQTAIM, with correlation coefficients R² = 0.9666, R² = 0.9028 and R² = 0.8864, respectively. The NICS(1)zz index also correlates with the usual descriptors of the hydrogen bond, while the other indices do not give any meaningful results. The influence of intramolecular H-bond formation on the aromaticity of some substituted o-DEAMPHs is a criterion for considering the multidimensional character of aromaticity. Linear relationships were also revealed between NICS(1)zz and both the pyramidality of the nitrogen atom, ΣN(C₂H₅)₂, and the dihedral angle, φ CAr–CAr–CCH₂–N, characterizing out-of-plane properties. These results demonstrated the nonplanar structure of the o-DEAMPHs. Finally, when considering the dependencies of NICS(1)zz, data for R = H were excluded, because the NICS(1) and NICS(1)zz values are the most negative for the unsubstituted DEAMPH, indicating its highest aromaticity; that was not the case for the NICS(0) index.
Keywords: aminomethylphenols, DFT, aromaticity, correlations
Procedia PDF Downloads 181
25591 Protecting the Cloud Computing Data Through the Data Backups
Authors: Abdullah Alsaeed
Abstract:
Virtualized computing and cloud computing infrastructures are no longer fuzzy or marketing terms. They are a core reality in today's corporate Information Technology (IT) organizations. Hence, developing effective and efficient methodologies for data backup and data recovery is required more than ever. The purpose of data backup and recovery techniques is to assist organizations in strategizing their business continuity and disaster recovery approaches. In order to accomplish this strategic objective, a variety of mechanisms have been proposed in recent years. This research paper will explore and examine the latest techniques and solutions that provide data backup and restoration for cloud computing platforms.
Keywords: data backup, data recovery, cloud computing, business continuity, disaster recovery, cost-effective, data encryption
Procedia PDF Downloads 87
25590 Missing Link Data Estimation with Recurrent Neural Network: An Application Using Speed Data of Daegu Metropolitan Area
Authors: JaeHwan Yang, Da-Woon Jeong, Seung-Young Kho, Dong-Kyu Kim
Abstract:
In terms of ITS, information on link characteristics is an essential factor for planning and operations. But in practical cases, not every link has sensors installed on it. A link that does not have data on it is called a "Missing Link". The purpose of this study is to impute the data of these missing links. To obtain these data, this study applies a machine learning method. With the machine learning process, especially the deep learning process, missing link data can be estimated from the data of present links. For the deep learning process, this study uses a Recurrent Neural Network to handle time-series road data. As input data, Dedicated Short-range Communications (DSRC) data of Dalgubul-daero in the Daegu Metropolitan Area were fed into the learning process. The neural network structure takes 17 links with present data as input and uses 2 hidden layers to estimate the data of 1 missing link. As a result, the forecasted data of the target link show about 94% accuracy compared with the actual data.
Keywords: data estimation, link data, machine learning, road network
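The abstract specifies the network configuration (data from 17 instrumented links as input, 2 hidden layers, 1 missing link as output). A minimal sketch of such a recurrent model is shown below, assuming Keras and synthetic speed data in place of the DSRC feed; the layer sizes and training settings are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch: estimate one missing link's speed from 17 observed links.
# Keras and synthetic data stand in for the DSRC feed; sizes are illustrative.
import numpy as np
import tensorflow as tf

timesteps, n_observed = 12, 17   # 12 past time intervals, 17 links with sensors
X = np.random.rand(1000, timesteps, n_observed).astype("float32")  # observed speeds
y = np.random.rand(1000, 1).astype("float32")                      # missing-link speed

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(timesteps, n_observed)),
    tf.keras.layers.SimpleRNN(32, return_sequences=True),  # hidden layer 1
    tf.keras.layers.SimpleRNN(16),                          # hidden layer 2
    tf.keras.layers.Dense(1),                               # estimate for the missing link
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.predict(X[:1], verbose=0))  # imputed speed for the target link
```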
Procedia PDF Downloads 510
25589 Customer Data Analysis Model Using Business Intelligence Tools in Telecommunication Companies
Authors: Monica Lia
Abstract:
This article presents a customer data analysis model that uses business intelligence tools for data modelling, transformation, data visualization and dynamic report building. The analysis of an economic organization's customers is based on the information from the transactional systems of the organization. The paper presents how to develop the data model starting from the data that companies hold inside their own operational systems. The owned data can be transformed into useful information about customers using business intelligence tools. For a mature market, knowing the information inside the data and making forecasts for strategic decisions becomes more important. Business intelligence tools are used in business organizations as support for decision-making.
Keywords: customer analysis, business intelligence, data warehouse, data mining, decisions, self-service reports, interactive visual analysis, and dynamic dashboards, use cases diagram, process modelling, logical data model, data mart, ETL, star schema, OLAP, data universes
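The star schema and ETL steps listed in the keywords can be sketched briefly; the table and column names below are hypothetical and only illustrate how transactional records might be reshaped into a fact table with dimensions before loading into a BI tool.

```python
# Hypothetical ETL sketch: reshape operational transactions into a small star schema.
import pandas as pd

transactions = pd.DataFrame({          # as the data might sit in the operational system
    "customer_id": [1, 1, 2],
    "customer_segment": ["retail", "retail", "business"],
    "service": ["voice", "data", "data"],
    "amount": [10.0, 25.0, 40.0],
    "date": pd.to_datetime(["2024-01-05", "2024-01-06", "2024-01-06"]),
})

# Dimension tables: one row per customer / per service
dim_customer = transactions[["customer_id", "customer_segment"]].drop_duplicates()
dim_service = transactions[["service"]].drop_duplicates().reset_index(drop=True)
dim_service["service_id"] = dim_service.index

# Fact table: one row per transaction, referencing the dimensions by key
fact_revenue = (transactions.merge(dim_service, on="service")
                [["customer_id", "service_id", "date", "amount"]])

# OLAP-style aggregation that a report or dashboard would consume
print(fact_revenue.groupby("customer_id")["amount"].sum())
```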
Procedia PDF Downloads 430
25588 Design of a Telemetry, Tracking, and Command Radio-Frequency Receiver for Small Satellites Based on Commercial Off-The-Shelf Components
Authors: A. Lovascio, A. D’Orazio, V. Centonze
Abstract:
For several years now, the aerospace industry has been developing more and more small satellites for Low-Earth Orbit (LEO) missions. Such satellites have low manufacturing and launch costs since their size and weight are smaller than those of other types of satellites. However, because of size limitations, small satellites need integrated electronic equipment based on digital logic. Moreover, LEO missions require telecommunication modules with high throughput to transmit a large amount of data to earth in a short time. In order to meet such requirements, in this paper we propose a Telemetry, Tracking & Command module optimized through the use of Commercial Off-The-Shelf components. The proposed approach exploits the great flexibility offered by these components to reduce costs and optimize performance. The method has been applied in detail to the design of the front-end receiver, which has a low noise figure (1.5 dB) and low DC power consumption (smaller than 2 W). Such performance is particularly attractive since it allows fulfilling the stringent energy budget constraints that are typical of LEO small platforms.
Keywords: COTS, LEO, small-satellite, TT&C
Procedia PDF Downloads 131
25587 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions
Authors: K. Hardy, A. Maurushat
Abstract:
Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically used rigorous methodologies of empirical studies by research institutes, as well as less reliable immediate surveys/polls, often with limited sample sizes. As we move into the era of Big Data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is not why people did this for the last 10 years, but why they are doing this now and, if this is undesirable, how we can have an impact to promote change immediately. Big data analytics relies heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome.
Keywords: big data, open data, productivity, data governance
Procedia PDF Downloads 371
25586 A Review on Existing Challenges of Data Mining and Future Research Perspectives
Authors: Hema Bhardwaj, D. Srinivasa Rao
Abstract:
Technology for analysing, processing, and extracting meaningful data from enormous and complicated datasets can be termed "big data." The techniques of big data mining and big data analysis are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, improving sales, etc., because typical management tools cannot handle such complicated datasets. Big data brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This research paper offers an overview of the literature on big data mining and its process, along with its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. The study presents the ideas of data mining, data analysis and the knowledge discovery techniques that have recently been developed, together with practical application systems. The article's conclusion also includes a list of issues and difficulties for further research in the area, and the report discusses management's main big data and data mining challenges.
Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges
Procedia PDF Downloads 110
25585 A Systematic Review on Challenges in Big Data Environment
Authors: Rimmy Yadav, Anmol Preet Kaur
Abstract:
Big Data has demonstrated vast potential in streamlining operations, decision-making and spotting business trends in different fields, for example, manufacturing, finance and Information Technology. This paper gives a multi-disciplinary overview of the research issues in big data and of its processes, tools and systems related to privacy, data storage management, network and energy utilization, fault tolerance and data representation. Beyond this, the resulting challenges and opportunities available on the Big Data platform are outlined.
Keywords: big data, privacy, data management, network and energy consumption
Procedia PDF Downloads 312
25584 Survey on Big Data Stream Classification by Decision Tree
Authors: Mansoureh Ghiasabadi Farahani, Samira Kalantary, Sara Taghi-Pour, Mahboubeh Shamsi
Abstract:
Nowadays, the development of computer technology and its recent applications provide access to new types of data which have not been considered by traditional data analysts. Two particularly interesting characteristics of such data sets are their huge size and streaming nature. Incremental learning techniques have been used extensively to address the data stream classification problem. This paper presents a concise survey of the obstacles and the requirement issues in classifying data streams using decision trees. The most important issue is to maintain a balance between accuracy and efficiency: the algorithm should provide good classification performance with a reasonable response time.
Keywords: big data, data streams, classification, decision tree
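Many stream decision-tree learners of the kind such a survey covers (e.g., Hoeffding trees) balance accuracy and response time by deciding splits from a bounded sample using the Hoeffding bound. A minimal illustration of that bound follows; the numbers are arbitrary examples, not results from the survey.

```python
# Hoeffding bound: with probability 1 - delta, the observed mean of a random
# variable with range R is within eps of its true mean after n observations.
import math

def hoeffding_bound(value_range: float, delta: float, n: int) -> float:
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

# Example: information gain lies in [0, 1]; after 500 stream examples, a split
# attribute whose gain exceeds the runner-up by more than eps can be chosen
# with 99.9% confidence, without storing the whole stream.
eps = hoeffding_bound(value_range=1.0, delta=0.001, n=500)
print(f"epsilon after 500 examples: {eps:.4f}")
```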
Procedia PDF Downloads 521
25583 Establish a Company in Turkey for Foreigners
Authors: Mucahit Unal, Ibrahim Arslan
Abstract:
The New Turkish Commercial Code (TCC) No. 6102 was published in the Official Gazette on February 14, 2011. As stated in the New Turkish Commercial Code No. 6102 and Law No. 6103 on the Validity and Application of the Turkish Commercial Code, the TCC came into effect on July 1, 2012. The basic purposes of the TCC are to form corporate governance coherent with international standards, to provide transparency in company management, to align the Turkish Commercial Code rules with European Union legislation, and to simplify establishing a company for foreign investors in order to draw investments to the Turkish market. In this context, according to the TCC, joint stock companies and limited liability companies can be established with only one single shareholder; the single shareholder can be a foreigner; all board of director members can be foreigners; and all shareholders and board of director members can be non-resident foreigners. Additionally, the TCC does not require physical participation in general shareholders' and board members' meetings: it allows general shareholders' and board members' meetings to be held in electronic form, and resolutions of these meetings may also be approved via electronic signatures. Through this amendment, foreign investors no longer have to deal with red tape. This amendment also means the TCC prevents foreign companies from incurring unnecessary travel expenses. In accordance with all these amendments, investing in the Turkish market is easy, simple and transparent for foreign investors, and investors can establish a company in Turkey irrespective of nationality or place of residence. This article aims to analyze 'Establish a Company in Turkey for Foreigners' and to inform investors about investing (especially establishing a company) in the Turkish market.
Keywords: establish a company, foreigner investors, invest in Turkish market, Turkish commercial code
Procedia PDF Downloads 263
25582 Robust and Dedicated Hybrid Cloud Approach for Secure Authorized Deduplication
Authors: Aishwarya Shekhar, Himanshu Sharma
Abstract:
Data deduplication is one of the important data compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. In this process, duplicate data are expunged, leaving only one copy, i.e. a single instance of the data, to be stored; however, indexing of each piece of data is still maintained. Data deduplication is an approach for minimizing the amount of storage space an organization requires to retain its data. In most companies, the storage systems carry identical copies of numerous pieces of data. Deduplication eliminates these additional copies by saving just one copy of the data and replacing the other copies with pointers that lead back to the primary copy. To avoid this duplication of data and to preserve confidentiality in the cloud, we apply the concept of a hybrid cloud. A hybrid cloud is a fusion of at least one public and one private cloud. As a proof of concept, we implement Java code which provides security as well as removes all types of duplicated data from the cloud.
Keywords: confidentiality, deduplication, data compression, hybridity of cloud
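The single-instance storage described above can be illustrated with content hashing: each unique piece of data is stored once under its hash, and further copies become pointers to that hash. The sketch below is a simplified stand-in, not the authors' Java implementation, and it omits the confidentiality/encryption layer.

```python
# Simplified content-addressed deduplication: store one copy per unique hash,
# keep pointers (hashes) for every logical file. Not the authors' Java code.
import hashlib

store = {}   # hash -> single stored copy of the data
index = {}   # file name -> pointer (hash) back to the primary copy

def put(name: str, data: bytes) -> None:
    digest = hashlib.sha256(data).hexdigest()
    if digest not in store:      # first time this content is seen
        store[digest] = data
    index[name] = digest         # duplicates only add a pointer

def get(name: str) -> bytes:
    return store[index[name]]

put("report_v1.txt", b"quarterly figures")
put("report_copy.txt", b"quarterly figures")   # duplicate content
print(len(store), "stored copy;", len(index), "logical files")
print(get("report_copy.txt"))
```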
Procedia PDF Downloads 383
25581 A Review of Machine Learning for Big Data
Authors: Devatha Kalyan Kumar, Aravindraj D., Sadathulla A.
Abstract:
Big data are now rapidly expanding in all engineering and science domains and many others. The potential of large or massive data is undoubtedly significant, and it makes sense to require new ways of thinking and new learning techniques to address the various big data challenges. Machine learning is continuously unleashing its power in a wide range of applications. This paper reviews the latest advances in research on machine learning for big data processing. First, it covers the machine learning methods used in recent studies, such as deep learning, representation learning, transfer learning, active learning, and distributed and parallel learning. It then focuses on the challenges and possible solutions of machine learning for big data.
Keywords: active learning, big data, deep learning, machine learning
Procedia PDF Downloads 446
25580 The Use of Mobile Phones by Refugees to Create Social Connectedness: A Literature Review
Authors: Sarah Vuningoma, Maria Rosa Lorini, Wallace Chigona
Abstract:
Mobile phones are one of the main tools for promoting the wellbeing of people and supporting the integration of communities on the margins, such as refugees. Information and Communication Technology has the potential to contribute towards reducing isolation and loneliness, and to assist in improving interpersonal relations and fostering acculturation processes. Therefore, the use of mobile phones by refugees might contribute to their social connectedness. This paper aims to demonstrate how the existing literature has shown that the use of mobile phones by refugees should engender social connectedness amongst them. Data for the study are drawn from the existing literature; we searched a number of electronic databases for papers published between 2010 and 2019. The main findings of the study relate to the use of mobile phones by refugees to (i) create a sense of belonging, (ii) maintain relationships, and (iii) advance the acculturation process. The analysis highlighted a gap in the research on refugees and social connectedness. In particular, further studies should consider evaluating the differences between those who have a refugee permit, those who are waiting for the refugee permit, and those whose request was denied.
Keywords: belonging, mobile phones, refugees, social connectedness
Procedia PDF Downloads 204
25579 Strengthening Legal Protection of Personal Data through Technical Protection Regulation in Line with Human Rights
Authors: Tomy Prihananto, Damar Apri Sudarmadi
Abstract:
Indonesia recognizes the right to privacy as a human right. Indonesia provides legal protection over data management activities because the protection of personal data is a part of human rights. This paper aims to describe the arrangement of data management and data protection in Indonesia. This paper is descriptive research with a qualitative approach, collecting data through a literature study. The result of this paper is a comprehensive arrangement of data that has been set up as a technical requirement of data protection by encryption methods. Arrangements on encryption and on the protection of personal data are mutually reinforcing arrangements in the protection of personal data. Indonesia has two important and immediately enacted laws that provide protection for the privacy of information, which is part of human rights.
Keywords: Indonesia, protection, personal data, privacy, human rights, encryption
Procedia PDF Downloads 183
25578 Detection of Triclosan in Water Based on Nanostructured Thin Films
Authors: G. Magalhães-Mota, C. Magro, S. Sério, E. Mateus, P. A. Ribeiro, A. B. Ribeiro, M. Raposo
Abstract:
Triclosan [5-chloro-2-(2,4-dichlorophenoxy)phenol], belonging to the class of Pharmaceuticals and Personal Care Products (PPCPs), is a broad-spectrum antimicrobial agent and bactericide. Because of its antimicrobial efficacy, it is widely used in personal health and skin care products, such as soaps, detergents, hand cleansers, cosmetics, toothpastes, etc. However, it has been considered to disrupt the endocrine system, for instance, thyroid hormone homeostasis and possibly the reproductive system. Considering the widespread use of triclosan, it is expected that environmental and food safety problems regarding triclosan will increase dramatically. Triclosan has been found in river water samples in both North America and Europe and is likely widely distributed wherever triclosan-containing products are used. Although significant amounts are removed in sewage plants, considerable quantities remain in the sewage effluent, initiating widespread environmental contamination. Triclosan undergoes bioconversion to methyl-triclosan, which has been demonstrated to bioaccumulate in fish. In addition, triclosan has been found in human urine samples from persons with no known industrial exposure and in significant amounts in samples of mother's milk, demonstrating its presence in humans. The action of sunlight in river water is known to turn triclosan into dioxin derivatives and raises the possibility of pharmacological dangers not envisioned when the compound was originally utilized. The aim of this work is to detect low concentrations of triclosan in an aqueous complex matrix through the use of a sensor array system, following the electronic tongue concept based on impedance spectroscopy. To achieve this goal, we selected the appropriate molecules for the sensor so that there is a high affinity for triclosan and the sensitivity ensures the detection of concentrations of at least the nanomolar level. Thin films of organic molecules and oxides have been produced by the layer-by-layer (LbL) technique and sputtered onto glass solid supports already covered by gold interdigitated electrodes. By submerging the films in complex aqueous solutions with different concentrations of triclosan, resistance and capacitance values were obtained at different frequencies. The preliminary results showed that an array of interdigitated electrode sensors, coated or uncoated with different LbL films, can be used to detect TCS traces in aqueous solutions over a wide concentration range, from 10⁻¹² to 10⁻⁶ M. The PCA method was applied to the measured data in order to differentiate the solutions with different concentrations of TCS. Moreover, it was also possible to trace a curve, the plot of the logarithm of resistance versus the logarithm of concentration, which allowed us to fit the plotted data points with a decreasing straight line with a slope of 0.022 ± 0.006, which corresponds to the best sensitivity of our sensor. To find the sensor resolution near the smallest concentration (Cs) used, 1 pM: the minimum value that can be measured with resolution is 0.006, so ∆logC = 0.006/0.022 = 0.273 and, therefore, C − Cs ~ 0.9 pM. This leads to a sensor resolution of 0.9 pM for the smallest concentration used, 1 pM. This attained detection limit is lower than the values obtained in the literature.
Keywords: triclosan, layer-by-layer, impedance spectroscopy, electronic tongue
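The resolution estimate quoted at the end of the abstract (∆logC = 0.006/0.022 ≈ 0.273, giving C − Cs ≈ 0.9 pM near Cs = 1 pM) can be reproduced directly. The sketch below reworks that arithmetic and a generic log-log fit on synthetic data; it is not the authors' analysis code.

```python
# Rework of the resolution arithmetic reported above (illustrative only).
import numpy as np

slope, slope_err = 0.022, 0.006           # |slope| and uncertainty of log R vs log C
delta_logC = slope_err / slope            # smallest resolvable step in log-concentration
Cs = 1e-12                                # smallest concentration used, 1 pM
resolution = Cs * (10 ** delta_logC - 1)  # C - Cs near Cs
print(f"delta logC = {delta_logC:.3f}, resolution ~ {resolution * 1e12:.1f} pM")

# Generic least-squares fit of the described log-log calibration curve (synthetic data)
conc = np.logspace(-12, -6, 7)
resistance = 1e5 * conc ** (-0.022) * (1 + 0.01 * np.random.randn(7))
fit_slope, intercept = np.polyfit(np.log10(conc), np.log10(resistance), 1)
print(f"fitted slope: {fit_slope:.3f}")
```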
Procedia PDF Downloads 252
25577 The Validity of Integrating the Concept of Servant Leadership in the Discourse of Poverty Eradication
Authors: Ocen Walter Onen
Abstract:
In 2018, the World Bank reported that approximately 8.6% of the global population was languishing in multidimensional poverty. This reality both challenged and motivated the research on the topic above. This research critically examined the validity of integrating the concept of servant leadership into the discourse of poverty eradication. The researcher applied a documentary research methodology; therefore, relevant literature, both printed and electronic, was analyzed, and the desired data were obtained to enrich the discussion. The main finding from the research shows that the concept of 'servant leadership', despite being paradoxical in nature, has the necessary potential to accelerate the effort of eliminating multidimensional poverty in any given context. Based on that, the researcher recommended that state-actors, multi-national corporations, development organizations such as the United Nations, and other agencies working to make poverty history in our generation should both prioritize and promote the integration of the concept of servant leadership in their policy formation, organizational leadership and management, and project design, implementation and evaluation of poverty-eradication initiatives.
Keywords: multidimensional poverty, poverty eradication, servant leadership, United Nations
Procedia PDF Downloads 80
25576 Classifying Students for E-Learning in Information Technology Course Using ANN
Authors: Sirilak Areerachakul, Nat Ployong, Supayothin Na Songkla
Abstract:
This research's objective is to select the most accurate model by using the Neural Network technique as a way to filter potential students who enroll in the IT course through electronic learning at Suan Suanadha Rajabhat University. It is designed to help students select the appropriate courses by themselves. The result showed that the most accurate model was 100-fold cross-validation, which reached 73.58% accuracy.
Keywords: artificial neural network, classification, students, e-learning
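A minimal sketch of the reported model selection step, using scikit-learn as a stand-in for whatever tool the authors used: the 100-fold cross-validation mirrors the abstract, while the data, features and layer size are assumptions for illustration.

```python
# Illustrative sketch: evaluate an ANN classifier with 100-fold cross-validation.
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.random((500, 8))        # hypothetical student attributes (scores, IT skills, etc.)
y = rng.integers(0, 2, 500)     # suitable / not suitable for the e-learning course

model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=300)
cv = KFold(n_splits=100, shuffle=True, random_state=0)   # "100 folds" as in the abstract
scores = cross_val_score(model, X, y, cv=cv)
print(f"mean accuracy: {scores.mean():.2%}")
```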
Procedia PDF Downloads 426
25575 The Various Legal Dimensions of Genomic Data
Authors: Amy Gooden
Abstract:
When human genomic data is considered, this is often done through only one dimension of the law, or the interplay between the various dimensions is not considered, thus providing an incomplete picture of the legal framework. This research considers and analyzes the various dimensions in South African law applicable to genomic sequence data – including property rights, personality rights, and intellectual property rights. The effective use of personal genomic sequence data requires the acknowledgement and harmonization of the rights applicable to such data.
Keywords: artificial intelligence, data, law, genomics, rights
Procedia PDF Downloads 138
25574 Big Brain: A Single Database System for a Federated Data Warehouse Architecture
Authors: X. Gumara Rigol, I. Martínez de Apellaniz Anzuola, A. Garcia Serrano, A. Franzi Cros, O. Vidal Calbet, A. Al Maruf
Abstract:
Traditional federated architectures for data warehousing work well when corporations have existing regional data warehouses and there is a need to aggregate data at a global level. Schibsted Media Group has been maturing from a decentralised organisation into a more globalised one and needed to build some of the regional data warehouses for some brands at the same time as the global one. In this paper, we present the architectural alternatives studied and why a custom federated approach was the recommendation chosen to go further with the implementation. Although the data warehouses are logically federated, the implementation uses a single database system, which presented many advantages, like cost reduction and improved data access for global users, allowing consumers of the data to have a common data model for detailed analysis across different geographies and a flexible layer for local specific needs in the same place.
Keywords: data integration, data warehousing, federated architecture, Online Analytical Processing (OLAP)
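The idea of a logical federation hosted in one physical database can be sketched with views: each region keeps its own table, and a global view unions them so that global users query a single common model. The schema below is hypothetical, not Schibsted's actual design.

```python
# Hypothetical sketch: regional marts federated logically by a global view,
# all hosted in one physical database (SQLite used for illustration).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dw_nordics_pageviews (brand TEXT, day TEXT, views INTEGER);
    CREATE TABLE dw_iberia_pageviews  (brand TEXT, day TEXT, views INTEGER);
    INSERT INTO dw_nordics_pageviews VALUES ('brand_a', '2024-01-01', 120);
    INSERT INTO dw_iberia_pageviews  VALUES ('brand_b', '2024-01-01', 340);

    -- the federated layer: one common model over all regional marts
    CREATE VIEW dw_global_pageviews AS
        SELECT 'nordics' AS region, brand, day, views FROM dw_nordics_pageviews
        UNION ALL
        SELECT 'iberia'  AS region, brand, day, views FROM dw_iberia_pageviews;
""")
query = "SELECT region, brand, SUM(views) FROM dw_global_pageviews GROUP BY region, brand"
for row in con.execute(query):
    print(row)
```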
Procedia PDF Downloads 236
25573 Graphene-reinforced Metal-organic Framework Derived Cobalt Sulfide/Carbon Nanocomposites as Efficient Multifunctional Electrocatalysts
Authors: Yongde Xia, Laicong Deng, Zhuxian Yang
Abstract:
Developing cost-effective electrocatalysts for the oxygen reduction reaction (ORR), oxygen evolution reaction (OER) and hydrogen evolution reaction (HER) is vital in energy conversion and storage applications. Herein, we report a simple method for the synthesis of graphene-reinforced cobalt sulfide/carbon nanocomposites and the evaluation of their electrocatalytic performance for typical electrocatalytic reactions. Nanocomposites of cobalt sulfide embedded in N, S co-doped porous carbon and graphene (CoS@C/Graphene) were generated via simultaneous sulfurization and carbonization of one-pot synthesized graphite oxide-ZIF-67 precursors. The obtained CoS@C/Graphene nanocomposite was characterized by X-ray diffraction, Raman spectroscopy, thermogravimetric analysis-mass spectrometry, scanning electron microscopy, transmission electron microscopy, X-ray photoelectron spectroscopy and gas sorption. It was found that the cobalt sulfide nanoparticles were homogeneously dispersed in the in-situ formed N, S co-doped porous carbon/graphene matrix. The CoS@C/10Graphene composite not only shows excellent electrocatalytic activity toward the ORR, with a high onset potential of 0.89 V, a four-electron pathway and superior durability, maintaining 98% of the current after continuously running for around 5 hours, but also exhibits good performance for the OER and HER, owing to the improved electrical conductivity, increased catalytic active sites and connectivity between the electrocatalytically active cobalt sulfide and the carbon matrix. This work offers a new approach for the development of novel multifunctional nanocomposites for the next generation of energy conversion and storage applications.
Keywords: MOF derivative, graphene, electrocatalyst, oxygen reduction reaction, oxygen evolution reaction, hydrogen evolution reaction
Procedia PDF Downloads 50
25572 A Review Paper on Data Mining and Genetic Algorithm
Authors: Sikander Singh Cheema, Jasmeen Kaur
Abstract:
In this paper, the concept of data mining is summarized and one of its important processes, i.e. KDD (knowledge discovery in databases), is described. Data mining based on the Genetic Algorithm is reviewed, and ways to achieve Genetic Algorithm-based data mining are surveyed. This paper also conducts a formal review of the area of data mining tasks and genetic algorithms in various fields.
Keywords: data mining, KDD, genetic algorithm, descriptive mining, predictive mining
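One common meeting point of the two areas, and a typical topic in such surveys, is using a genetic algorithm for feature selection prior to mining. The toy sketch below only illustrates that pattern; it is not a method taken from the reviewed papers.

```python
# Toy genetic algorithm for feature selection (illustrative pattern only).
import random

N_FEATURES, POP, GENERATIONS = 10, 20, 30
useful = {1, 4, 7}                     # pretend these features carry the signal

def fitness(mask):
    # reward picking useful features, penalise large feature sets
    return sum(1 for i in useful if mask[i]) - 0.1 * sum(mask)

def crossover(a, b):
    cut = random.randrange(1, N_FEATURES)
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.1):
    return [bit ^ (random.random() < rate) for bit in mask]

population = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]                       # selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]        # crossover + mutation
    population = parents + children

best = max(population, key=fitness)
print("selected features:", [i for i, bit in enumerate(best) if bit])
```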
Procedia PDF Downloads 591
25571 Data-Mining Approach to Analyzing Industrial Process Information for Real-Time Monitoring
Authors: Seung-Lock Seo
Abstract:
This work presents a data-mining empirical monitoring scheme for industrial processes with partially unbalanced data. Measurement data from good operations are relatively easy to gather, but for unusual special events or faults it is generally difficult to collect process information, and it is almost impossible to analyze some of the noisy data of industrial processes. In such cases, noise filtering techniques can be used to enhance process monitoring performance on a real-time basis. In addition, pre-processing of raw process data is helpful for eliminating unwanted variation in industrial process data. In this work, the performance of various monitoring schemes was tested and demonstrated for discrete batch process data. The results showed that the monitoring performance was improved significantly in terms of the monitoring success rate for given process faults.
Keywords: data mining, process data, monitoring, safety, industrial processes
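The combination of noise filtering and real-time monitoring referred to above can be illustrated generically; a moving-average filter with 3-sigma limits is used here as a stand-in, since the abstract does not specify which filtering techniques were tested.

```python
# Generic stand-in for the filtering + monitoring idea (not the paper's exact scheme):
# smooth a noisy batch-process signal, then flag points outside 3-sigma limits
# estimated from known-good operation.
import numpy as np

rng = np.random.default_rng(1)
good = 50 + rng.normal(0, 1, 300)            # measurements from good operation
new = 50 + rng.normal(0, 1, 100)
new[60:] += 4                                # a fault appears at sample 60

def moving_average(x, w=5):
    return np.convolve(x, np.ones(w) / w, mode="valid")

center, sigma = moving_average(good).mean(), moving_average(good).std()
smoothed = moving_average(new)
alarms = np.where(np.abs(smoothed - center) > 3 * sigma)[0]
print("first alarm at smoothed sample:", alarms[0] if alarms.size else None)
```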
Procedia PDF Downloads 401
25570 A Framework for Incorporating Non-Linear Degradation of Conductive Adhesive in Environmental Testing
Authors: Kedar Hardikar, Joe Varghese
Abstract:
Conductive adhesives have found wide-ranging applications in the electronics industry, ranging from fixing a defective conductor on a printed circuit board (PCB) and attaching an electronic component in an assembly to protecting electronic components by the formation of a "Faraday cage." The reliability requirements for the conductive adhesive vary widely depending on the application and expected product lifetime. While the conductive adhesive is required to maintain structural integrity, the electrical performance of the associated sub-assembly can be affected by the degradation of the conductive adhesive. The degradation of the adhesive is dependent upon the highly varied use case. The conventional approach to assess the reliability of the sub-assembly involves subjecting it to standard environmental test conditions such as high temperature/high humidity, thermal cycling, and high-temperature exposure, to name a few. In order to enable projection of test data and observed failures to predict field performance, systematic development of an acceleration factor between the test conditions and field conditions is crucial. Common acceleration factor models such as the Arrhenius model are based on rate kinetics and typically rely on an assumption of linear degradation in time for a given condition and test duration. The application of interest in this work involves conductive adhesive used in an electronic circuit of a capacitive sensor. The degradation of the conductive adhesive in a high-temperature and high-humidity environment is quantified by the capacitance values. Under such conditions, the use of established models such as the Hallberg-Peck model or the Eyring model to predict time to failure in the field typically relies on a linear degradation rate. In this particular case, it is seen that the degradation is nonlinear in time and exhibits a square-root-of-time dependence. It is also shown that, for the mechanism of interest, the presence of moisture is essential, and the dominant mechanism driving the degradation is the diffusion of moisture. In this work, a framework is developed to incorporate nonlinear degradation of the conductive adhesive into the development of an acceleration factor. This method can be extended to applications where nonlinearity in the degradation rate can be adequately characterized in tests. It is shown that, depending on the expected product lifetime, the use of the conventional linear degradation approach can overestimate or underestimate the field performance. This work provides guidelines for the suitability of the linear degradation approximation for such varied applications.
Keywords: conductive adhesives, nonlinear degradation, physics of failure, acceleration factor model
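The consequence of the square-root-of-time behaviour for an acceleration factor can be made concrete: if degradation grows as k√t, a fixed failure threshold D_f is reached at t_f = (D_f/k)², so the acceleration factor between test and field becomes the square of the rate ratio, rather than the rate ratio itself that a linear assumption would give. The short numeric sketch below uses an arbitrary rate ratio purely for illustration.

```python
# Illustrative comparison of acceleration factors for linear vs sqrt(t) degradation.
# Degradation model: D(t) = k * t          (linear assumption)
#                    D(t) = k * sqrt(t)    (moisture-diffusion-driven, as observed)
# Failure when D reaches a threshold D_f; AF = t_field / t_test.
rate_ratio = 4.0    # assumed k_test / k_field (test degrades 4x faster); arbitrary example

af_linear = rate_ratio          # t_f = D_f / k        ->  AF = k_test / k_field
af_sqrt = rate_ratio ** 2       # t_f = (D_f / k)**2   ->  AF = (k_test / k_field)**2

print(f"AF assuming linear degradation : {af_linear:.0f}x")
print(f"AF with sqrt(t) degradation    : {af_sqrt:.0f}x")
# Projecting field life with the wrong AF can over- or underestimate field
# performance, depending on the expected product lifetime, as the abstract notes.
```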
Procedia PDF Downloads 135
25569 Impact of Obesity on Outcomes in Breast Reconstruction: A Systematic Review and Meta-Analysis
Authors: Adriana C. Panayi, Riaz A. Agha, Brady A. Sieber, Dennis P. Orgill
Abstract:
Background: Increased rates of both breast cancer and obesity have resulted in more women seeking breast reconstruction. These women may be at increased risk for perioperative complications. A systematic review was conducted to assess the outcomes in obese women who have undergone breast reconstruction following mastectomy. Methods: Cochrane, PUBMED and EMBASE electronic databases were screened and data was extracted from included studies. The clinical outcomes assessed were surgical complications, medical complications, length of postoperative hospital stay, reoperation rate and patient satisfaction. Results: 33 studies met the inclusion criteria for the review and 29 provided enough data to be included in the meta-analysis (71368 patients, 20061 of which were obese). Obese women were 2.3 times more likely to experience surgical complications (95 percent CI 2.19 to 2.39; P < 0.00001), 2.8 times more likely to have medical complications (95 percent CI 2.41 to 3.26; P < 0.00001) and had a 1.9 times higher risk of reoperation (95 percent CI 1.75 to 2.07; P < 0.00001). The most common complication, wound dehiscence, was 2.5 times more likely in obese women (95 percent CI 1.80 to 3.52; P < 0.00001). Sensitivity analysis confirmed that obese women were more likely to experience surgical complications (RR 2.36, 95% CI 2.22–2.52; P < 0.00001). Conclusions: This study provides evidence that obesity increases the risk of complications in both implant and autologous reconstruction. Additional prospective and observational studies are needed to determine if weight reduction prior to reconstruction reduces the perioperative risks associated with obesity.
Keywords: autologous reconstruction, breast cancer, breast reconstruction, literature review, obesity, oncology, prosthetic reconstruction
Procedia PDF Downloads 308
25568 Fueling Efficient Reporting And Decision-Making In Public Health With Large Data Automation In Remote Areas, Neno Malawi
Authors: Wiseman Emmanuel Nkhomah, Chiyembekezo Kachimanga, Julia Huggins, Fabien Munyaneza
Abstract:
Background: Partners In Health – Malawi introduced an operational research study called the Primary Health Care (PHC) Surveys in 2020, which seeks to assess the progress of the delivery of care in the district. The study consists of 5 long surveys, namely: Facility Assessment, General Patient, Provider, Sick Child, and Antenatal Care (ANC), primarily conducted in 4 health facilities in Neno district. These facilities include Neno district hospital, Dambe health centre, Chifunga and Matope. Usually, these annual surveys are conducted from January, and the target is to present the final report by June. Once data are collected and analyzed, there is a series of reviews that take place before reaching the final report. In the first place, the manual process took over 9 months to present the final report, and initial findings reported that only about 76.9% of the data added up when cross-checked with paper-based sources. Purpose: The aim of this approach is to move away from manually pulling the data, doing fresh analysis, and reporting, which is often associated not only with delays in reporting and inconsistencies but also with poor quality of data if not done carefully. This automation approach was meant to utilize features of new technologies to create visualizations, reports, and dashboards in Power BI that are fetched directly from the data source, CommCare, and hence require only a single click of a 'refresh' button to have the updated information populated in visualizations, reports, and dashboards at once. Methodology: We transformed paper-based questionnaires into electronic ones using the CommCare mobile application. We further connected the CommCare mobile app directly to Power BI using an Application Programming Interface (API) connection as the data pipeline. This provided the chance to create visualizations, reports, and dashboards in Power BI. Contrary to the process of manually collecting data in paper-based questionnaires, entering them in ordinary spreadsheets, and conducting analysis every time when preparing for reporting, the team utilized CommCare and Microsoft Power BI technologies. We utilized validations and logic in CommCare to capture data with fewer errors. We utilized Power BI features to host the reports online by publishing them in a cloud-computing process. We switched from sharing ordinary report files to sharing a link with potential recipients, hence giving them the freedom to dig deep into extra findings within the Power BI dashboards and also the freedom to export to any format of their choice. Results: This data automation approach reduced research timelines from the initial 9 months' duration to 5 months. It also improved the quality of the data findings from the original 76.9% to 98.9%. This brought confidence to draw conclusions from the findings that help in decision-making and gave opportunities for further research. Conclusion: These results suggest that automating the research data process has the potential to reduce the overall amount of time spent and improve the quality of the data. On this basis, the concept of data automation should be taken into serious consideration when conducting operational research for efficiency and decision-making.
Keywords: reporting, decision-making, power BI, commcare, data automation, visualizations, dashboards
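The API data pipeline step (pulling submitted form data from CommCare so that Power BI, or any downstream tool, can refresh from it) might look roughly like the sketch below. The endpoint path, project domain and API-key header are assumptions for illustration and should be checked against CommCare's current API documentation; this is not the team's actual pipeline code.

```python
# Hypothetical sketch of pulling PHC survey submissions from a CommCare-style
# REST API into a flat table. Endpoint, domain and auth header are assumptions.
import requests
import pandas as pd

BASE = "https://www.commcarehq.org/a/<project-domain>/api/v0.5/form/"   # assumed pattern
HEADERS = {"Authorization": "ApiKey user@example.org:<api-key>"}        # assumed auth scheme

records, url = [], BASE
while url:
    payload = requests.get(url, headers=HEADERS, timeout=30).json()
    records.extend(obj.get("form", {}) for obj in payload.get("objects", []))
    url = payload.get("meta", {}).get("next")    # follow pagination until exhausted

df = pd.json_normalize(records)
df.to_csv("phc_survey_forms.csv", index=False)   # Power BI (or any tool) refreshes from here
print(f"pulled {len(df)} form submissions")
```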
Procedia PDF Downloads 116
25567 Knowledge and Information Sharing in the Opinion of the Polish Academic Community
Authors: Marzena Świgoń
Abstract:
The purpose of this paper is to describe the perceptions of knowledge and information sharing by the Polish academic community. An electronic questionnaire was used to gather the opinions of respondents. The presented results are part of the findings of empirical studies carried out amongst academics from various types of universities and academic institutions located throughout Poland.
Keywords: academics, information sharing, knowledge sharing, scholarly communication
Procedia PDF Downloads 414
25566 A Comparative Study of the Proposed Models for the Components of the National Health Information System
Authors: M. Ahmadi, Sh. Damanabi, F. Sadoughi
Abstract:
The National Health Information System plays an important role in ensuring timely and reliable access to health information, which is essential for the strategic and operational decisions that improve health and the quality and effectiveness of health care. In other words, by using the National Health Information System you can improve the quality of health data, information and knowledge used to support decision-making at all levels and areas of the health sector. Since full identification of the components of this system seems necessary for better planning and management of the influential factors of performance, in this study different perspectives on the components of this system are explored comparatively. Methods: This is a descriptive, comparative study. The study population includes printed and electronic documents containing components of the national health information system in three parts: input, process, and output. In this context, searches for information using library resources and the internet were conducted, and the data analysis was expressed using comparative tables and qualitative data. Results: The findings showed that there are three different perspectives presenting the components of the national health information system: the Lippeveld, Sauerborn, and Bodart model of 2000; the Health Metrics Network (HMN) model from the World Health Organization of 2008; and Gattini's 2009 model. In the input section (resources and structure), all three models outlined above require components of management and leadership, planning and design of programs, supply of staff, software and hardware facilities, and equipment. In addition, in the 'process' section, the three models point up the actions ensuring the quality of the health information system, and in the output section, except for the Lippeveld model, the two other models consider information products and the usage and distribution of information as components of the national health information system. Conclusion: The results showed that all three models give only a brief discussion of the components of health information in the input section. However, the Lippeveld model has overlooked the components of national health information in the process and output sections. Therefore, it seems that the Health Metrics Network model provides a comprehensive presentation of the components of the health system in all three sections: input, process, and output.
Keywords: National Health Information System, components of the NHIS, Lippeveld Model
Procedia PDF Downloads 421