Search results for: data repair
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24962

24632 Outcome of Comparison between Partial Thickness Skin Graft Harvesting from Scalp and Lower Limb for Scalp Defect: A Clinical Trial Study

Authors: Mahdi Eskandarlou, Mehrdad Taghipour

Abstract:

Background: Partial-thickness skin graft is the cornerstone of scalp defect repair. Routine donor sites include the abdomen, thighs, and buttocks. Given the potential side effects of harvesting from these sites and the potential advantages of harvesting from the scalp (broad surface, rapid healing, and better cosmetic results), this study compares the outcomes of graft harvesting from the scalp and the lower limb. Methods: This clinical trial was conducted on a sample of 40 partial-thickness graft candidates (20 in the case group and 20 in the control group) with scalp defects presenting to the plastic surgery clinic at Besat Hospital between 2018 and 2019. Sampling was done by simple randomization using a random digit table. Data were gathered using a designated checklist. The donor site was the scalp in the case group and the lower limb in the control group. The resulting data were analyzed using chi-squared and t-tests in SPSS version 21 (SPSS Statistics for Windows, Version 21.0. Armonk, NY: IBM Corp). Results: Of the 40 patients participating in this study, 28 (70%) were male and 12 (30%) were female, with a mean age of 63.62 ± 9.73 years. Hypertension and diabetes mellitus were the most common comorbidities, with basal cell carcinoma (BCC) and trauma being the most common etiologies of the defects. There was a statistically significant difference between the two groups regarding the etiology of the defect (P=0.02). The most common anatomic location of the defect was temporal in the case group and parietal in the control group. Most of the defects were deep to the galea. The mean defect diameter was 24.28 ± 45.37 mm for all patients. The difference in defect diameter between the two groups was statistically significant, while no such difference was seen in graft diameter. Graft take was completely successful in both groups. Postoperative pain was lower in the case group than in the control group on the VAS scale, and satisfaction was higher on the Likert scale. Conclusion: The scalp can safely be used as a donor site for skin grafts applied to scalp defects and is associated with better results and lower complication rates compared to other donor sites.

Keywords: donor site, leg, partial-thickness graft, scalp

Procedia PDF Downloads 137
24631 Arthroscopic Superior Capsular Reconstruction Using the Long Head of the Biceps Tendon (LHBT)

Authors: Ho Sy Nam, Tang Ha Nam Anh

Abstract:

Background: Rotator cuff tears are a common problem in the aging population. The prevalence of massive rotator cuff tears varies in some studies from 10% to 40%. Of irreparable rotator cuff tears (IRCTs), which are mostly associated with massive tear size, 79% are estimated to recur after surgical repair. Recent studies have shown that superior capsule reconstruction (SCR) in massive rotator cuff tears can be an efficient technique with optimistic clinical scores and preservation of glenohumeral stability. Superior capsule reconstruction techniques most commonly use either a fascia lata autograft or a dermal allograft, both of which have their own benefits and drawbacks (such as the potential for donor site issues, allergic reactions, and high cost). We propose a simple technique for superior capsule reconstruction that uses the long head of the biceps tendon as a local autograft; therefore, the comorbidities related to graft harvesting are eliminated. The proximal portion of the long head of the biceps tendon is relocated to the footprint and secured as the SCR, serving both to stabilize the glenohumeral joint and to maintain vascular supply to aid healing. Objective: The purpose of this study is to assess the clinical outcomes of patients with large to massive RCTs treated by SCR using the LHBT. Materials and methods: A study was performed of consecutive patients with large to massive RCTs who were treated by SCR using the LHBT between January 2022 and December 2022. One double-loaded suture anchor was used to secure the long head of the biceps to the middle of the footprint. Two more anchors were used to repair the rotator cuff with a single-row technique, placed anteriorly and posteriorly on the lateral side of the previously transposed LHBT. Results: The 3 men and 5 women had an average age of 61.25 years (range 48 to 76 years) at the time of surgery. The average follow-up was 8.2 months (6 to 10 months) after surgery. The average preoperative ASES score was 45.8, and the average postoperative ASES score was 85.83. The average postoperative UCLA score was 29.12. The VAS score improved from 5.9 to 1.12. The mean preoperative ROM was 72° ± 16° of forward flexion and 28° ± 8° of external rotation. The mean postoperative ROM was 131° ± 22° of forward flexion and 63° ± 6° of external rotation. There were no cases of progression of osteoarthritis or rotator cuff muscle atrophy. Conclusion: SCR using the LHBT is a treatment option for patients with large or massive RC tears. It can restore superior glenohumeral stability and shoulder function and can be an effective procedure for selected patients, helping to avoid progression to cuff tear arthropathy.

Keywords: superior capsule reconstruction, large or massive rotator cuff tears, the long head of the biceps, stabilize the glenohumeral joint

Procedia PDF Downloads 67
24630 Monitoring of Wound Healing Through Structural and Functional Mechanisms Using Photoacoustic Imaging Modality

Authors: Souradip Paul, Arijit Paramanick, M. Suheshkumar Singh

Abstract:

Traumatic injury is a leading worldwide health problem. Annually, millions of surgical wounds are created for the sake of routine medical care. The healing of these unintended injuries is always monitored based on visual inspection. The maximal restoration of tissue functionality remains a significant concern of clinical care. Although minor injuries heal well with proper care and medical treatment, large injuries are negatively influenced by various factors (vascular insufficiency, tissue coagulation) and heal poorly. Demographically, the number of people suffering from severe wounds and impaired healing places a burden on both human health and the economy. An incomplete understanding of the functional and molecular mechanisms of tissue healing often leads to a lack of proper therapies and treatment. Hence, strong and promising medical guidance is necessary for monitoring tissue regeneration processes. Photoacoustic imaging (PAI) is a non-invasive, hybrid imaging modality that can provide a suitable solution in this regard. Light combined with sound offers structural, functional, and molecular information at greater penetration depths. Therefore, molecular and structural mechanisms of tissue repair are readily observable in PAI, from the superficial layer down to deep tissue regions. Blood vessel formation and growth is an essential component of tissue repair. These vessels supply nutrition and oxygen to cells in the wound region. Angiogenesis (the formation of new capillaries from existing blood vessels) contributes to new blood vessel formation during tissue repair, and the quality of tissue healing depends directly on it. Other optical microscopy techniques can visualize angiogenesis at micron-scale penetration depths but are unable to provide deep-tissue information. PAI overcomes this barrier: it is ideally suited for deep-tissue imaging and provides the rich optical contrast generated by hemoglobin in blood vessels. Hence, early detection of angiogenesis with PAI enables monitoring of the medical treatment of the wound. Along with functional properties, mechanical properties also play a key role in tissue regeneration. A wound heals through a dynamic series of physiological events such as coagulation, granulation tissue formation, and extracellular matrix (ECM) remodeling; the resulting changes in tissue elasticity can be identified using non-contact photoacoustic elastography (PAE). In a nutshell, angiogenesis and biomechanical properties are both critical parameters for tissue healing, and both can be characterized in a single imaging modality (PAI).

Keywords: PAT, wound healing, tissue coagulation, angiogenesis

Procedia PDF Downloads 90
24629 Queueing Modeling of M/G/1 Fault Tolerant System with Threshold Recovery and Imperfect Coverage

Authors: Madhu Jain, Rakesh Kumar Meena

Abstract:

This paper investigates a finite M/G/1 fault-tolerant multi-component machining system. The system incorporates features such as standby support, threshold recovery, and imperfect coverage, which make the study closer to real-time systems. The performance prediction of the M/G/1 fault-tolerant system is carried out using a recursive approach by treating the remaining service time as a supplementary variable. Numerical results are presented to illustrate the computational tractability of the analytical results for three different service time distributions, viz. exponential, 3-stage Erlang, and deterministic. Moreover, a cost function is constructed to determine the optimal choice of system descriptors for upgrading the system.
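For intuition about how the choice of service time distribution drives M/G/1 performance, the sketch below evaluates the classical Pollaczek-Khinchine mean-value formula for the three distributions named in the abstract. It is an illustrative infinite-source approximation only, not the paper's finite, fault-tolerant model.

```python
# Classical M/G/1 mean-value illustration (not the paper's finite-source model).
def mg1_mean_in_system(lam, mean_service, scv):
    """Mean number in an M/G/1 queue.
    lam: arrival rate, mean_service: E[S], scv: squared coefficient of variation of S."""
    rho = lam * mean_service                              # server utilisation, must be < 1
    lq = rho ** 2 * (1.0 + scv) / (2.0 * (1.0 - rho))     # mean queue length (Pollaczek-Khinchine)
    return rho + lq                                       # waiting + in service

lam, es = 0.8, 1.0
for name, scv in [("exponential", 1.0), ("3-stage Erlang", 1.0 / 3.0), ("deterministic", 0.0)]:
    print(f"{name:>15}: L = {mg1_mean_in_system(lam, es, scv):.3f}")
```

Lower service-time variability (Erlang, deterministic) gives a shorter mean queue at the same utilisation, which is the effect the paper's numerical comparison highlights.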

Keywords: fault tolerant, machine repair, threshold recovery policy, imperfect coverage, supplementary variable technique

Procedia PDF Downloads 280
24628 Towards Automated Remanufacturing of Marine and Offshore Engineering Components

Authors: Aprilia, Wei Liang Keith Nguyen, Shu Beng Tor, Gerald Gim Lee Seet, Chee Kai Chua

Abstract:

Automated remanufacturing is of great interest in today’s marine and offshore industry. Most current remanufacturing processes are carried out manually and are therefore error-prone, labour-intensive, and costly. In this paper, a conceptual framework for automated remanufacturing is presented. This framework involves the integration of 3D non-contact digitization, adaptive surface reconstruction, additive manufacturing, and machining operations. Each operation is executed and interconnected automatically as one system. The feasibility of adaptive surface reconstruction on marine and offshore engineering components is also discussed. Several engineering components were evaluated, and the results showed that the proposed system is feasible. Conclusions are drawn and further research work is discussed.

Keywords: adaptive surface reconstruction, automated remanufacturing, automatic repair, reverse engineering

Procedia PDF Downloads 315
24627 Data Poisoning Attacks on Federated Learning and Preventive Measures

Authors: Beulah Rani Inbanathan

Abstract:

It is evident from numerous incidents that data privacy is being compromised in various ways. Conventional machine learning uses a centralized server: data is sent as input, analyzed by the algorithms on that server, and outputs are predicted. However, because the user must send data each time the algorithm analyzes input to predict an output, the data is exposed to threats. The solution to this issue is federated learning, where only the models are updated while the data resides on the local machine and is not exchanged with the other local models. Nevertheless, even these local models are subject to data poisoning, as experiments by many researchers have made clear. This paper examines the many ways in which data poisoning occurs and the evidence that it remains prevalent. It covers poisoning attacks on IoT devices, edge devices, autoregressive models, and industrial IoT systems, as well as measures by which these attacks could be evaded in order to protect data that is personal, sensitive, or harmful when exposed.
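To make the attack surface concrete, here is a minimal sketch (toy numbers, not from the paper) of federated averaging, showing how a single poisoned client update can shift the aggregated global model even though the raw data never leaves the clients.

```python
# Federated averaging with one malicious client (toy illustration).
import numpy as np

def fed_avg(client_updates, client_sizes):
    """Weighted average of client weight vectors, weighted by local data size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_updates, client_sizes))

honest = [np.array([0.9, 1.1]), np.array([1.0, 1.0]), np.array([1.1, 0.9])]
sizes = [100, 100, 100]
print("global (honest):  ", fed_avg(honest, sizes))

# One participant submits a scaled, poisoned update to skew the aggregate.
poisoned = honest[:2] + [np.array([-10.0, -10.0])]
print("global (poisoned):", fed_avg(poisoned, sizes))
```

Defences surveyed in this area typically filter or clip such outlying updates before aggregation.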

Keywords: data poisoning, federated learning, Internet of Things, edge computing

Procedia PDF Downloads 75
24626 Design and Fabrication of a Scaffold with Appropriate Features for Cartilage Tissue Engineering

Authors: S. S. Salehi, A. Shamloo

Abstract:

The poor regenerative ability of cartilage tissue when it is damaged leads scientists to use tissue engineering as a reliable and effective method for regenerating or replacing damaged tissues. An artificial tissue should have features such as biocompatibility, biodegradability, and mechanical properties comparable to the original tissue. In this work, a highly porous composite hydrogel is prepared using natural and synthetic materials. Mechanical properties, such as the modulus of elasticity, of different polymer combinations were tested, and a hydrogel with good mechanical properties was selected. Bone marrow-derived mesenchymal stem cells were also seeded into the pores of the sponge, and the results showed adhesion and proliferation of cells within the hydrogel after one month. In comparison with previous works, this study offers a new and efficient procedure for the fabrication of cartilage-like tissue and further cartilage repair.

Keywords: cartilage tissue engineering, hydrogel, mechanical strength, mesenchymal stem cell

Procedia PDF Downloads 285
24625 Evaluation of Wound Healing Activity of Phlomis bovei De Noe in Wistar Albino Rats

Authors: W. Khitri, J. Zenaki, A. Abi, N. Lachgueur, A. Lardjem

Abstract:

Healing is a biological phenomenon that the body implements automatically and immediately, enabling it to repair physical damage to all tissues except nerve cells. Many medicinal plants are used for the treatment of wounds. Our ethnobotanical study identified 19 species and 13 families of plants used in traditional medicine in Oran, Algeria, for their healing activities. Phlomis bovei De Noe was the species most recommended by herbalists. Its phytochemical study revealed different secondary metabolites such as terpenes, tannins, saponins, and mucilage. Evaluation of the healing activity of Phlomis bovei in Wistar albino rats using an excision wound model showed a significant improvement, with a 5% increase in surface healing compared to the control group, a gain of three days in epithelialization time, and a histologically better scar.

Keywords: Phlomis bovei De Noe, ethnobotanical study, wound healing, Wistar albino rats

Procedia PDF Downloads 434
24624 Assessment of the Properties of Microcapsules with Different Polymeric Shells Containing a Reactive Agent for their Suitability in Thermoplastic Self-healing Materials

Authors: Małgorzata Golonka, Jadwiga Laska

Abstract:

Self-healing polymers are one of the most investigated groups of smart materials. As materials engineering has recently focused on the design, production, and research of modern materials and future technologies, researchers are looking for innovations in structural, construction, and coating materials. Based on the available scientific literature, most research focuses on the self-healing of cement, concrete, asphalt, and anticorrosion resin coatings. In our study, a method was developed for obtaining and testing the properties of several types of microcapsules for use in self-healing polymer materials, yielding microcapsules with various mechanical properties, especially compressive strength. The effect was achieved by using various polymer materials to build the shell: urea-formaldehyde resin (UFR), melamine-formaldehyde resin (MFR), and melamine-urea-formaldehyde resin (MUFR). Dicyclopentadiene (DCPD) was used as the core material because it can polymerize by ring-opening metathesis polymerization (ROMP) in the presence of a solid Grubbs catalyst of relatively high chemical and thermal stability. The ROMP of dicyclopentadiene leads to a polymer with high impact strength, high thermal resistance, good adhesion to other materials, and good chemical and environmental resistance, so it is potentially a very promising candidate for self-healing materials. The capsules were obtained by condensation polymerization of formaldehyde with urea, melamine, or both (copolymerization with urea and melamine) in situ in water dispersion, with different molar ratios of formaldehyde, urea, and melamine. The fineness of the organic phase dispersed in water, and consequently the size of the microcapsules, was regulated by the stirring speed. In all cases, the aim was to establish synthesis conditions that yield capsules with appropriate mechanical strength. The microcapsules were characterized by determining their diameters and diameter distribution and measuring the shell thickness using digital optical microscopy and scanning electron microscopy, as well as by confirming the presence of the active substance in the core by FTIR and SEM. Compression tests were performed to determine the mechanical strength of the microcapsules. The highest repeatability of microcapsule properties was obtained for the UFR resin, while the MFR resin had the best mechanical properties; the encapsulation efficiency of MFR was much lower than that of UFR, though. Therefore, capsules with a MUFR shell may be the optimal solution. The chemical reaction between the active substance present in the capsule core and the catalyst placed outside the capsules was confirmed by FTIR spectroscopy. The obtained autonomous repair systems (microcapsules + catalyst) were introduced into polyethylene in the extrusion process and tested for self-repair of the material.

Keywords: autonomic self-healing system, dicyclopentadiene, melamine-urea-formaldehyde resin, microcapsules, thermoplastic materials

Procedia PDF Downloads 27
24623 Workflow Based Inspection of Geometrical Adaptability from 3D CAD Models Considering Production Requirements

Authors: Tobias Huwer, Thomas Bobek, Gunter Spöcker

Abstract:

Driving forces for enhancements in production are trends like digitalization and individualized production. Currently, such developments are restricted to assembly parts; complex freeform surfaces are not addressed in this context. The need for efficient use of resources and near-net-shape production will require individualized production of complex-shaped workpieces. Variations between the nominal model and the actual geometry can lead to changes in operations in computer-aided process planning (CAPP), which must be made manageable for adaptive serial production. In this context, 3D CAD data can be a key to realizing that objective. Along with developments in geometrical adaptation, a preceding inspection method based on CAD data is required to support the process planner by finding objective criteria for decisions about the adaptive manufacturability of workpieces. Nowadays, this kind of decision depends on the experience-based knowledge of humans (e.g., process planners) and results in subjective decisions, leading to variability in workpiece quality and potential failures in production. In this paper, we present an automatic part inspection method, based on design and measurement data, that evaluates the actual geometries of single workpiece preforms. The aim is to automatically determine the suitability of the current shape for further machining and to provide a basis for an objective decision about subsequent adaptive manufacturability. The proposed method is realized by a workflow-based approach, keeping in mind the requirements of industrial applications. Workflows are a well-known design method for standardized processes; especially in applications like the aerospace industry, standardization and certification of processes are important. Function blocks, providing a standardized, event-driven abstraction of algorithms and data exchange, are used for modeling and executing inspection workflows. Each analysis step of the inspection, such as positioning of measurement data or checking of geometrical criteria, is carried out by a function block. One advantage of this approach is its flexibility in designing workflows and adapting algorithms specific to the application domain. In general, it is checked whether a geometrical adaptation is possible within the specified tolerance range. The development of particular function blocks is predicated on workpiece-specific information, e.g., design data. Furthermore, appropriate logics and decision criteria have to be considered for different product lifecycle phases; for example, tolerances for geometric deviations differ in type and size between new-part production and repair processes. In addition to function blocks, appropriate referencing systems are important; they need to support exact determination of the position and orientation of the actual geometries to provide a basis for precise analysis. The presented approach provides an inspection methodology for adaptive and part-individual process chains. The analysis of each workpiece results in an inspection protocol and an objective decision about further manufacturability. A representative application domain is the product lifecycle of turbine blades, comprising new-part production and a maintenance process; in both cases, a geometrical adaptation is required to calculate individual production data. In contrast to existing approaches, the proposed initial inspection method provides information to decide between different potential adaptive machining processes.
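As an illustration of the function-block idea, the sketch below (hypothetical names, not the authors' implementation) chains small analysis steps over a shared data record and ends with an objective adaptability decision.

```python
# Illustrative workflow of chained "function blocks" (names are hypothetical).
from typing import Callable, Dict, List

FunctionBlock = Callable[[Dict], Dict]

def align_measurement(data: Dict) -> Dict:
    data["aligned"] = True                     # e.g. register the scan to the CAD frame
    return data

def check_stock_allowance(data: Dict) -> Dict:
    # adaptive machining is feasible only if the deviation stays within tolerance
    data["adaptable"] = abs(data["deviation_mm"]) <= data["tolerance_mm"]
    return data

def run_workflow(blocks: List[FunctionBlock], data: Dict) -> Dict:
    for block in blocks:
        data = block(data)                     # event-driven chain of analysis steps
    return data

result = run_workflow([align_measurement, check_stock_allowance],
                      {"deviation_mm": 0.12, "tolerance_mm": 0.20})
print(result)   # includes 'aligned': True and the objective decision 'adaptable': True
```

Swapping or reordering blocks corresponds to adapting the workflow to a different lifecycle phase (new-part production versus repair) without touching the individual algorithms.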

Keywords: adaptive, CAx, function blocks, turbomachinery

Procedia PDF Downloads 290
24622 Mobile Application Tool for Individual Maintenance Users on High-Rise Residential Buildings in South Korea

Authors: H. Cha, J. Kim, D. Kim, J. Shin, K. Lee

Abstract:

Since the 1980s, rapid economic growth has produced a large number of aged apartment buildings in South Korea. Nevertheless, building maintenance practice is insufficient. In this study, to facilitate building maintenance, the authors classified building defects into three levels according to their level of performance and developed a mobile application tool that provides appropriate feedback for each level. The feedback structure consists of a 'Maintenance manual phase', an 'Online feedback phase', and a 'Repair work phase of the specialty contractors'. To implement each phase, the authors devised the necessary database for each phase and created a prototype system that can be developed further on its own. The authors expect that building users can easily maintain their buildings by using this application.
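As a rough illustration of the three-level feedback structure (the level numbering and mapping below are hypothetical, not taken from the paper), a reported defect might be routed like this:

```python
# Hypothetical routing of a defect's performance level to a feedback phase.
def route_defect(performance_level: int) -> str:
    """Map a defect's performance level (1 = minor ... 3 = severe) to a feedback phase."""
    phases = {
        1: "Maintenance manual phase",                        # user fixes it with guidance
        2: "Online feedback phase",                           # remote advice via the app
        3: "Repair work phase of the specialty contractors",  # professional repair order
    }
    return phases[performance_level]

print(route_defect(2))   # -> Online feedback phase
```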

Keywords: building defect, maintenance practice, mobile application, system algorithm

Procedia PDF Downloads 181
24621 Simulation and Hardware Implementation of Data Communication Between CAN Controllers for Automotive Applications

Authors: R. M. Kalayappan, N. Kathiravan

Abstract:

In the automobile industry, the Controller Area Network (CAN) is widely used to reduce system complexity and handle inter-task communication. This paper therefore proposes the hardware implementation of data frame communication from one controller to another. The CAN data frames and protocol are explained in detail here. The data frames are transferred without any collision or corruption. The simulation is done in the Keil µVision software to display the data transfer between transmitter and receiver on the CAN bus. An ARM7 microcontroller is used to transfer data between the controllers in real time. Data transfer is verified using a CRO.
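For a host-side feel of what a CAN 2.0 data frame transfer looks like (the paper's implementation runs on an ARM7 target, not a PC), here is a minimal sketch using the python-can library and an assumed virtual SocketCAN interface 'vcan0':

```python
# Host-side illustration of a CAN 2.0A data frame transfer (assumes 'vcan0' exists).
import can

bus = can.interface.Bus(channel="vcan0", bustype="socketcan",
                        receive_own_messages=True)

# 11-bit identifier, up to 8 data bytes, as in a standard CAN 2.0A data frame
msg = can.Message(arbitration_id=0x123,
                  data=[0x11, 0x22, 0x33, 0x44],
                  is_extended_id=False)
bus.send(msg)                      # transmitter side

received = bus.recv(timeout=1.0)   # receiver side (same bus here for the demo)
print(received)
bus.shutdown()
```

On the actual hardware, the same arbitration-ID/data-field structure is filled into the ARM7 CAN controller registers and checked on the oscilloscope.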

Keywords: controller area network (CAN), automotive electronic control unit, CAN 2.0, industry

Procedia PDF Downloads 391
24620 Improving the Statistics Nature in Research Information System

Authors: Rajbir Cheema

Abstract:

An integrated research information system provides scientific institutions with the necessary information on research activities and research results in assured quality. Since duplication, missing values, incorrect formatting, inconsistencies, etc. can arise when collecting research data in different research information systems, with a wide range of negative effects on data quality, the subject of data quality needs to be treated more thoroughly. This paper examines the data quality problems in research information systems and presents techniques that enable organizations to improve the quality of their research information.
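A minimal sketch (toy records, pandas-based; not from the paper) of the cleansing steps named in the abstract, namely de-duplication, missing values, and inconsistent formatting:

```python
# Toy cleansing pass over research-information records.
import pandas as pd

records = pd.DataFrame({
    "title": ["Paper A", "Paper A", "Paper B", None],
    "year":  ["2019", "2019", "19", "2020"],
    "doi":   ["10.1/a", "10.1/a", "10.1/b", "10.1/c"],
})

clean = records.drop_duplicates(subset="doi").dropna(subset=["title"]).copy()
clean["year"] = clean["year"].replace({"19": "2019"})   # naive format fix for the demo
print(clean)
```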

Keywords: research information systems (RIS), research information, heterogeneous sources, data quality, data cleansing, science system, standardization

Procedia PDF Downloads 145
24619 Data Mining Meets Educational Analysis: Opportunities and Challenges for Research

Authors: Carla Silva

Abstract:

Recent developments in information and communication technology enable us to acquire, collect, and analyse data in various fields of socioeconomic and technological systems. Along with increasing economic globalization and the evolution of information technology, data mining has become an important approach for economic data analysis. As a result, there is a critical need for automated approaches to the effective and efficient use of massive amounts of educational data in order to support institutions in strategic planning and investment decision-making. In this article, we address data from several different perspectives and define how such data applies to the sciences. Many believe that 'big data' will transform business, government, and other aspects of the economy. We discuss how new data may impact educational policy and educational research. Large-scale administrative data sets and proprietary private-sector data can greatly improve the way we measure, track, and describe educational activity and educational impact. We also consider whether the big data predictive modeling tools that have emerged in statistics and computer science may prove useful in educational research and, furthermore, in economics. Finally, we highlight a number of challenges and opportunities for future research.

Keywords: data mining, research analysis, investment decision-making, educational research

Procedia PDF Downloads 343
24618 A Method of Detecting the Difference in Two States of Brain Using Statistical Analysis of EEG Raw Data

Authors: Digvijaysingh S. Bana, Kiran R. Trivedi

Abstract:

This paper introduces various methods based on the alpha wave for detecting the difference between two states of the brain. One healthy subject participated in the experiment. EEG was measured on the forehead above the eye (FP1 position), with the reference and ground electrodes on the ear clip. The data samples were obtained in the form of EEG raw data, each reading lasting one minute. Various tests were performed on the alpha-band EEG raw data. Readings were taken at different times throughout the day, and statistical analysis was carried out on the EEG sample data in the form of various tests.
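As a rough sketch of this kind of analysis (synthetic signal, assumed 256 Hz sampling rate; not the authors' code), one can band-pass the raw signal to the alpha band (8-12 Hz) and compare per-epoch alpha power between two states with a t-test:

```python
# Alpha-band extraction and two-state comparison on synthetic EEG.
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.stats import ttest_ind

fs = 256                                     # assumed sampling rate in Hz
t = np.arange(0, 60, 1 / fs)                 # one-minute recording
state_a = 2.0 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)   # strong alpha
state_b = 0.5 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)   # weak alpha

b, a = butter(4, [8, 12], btype="bandpass", fs=fs)   # 4th-order alpha band-pass
alpha_a = filtfilt(b, a, state_a)
alpha_b = filtfilt(b, a, state_b)

# compare alpha power per one-second epoch between the two states
power_a = np.square(alpha_a).reshape(60, fs).mean(axis=1)
power_b = np.square(alpha_b).reshape(60, fs).mean(axis=1)
print(ttest_ind(power_a, power_b))
```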

Keywords: electroencephalogram (EEG), biometrics, authentication, EEG raw data

Procedia PDF Downloads 452
24617 Efficient Moment Frame Structure

Authors: Mircea I. Pastrav, Cornelia Baera, Florea Dinu

Abstract:

A different concept for the design and detailing of reinforced concrete precast frame structures is analyzed in this paper. The new detailing of the joints derives from special hybrid moment frame joints. The special reinforcements of this alternative detailing, named the modified special hybrid joint, are unbonded with respect to both the column and the beams. Full-scale tests were performed on a plane model representing part of a 5-story structure, cropped at the mid-spans of the beams and columns. A theoretical approach was developed based on the results of tests on the twice-repaired model subjected to lateral seismic-type loading. A discussion of the behavior of the modified special hybrid joint and of the further research needed concludes the presentation.

Keywords: modified hybrid joint, repair, seismic loading type, acceptance criteria

Procedia PDF Downloads 512
24616 3D Electrode Carrier and its Implications on Retinal Implants

Authors: Diego Luján Villarreal

Abstract:

Retinal prosthetic devices aim to restore some vision in visually impaired patients by electrically stimulating neural cells in the visual system. In this study, a 3D linear electrode carrier is presented. A simulation framework was developed by placing the 3D carrier 1 mm away from the fovea center, at the location of highest cell density. Cell stimulation is verified in COMSOL Multiphysics by developing a 3D computational model that includes the relevant retinal interface elements and the dynamics of the voltage-gated ionic channels. The current distribution resulting from low threshold amplitudes produces a small stimulated volume, equivalent to the volume occupied by individual cells at the highest cell density, using small-sized electrodes. The delicate retinal tissue is thereby protected from excessive charge density.

Keywords: retinal prosthetic devices, visual devices, retinal implants, visual prosthetic devices

Procedia PDF Downloads 92
24615 A Study on Big Data Analytics, Applications and Challenges

Authors: Chhavi Rana

Abstract:

The aim of the paper is to highlight existing developments in the field of big data analytics. Applications like bioinformatics, smart infrastructure projects, healthcare, and business intelligence contain voluminous and incremental data that is hard to organise and analyse and can be dealt with using the frameworks and models in this field of study. An organization's decision-making strategy can be enhanced by using big data analytics and applying different machine learning techniques and statistical tools to such complex data sets, which will consequently benefit society. This paper reviews the current state of the art in this field of study as well as different application domains of big data analytics. It also elaborates on various frameworks used in the analysis process with different machine learning techniques. Finally, the paper concludes by stating different challenges and issues raised in existing research.

Keywords: big data, big data analytics, machine learning, review

Procedia PDF Downloads 69
24614 A Study on Big Data Analytics, Applications, and Challenges

Authors: Chhavi Rana

Abstract:

The aim of the paper is to highlight existing developments in the field of big data analytics. Applications like bioinformatics, smart infrastructure projects, healthcare, and business intelligence contain voluminous and incremental data which is hard to organise and analyse and can be dealt with using the frameworks and models in this field of study. An organisation's decision-making strategy can be enhanced by using big data analytics and applying different machine learning techniques and statistical tools to such complex data sets, which will consequently benefit society. This paper reviews the current state of the art in this field of study as well as different application domains of big data analytics. It also elaborates on various frameworks used in the process of analysis with different machine learning techniques. Finally, the paper concludes by stating different challenges and issues raised in existing research.

Keywords: big data, big data analytics, machine learning, review

Procedia PDF Downloads 81
24613 Improved K-Means Clustering Algorithm Using RHadoop with Combiner

Authors: Ji Eun Shin, Dong Hoon Lim

Abstract:

Data clustering is a common technique used in data analysis and has many applications, such as artificial intelligence, pattern recognition, economics, ecology, psychiatry, and marketing. K-means clustering is a well-known clustering algorithm that aims to group a set of data points into a predefined number of clusters. In this paper, we implement the K-means algorithm on the MapReduce framework with RHadoop to make the clustering method applicable to large-scale data. RHadoop is a collection of R packages that allow users to manage and analyze data with Hadoop. The main idea is to introduce a combiner as a function of our map output to decrease the amount of data that needs to be processed by the reducers. The experimental results demonstrate that the K-means algorithm using RHadoop can scale well and efficiently process large data sets on commodity hardware. We also show that our K-means algorithm using RHadoop with a combiner is faster than the regular algorithm without a combiner as the size of the data set increases.
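The combiner idea can be sketched in a few lines (plain Python stand-in, not the R/RHadoop code): mappers emit point assignments, a per-split combiner pre-aggregates them into (sum, count) pairs, and the reducer only merges those pairs instead of seeing every raw point.

```python
# K-means with a map / combine / reduce split (toy stand-in for the MapReduce job).
import numpy as np

def map_points(points, centroids):
    """Emit (cluster_id, point) pairs."""
    for p in points:
        yield int(np.argmin([np.linalg.norm(p - c) for c in centroids])), p

def combine(mapped):
    """Per-split pre-aggregation: cluster_id -> (partial_sum, count)."""
    partial = {}
    for k, p in mapped:
        s, n = partial.get(k, (np.zeros_like(p), 0))
        partial[k] = (s + p, n + 1)
    return partial

def reduce_centroids(partials, k):
    """Merge partial sums from all splits and recompute centroids."""
    sums = {i: (0.0, 0) for i in range(k)}
    for part in partials:
        for i, (s, n) in part.items():
            sums[i] = (sums[i][0] + s, sums[i][1] + n)
    return [s / n for i, (s, n) in sorted(sums.items()) if n > 0]

data_splits = [np.random.randn(100, 2), np.random.randn(100, 2) + 5]
centroids = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
for _ in range(5):   # a few Lloyd iterations
    partials = [combine(map_points(split, centroids)) for split in data_splits]
    centroids = reduce_centroids(partials, k=2)
print(centroids)
```

Because each split ships only k (sum, count) pairs to the reducer, shuffle traffic stays constant as the split size grows, which is where the reported speed-up comes from.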

Keywords: big data, combiner, K-means clustering, RHadoop

Procedia PDF Downloads 418
24612 Framework for Integrating Big Data and Thick Data: Understanding Customers Better

Authors: Nikita Valluri, Vatcharaporn Esichaikul

Abstract:

With the popularity of data-driven decision making on the rise, this study focuses on providing an alternative outlook on the decision-making process. Combining quantitative and qualitative methods rooted in the social sciences, an integrated framework is presented with a focus on delivering a much more robust and efficient approach to data-driven decision-making with respect to not only big data but also 'thick data', a new form of qualitative data. In support of this, an example from the retail sector is illustrated where the framework is put into action to yield insights and leverage business intelligence. An interpretive approach is used to analyze findings from both kinds of data, quantitative and qualitative. Using traditional point-of-sale data as well as an understanding of customer psychographics and preferences, techniques of data mining along with qualitative methods (such as grounded theory, ethnomethodology, etc.) are applied. The final goal of this study is to establish the framework as a basis for providing a holistic solution encompassing both the big and thick aspects of any business need. The proposed framework is an enhancement of the traditional data-driven decision-making approach, which depends mainly on quantitative data.

Keywords: big data, customer behavior, customer experience, data mining, qualitative methods, quantitative methods, thick data

Procedia PDF Downloads 143
24611 Performance Evaluation and Cost Analysis of Standby Systems

Authors: Mohammed A. Hajeeh

Abstract:

Pumping systems are an integral part of water desalination plants, and their effective functioning is vital for the operation of a plant. In this research work, the reliability and availability of pressurized pumps in a reverse osmosis desalination plant are studied with the objective of finding configurations that provide optimal performance. Six configurations of a series system with different numbers of warm and cold standby components are examined. Closed-form expressions for the mean time to failure (MTTF) and the long-run availability are derived and compared under the assumption that the times between failures and the repair times of the primary and standby components are exponentially distributed. Moreover, a cost/benefit analysis is conducted in order to identify the configuration with the best performance and least cost. It is concluded that configurations with cold standby components are preferable, especially when the pumps are of the same size.
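As a back-of-the-envelope illustration (not the paper's closed-form expressions), the sketch below computes the MTTF of a pump backed by a single standby under exponential failures, with the standby's idle failure rate set to zero for the cold case, and converts MTTF and a mean repair time into a rough long-run availability estimate.

```python
# Simple MTTF / availability illustration for one primary pump plus one standby.
def mttf_one_standby(lam, lam_w=0.0):
    """Time to first failure of the pair, then the survivor runs alone.
    lam: operating failure rate; lam_w: standby's idle failure rate (0 = cold standby)."""
    return 1.0 / (lam + lam_w) + 1.0 / lam

def availability(mttf, mttr):
    """Long-run availability from mean time to failure and mean time to repair."""
    return mttf / (mttf + mttr)

lam = 1 / 1000.0          # one failure per 1000 h of operation
print("cold standby MTTF:", mttf_one_standby(lam), "h")
print("warm standby MTTF:", mttf_one_standby(lam, lam_w=1 / 5000.0), "h")
print("availability (MTTR = 24 h):", round(availability(mttf_one_standby(lam), 24.0), 4))
```

The cold standby never ages while idle, so its MTTF dominates the warm case, which mirrors the paper's conclusion that cold-standby configurations are preferable.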

Keywords: availability, cost/benefit, mean time to failure, pumps

Procedia PDF Downloads 271
24610 Incremental Learning of Independent Topic Analysis

Authors: Takahiro Nishigaki, Katsumi Nitta, Takashi Onoda

Abstract:

In this paper, we present a method for applying Independent Topic Analysis (ITA) to a growing collection of documents. The amount of document data has been increasing since the spread of the Internet, and ITA was proposed as one method to analyze such data. ITA extracts independent topics from document data by using Independent Component Analysis (ICA), a technique from signal processing. However, it is difficult to apply ITA to a growing number of documents because ITA must use all the document data, so the temporal and spatial costs are very high. Therefore, we present Incremental ITA, which extracts independent topics from a growing document collection by updating the topics extracted from the previous data whenever new document data is added. We show the results of applying Incremental ITA to benchmark datasets.
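For readers unfamiliar with the underlying ITA step, here is a minimal sketch (scikit-learn based, with a toy corpus; not the authors' incremental implementation) of extracting "independent topics" by running ICA on a TF-IDF matrix:

```python
# ICA over a TF-IDF matrix as a toy stand-in for Independent Topic Analysis.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import FastICA

docs = ["queueing model for machine repair",
        "skin graft for scalp defect repair",
        "federated learning and data poisoning",
        "big data analytics with machine learning"]

tfidf = TfidfVectorizer().fit(docs)
X = tfidf.transform(docs).toarray()            # documents x terms

ica = FastICA(n_components=2, random_state=0)
ica.fit(X)                                      # independent components over terms
terms = np.array(tfidf.get_feature_names_out())
for i, comp in enumerate(ica.components_):
    top = terms[np.argsort(np.abs(comp))[::-1][:3]]
    print(f"topic {i}: {', '.join(top)}")
```

The incremental variant proposed in the paper avoids refitting on the full corpus each time new documents arrive by updating the previously extracted components instead.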

Keywords: text mining, topic extraction, independent, incremental, independent component analysis

Procedia PDF Downloads 293
24609 Open Data for e-Governance: Case Study of Bangladesh

Authors: Sami Kabir, Sadek Hossain Khoka

Abstract:

Open Government Data (OGD) refers to all data produced by a government that is accessible in a reusable way to anyone with Internet access, free of cost. In line with the 'Digital Bangladesh' vision of the Bangladesh government, the concept of open data has been gaining momentum in the country. Opening all government data in digital and customizable formats from a single platform can enhance e-governance, making government more transparent to the people. This paper presents a work-in-progress case study of the OGD portal of the Bangladesh Government, whose aim is to link decentralized data. The initiative is intended to facilitate e-services for citizens through this one-stop web portal. The paper further discusses ways of collecting data in digital format from relevant agencies with a view to making it publicly available through this single point of access. Finally, a possible layout of the web portal is presented.

Keywords: e-governance, one-stop web portal, open government data, reusable data, web of data

Procedia PDF Downloads 338
24608 Resource Framework Descriptors for Interestingness in Data

Authors: C. B. Abhilash, Kavi Mahesh

Abstract:

Human beings are the most advanced species on earth, largely because of the ability to communicate and share information via human language. In today's world, a huge amount of data is available on the web in text format, which has also resulted in the generation of big data in structured and unstructured formats. In general, the data is textual and highly unstructured. To get insights and actionable content from this data, we need to incorporate the concepts of text mining and natural language processing. In our study, we mainly focus on interesting data, from which interesting facts are generated for the knowledge base. The approach is to derive the analytics from the text via the application of natural language processing. Using the Semantic Web Resource Description Framework (RDF), we generate triples from the given data and derive the interesting patterns. The methodology also illustrates data integration using RDF for reliable, interesting patterns.
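A minimal sketch (rdflib-based, with hypothetical facts; not the authors' pipeline) of representing extracted facts as RDF triples and querying them back from the knowledge base:

```python
# Building and querying a tiny RDF knowledge base with rdflib (facts are hypothetical).
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/")
g = Graph()

g.add((EX.Karnataka, RDF.type, EX.State))            # subject, predicate, object
g.add((EX.Karnataka, EX.hasCapital, EX.Bengaluru))
g.add((EX.Bengaluru, EX.population, Literal(8443675)))

# query the triples back out of the knowledge base
for s, p, o in g.triples((None, EX.hasCapital, None)):
    print(f"{s} -> {o}")
```

Interestingness measures can then be computed over such triples, for example by ranking facts that deviate from expected patterns in the integrated graph.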

Keywords: RDF, interestingness, knowledge base, semantic data

Procedia PDF Downloads 147
24607 Data Mining Practices: Practical Studies on the Telecommunication Companies in Jordan

Authors: Dina Ahmad Alkhodary

Abstract:

This study aimed to investigate the practices of data mining in the telecommunication companies in Jordan, from the viewpoint of the respondents. In order to achieve the goal of the study and test the validity of the hypotheses, the researcher designed a questionnaire to collect data from managers and staff members of the main departments in the studied companies. The results show the improvement stages of the telecommunication companies toward data mining.

Keywords: data, mining, development, business

Procedia PDF Downloads 479
24606 Design of a Low-Cost, Portable, Sensor Device for Longitudinal, At-Home Analysis of Gait and Balance

Authors: Claudia Norambuena, Myissa Weiss, Maria Ruiz Maya, Matthew Straley, Elijah Hammond, Benjamin Chesebrough, David Grow

Abstract:

The purpose of this project is to develop a low-cost, portable sensor device that can be used at home for long-term analysis of gait and balance abnormalities. One area of particular concern involves the asymmetries in movement and balance that can accompany certain types of injuries and/or the associated devices used in the repair and rehabilitation process (e.g., the use of splints and casts), which can often increase the chances of falls and additional injuries. This device has the capacity to monitor a patient during the rehabilitation process after injury or operation, increasing the patient's access to healthcare while decreasing the number of visits to the patient's clinician. The sensor device may thereby improve the quality of the patient's care, particularly in rural areas where access to the clinician could be limited, while simultaneously decreasing the overall cost associated with the patient's care. The device consists of nine interconnected accelerometer/gyroscope/compass chips (9-DOF IMU, Adafruit, New York, NY). The sensors attach to and are used to determine the orientation and acceleration of the patient's lower abdomen, C7 vertebra (lower neck), L1 vertebra (middle back), anterior side of each thigh and tibia, and dorsal side of each foot. In addition, pressure sensors are embedded in shoe inserts, with one sensor (ESS301, Tekscan, Boston, MA) beneath the heel and three sensors (Interlink 402, Interlink Electronics, Westlake Village, CA) beneath the metatarsal bones of each foot. These sensors measure the distribution of the weight applied to each foot as well as stride duration. A small microcontroller (Arduino Mega, Arduino, Ivrea, Italy) is used to collect data from these sensors in a CSV file. MATLAB is then used to analyze the data and output the hip, knee, ankle, and trunk angles projected on the sagittal plane. The open-source program Processing is then used to generate an animation of the patient's gait. The accuracy of the sensors was validated through comparison to goniometric measurements (±2° error). The sensor device was also shown to have sufficient sensitivity to observe various gait abnormalities. Several patients used the sensor device, and the data collected from each represented the patient's movements. Further, the sensors were found to have the ability to observe gait abnormalities caused by the addition of a small amount of weight (4.5 - 9.1 kg) to one side of the patient. The user-friendly interface and portability of the sensor device will help to construct a bridge between patients and their clinicians with fewer necessary inpatient visits.
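A minimal sketch of the post-processing step (a Python stand-in for the MATLAB analysis; the CSV column names are hypothetical): estimating sagittal-plane knee flexion as the difference between the thigh and tibia pitch angles logged by the IMUs.

```python
# Toy stand-in for the angle computation on the logged CSV (column names hypothetical).
import csv, io

# two sample rows standing in for the microcontroller's CSV log
log = io.StringIO("time_s,thigh_pitch_deg,tibia_pitch_deg\n0.00,12.5,4.1\n0.01,13.0,3.9\n")

def knee_flexion(row):
    """Knee angle (deg) projected on the sagittal plane."""
    return float(row["thigh_pitch_deg"]) - float(row["tibia_pitch_deg"])

for row in csv.DictReader(log):
    print(row["time_s"], round(knee_flexion(row), 1))
```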

Keywords: biomedical sensing, gait analysis, outpatient, rehabilitation

Procedia PDF Downloads 273
24605 The Impact of System and Data Quality on Organizational Success in the Kingdom of Bahrain

Authors: Amal M. Alrayes

Abstract:

Data and system quality play a central role in organizational success, and the quality of any existing information system has a major influence on the effectiveness of overall system performance. Given the importance of system and data quality to an organization, it is relevant to highlight their importance for organizational performance in the Kingdom of Bahrain. This research aims to discover whether system quality and data quality are related, and to study the impact of system and data quality on organizational success. A theoretical model based on previous research is used to show the relationship between data and system quality and organizational impact. We hypothesize, first, that system quality is positively associated with organizational impact; secondly, that system quality is positively associated with data quality; and finally, that data quality is positively associated with organizational impact. A questionnaire was conducted among public and private organizations in the Kingdom of Bahrain. The results show that there is a strong association between data and system quality that affects organizational success.

Keywords: data quality, performance, system quality, Kingdom of Bahrain

Procedia PDF Downloads 477
24604 Cloud Computing in Data Mining: A Technical Survey

Authors: Ghaemi Reza, Abdollahi Hamid, Dashti Elham

Abstract:

Cloud computing poses a diversity of challenges for data mining operations, arising from the dynamic structure of data distribution as opposed to the typical database scenarios of conventional architectures. Due to the immense number of users seeking data on a daily basis, there are serious security concerns for cloud providers as well as for data providers who put their data in the cloud computing environment. Big data analytics uses compute-intensive data mining algorithms (hidden Markov models, MapReduce parallel programming, the Mahout project, the Hadoop distributed file system, K-Means and K-Medoids, Apriori) that require efficient high-performance processors to produce timely results, and data mining algorithms are needed to solve for or optimize the model parameters. The challenges such an operation has to encounter are establishing successful transactions with the existing virtual machine environment and keeping the databases under control. Several factors have led to the shift from normal or centralized mining to distributed data mining. The approach is offered as SaaS, which uses multi-agent systems for implementing the different tasks of the system. There are still some open problems in data mining based on cloud computing, including the design and selection of data mining algorithms.

Keywords: cloud computing, data mining, computing models, cloud services

Procedia PDF Downloads 463
24603 Cross-border Data Transfers to and from South Africa

Authors: Amy Gooden, Meshandren Naidoo

Abstract:

Genetic research and transfers of big data are not confined to a particular jurisdiction, but there is a lack of clarity regarding the legal requirements for importing and exporting such data. Using direct-to-consumer genetic testing (DTC-GT) as an example, this research assesses the status of data sharing into and out of South Africa (SA). While SA law covers the sending of genetic data out of SA, prohibiting such transfer unless a legal ground exists, the position regarding genetic data coming into the country depends on the laws of the country from which it is sent, making the legal position less clear.

Keywords: cross-border, data, genetic testing, law, regulation, research, sharing, South Africa

Procedia PDF Downloads 117