Search results for: data reduction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28013


26843 A Review of Travel Data Collection Methods

Authors: Muhammad Awais Shafique, Eiji Hato

Abstract:

Household trip data is of crucial importance for managing present transportation infrastructure as well as for planning and designing future facilities. It also provides the basis for new policies implemented under Transportation Demand Management. The methods used for household trip data collection have changed with the passage of time, starting with conventional face-to-face or paper-and-pencil interviews and arriving at the recent approach of employing smartphones. This study summarizes the step-wise evolution of travel data collection methods. It provides a comprehensive review of the topic for readers interested in the changing trends in the data collection field.

Keywords: computer, smartphone, telephone, travel survey

Procedia PDF Downloads 295
26842 Gypsum Composites with CDW as Raw Material

Authors: R. Santos Jiménez, A. San-Antonio-González, M. del Río Merino, M. González Cortina, C. Viñas Arrebola

Abstract:

On average, Europe generates around 890 million tons of construction and demolition waste (CDW) per year, and only 50% of this CDW is recycled. This is far from the objectives set in the European Directive for 2020, and aware of this situation, European countries are implementing national policies to prevent avoidable waste and to promote measures to increase recycling and recovery. In Spain, one of these measures has been the development of a CDW recycling guide for the manufacture of mortar, concrete, bricks, and lightweight aggregates. However, there is still not enough information on the possibility of incorporating CDW materials into the manufacture of gypsum products. In view of the foregoing, the Universidad Politécnica de Madrid is creating a database with information on the possibility of incorporating CDW materials into the manufacture of gypsum products. The objective of this study is to improve this database by analysing the feasibility of incorporating two different CDW into a gypsum matrix: ceramic brick waste (perforated brick and double hollow brick), and extruded polystyrene (XPS) waste. Results show that it is possible to incorporate up to 25% of ceramic waste and 4% of XPS waste over the weight of gypsum in a gypsum matrix. Furthermore, with the addition of ceramic waste, an 8% increase in surface hardness and a 25% reduction in capillary water absorption can be obtained. On the other hand, with the addition of XPS, a 26% reduction in density and a 37% improvement in thermal conductivity can be obtained.

Keywords: CDW, waste materials, ceramic waste, XPS, construction materials, gypsum

Procedia PDF Downloads 490
26841 The Effectiveness of Sulfate Reducing Bacteria in Minimizing Methane and Sludge Production from Palm Oil Mill Effluent (POME)

Authors: K. Abdul Halim, E. L. Yong

Abstract:

The palm oil industry is a major revenue earner in Malaysia, although the growth of the industry is synonymous with massive production of agro-industrial wastewater. Through the oil extraction processes, palm oil mill effluent (POME) constitutes the largest liquid waste generated. Due to its high content of organic compounds, POME can cause inland water pollution if discharged untreated into watercourses, as well as affect the aquatic ecosystem. For more than 20 years, Malaysia has adopted the conventional lagoon system of biological treatment. Besides having difficulties in complying with the discharge standard, this system needs a large built-up area and long retention times. Although the anaerobic digester is more favorable, this process comes with enormous volumes of sludge and methane gas, demanding attention from the mill operators. In order to reduce sludge production, denitrifiers are to be removed first. Sulfate reducing bacteria have shown the capability to inhibit the growth of methanogens, which is expected to substantially reduce both the sludge and methane production in anaerobic digesters. In this paper, the effectiveness of sulfate reducing bacteria in minimizing sludge and methane will be examined.

Keywords: methane reduction, palm oil mill effluent, sludge minimization, sulfate reducing bacteria, sulfate reduction

Procedia PDF Downloads 418
26840 A Business-to-Business Collaboration System That Promotes Data Utilization While Encrypting Information on the Blockchain

Authors: Hiroaki Nasu, Ryota Miyamoto, Yuta Kodera, Yasuyuki Nogami

Abstract:

To promote initiatives such as Industry 4.0 and Society 5.0, it is important to connect and share data so that every member can trust it. Blockchain (BC) technology is currently attracting attention as the most advanced tool for this and has been used in the financial field, among others. However, data collaboration using BC has not progressed sufficiently among companies on the manufacturing supply chain, which handle sensitive data such as product quality, manufacturing conditions, etc. There are two main reasons why data utilization is not sufficiently advanced in the industrial supply chain. The first reason is that manufacturing information is top secret and a source from which companies generate profits; it is difficult to disclose data even between companies with transactions in the supply chain. In blockchain mechanisms such as Bitcoin that use PKI (Public Key Infrastructure), the plaintext must be shared between the companies in order to confirm the identity of the company that sent the data. The other reason is that the merits (scenarios) of data collaboration between companies have not been concretely specified in the industrial supply chain. To address these problems, this paper proposes a Business-to-Business (B2B) collaboration system using homomorphic encryption and BC technology. Using the proposed system, each company on the supply chain can exchange confidential information as encrypted data and utilize the data for its own business. In addition, this paper considers a scenario focusing on quality data, which has been difficult to share because it is top secret. In this scenario, we show an implementation scheme and the benefit of concrete data collaboration by proposing a comparison protocol that can grasp changes in quality while hiding the numerical value of the quality data.
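The abstract does not spell out the comparison protocol, but the core idea of computing on encrypted quality data can be illustrated with a toy additively homomorphic scheme. The sketch below is a minimal Paillier cryptosystem with deliberately tiny primes (real deployments use keys of 2048 bits or more); it is only an illustration of homomorphic aggregation, not the authors' protocol:

```python
import random
from math import gcd

def keygen(p, q):
    """Toy Paillier key generation (g = n + 1 variant)."""
    n = p * q
    n2 = n * n
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
    g = n + 1
    # mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)          # fresh randomness per ciphertext
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

pub, priv = keygen(17, 19)   # toy primes for illustration only
c_a = encrypt(pub, 42)       # e.g. company A's quality score, kept secret
c_b = encrypt(pub, 7)        # company B's quality score
c_sum = (c_a * c_b) % (pub[0] ** 2)   # multiplying ciphertexts adds plaintexts
assert decrypt(pub, priv, c_sum) == 49
```

Because E(a)·E(b) decrypts to a+b, a supply-chain partner can aggregate encrypted quality scores without ever seeing the plaintext values.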

Keywords: business to business data collaboration, industrial supply chain, blockchain, homomorphic encryption

Procedia PDF Downloads 114
26839 Multivariate Assessment of Mathematics Test Scores of Students in Qatar

Authors: Ali Rashash Alzahrani, Elizabeth Stojanovski

Abstract:

Data on various aspects of education are collected regularly at the institutional and government level. In Australia, for example, students at various levels of schooling undertake examinations in numeracy and literacy as part of NAPLAN testing, enabling longitudinal assessment of such data as well as comparisons between schools and states within Australia. Another source of educational data collected internationally is the PISA study, which collects data from several countries when students are approximately 15 years of age and enables comparisons of performance in science, mathematics, and English between countries, as well as ranking of countries based on performance in these standardised tests. As well as student and school outcomes based on the tests taken as part of the PISA study, a wealth of other data is collected in the study, including parental demographic data and data related to the teaching strategies used by educators. Overall, an abundance of educational data is available which has the potential to help improve educational attainment and the teaching of content, and thereby learning outcomes. A multivariate assessment of such data enables multiple variables to be considered simultaneously and will be used in the present study to help develop profiles of students based on performance in mathematics, using data obtained from the PISA study.
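As a sketch of how multivariate profiles can be formed from test data, the example below clusters synthetic (score, attitude) pairs with a from-scratch k-means. The variables, values, and starting centres are invented for illustration and are not PISA data:

```python
def kmeans(points, centers, iters=100):
    """Plain k-means: alternate assignment and mean-update steps."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            # assign p to the nearest centre (squared Euclidean distance)
            j = min(range(len(centers)),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            clusters[j].append(p)
        # recompute each centre as the mean of its cluster
        new = [tuple(sum(col) / len(c) for col in zip(*c)) if c else centers[i]
               for i, c in enumerate(clusters)]
        if new == centers:      # converged
            break
        centers = new
    return centers, clusters

# synthetic (maths score, self-efficacy) pairs forming two made-up profiles
low = [(400 + 5 * i, 0.20 + 0.01 * i) for i in range(10)]
high = [(600 + 5 * i, 0.70 + 0.01 * i) for i in range(10)]
pts = low + high
centers, clusters = kmeans(pts, centers=[pts[0], pts[-1]])
assert sorted(len(c) for c in clusters) == [10, 10]
```

Each recovered centre summarises one student profile (here, a lower-scoring and a higher-scoring group).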

Keywords: cluster analysis, education, mathematics, profiles

Procedia PDF Downloads 109
26838 Tungsten-Based Powders Produced in Plasma Systems

Authors: Andrey V. Samokhin, Nikolay V. Alekseev, Mikhail A. Sinaiskii

Abstract:

The report presents the results of R&D on the plasma-chemical production of W, W-Cu, and W-Ni-Fe nanopowders, as well as spherical micropowders of these compounds, for use in modern 3D printing technologies. Plasma-chemical synthesis of the nanopowders is based on the reduction of tungsten oxide powders in a stream of hydrogen-containing low-temperature thermal plasma generated in an electric arc plasma torch. The synthesis of W-Cu and W-Ni-Fe nanocomposites is carried out by reducing a mixture of the metal oxides. Using the synthesized tungsten-based nanocomposite powders, spherical composite micropowders with a submicron structure can be manufactured by spray-drying granulation of a nanopowder suspension and subsequent densification and spheroidization of the granules by melting in a low-temperature thermal plasma flow. DC arc plasma systems are used both for the synthesis of the nanopowders and for the spheroidization of the microgranules. The plasma systems have a capacity of up to 1 kg/h for nanopowder and up to 5 kg/h for spheroidized powder. All synthesized nanopowders consist of aggregated particles with sizes below 100 nm, and the nanoparticles of the W-Cu and W-Ni-Fe composites have core (W)-shell (Cu or Ni-Fe) structures. The resulting dense spherical microparticles, 20-60 microns in size, have a submicron structure with a uniform distribution of metals over the particle volume. The produced tungsten-based nano- and spherical micropowders can be used to develop new materials and manufacture products using advanced modern technologies.

Keywords: plasma, powders, production, tungsten-based

Procedia PDF Downloads 104
26835 Dataset Quality Index: Development of Composite Indicator Based on Standard Data Quality Indicators

Authors: Sakda Loetpiparwanich, Preecha Vichitthamaros

Abstract:

Nowadays, poor data quality is considered one of the major costs for a data project. A data project with data quality awareness devotes almost as much time to data quality processes, while a data project without data quality awareness negatively impacts financial resources, efficiency, productivity, and credibility. One of the processes that takes a long time is defining the expectations and measurements of data quality, because expectations differ according to the purpose of each data project. This is especially true for big data projects, which may involve many datasets and stakeholders and therefore take a long time to discuss and define quality expectations and measurements. Therefore, this study aimed at developing meaningful indicators that describe the overall data quality of each dataset, to allow quick comparison and prioritization. The objectives of this study were to: (1) develop practical data quality indicators and measurements, (2) develop data quality dimensions based on statistical characteristics, and (3) develop a composite indicator that can describe the overall data quality of each dataset. The sample consisted of more than 500 datasets from public sources obtained by random sampling. After the datasets were collected, five steps were followed to develop the Dataset Quality Index (SDQI). First, we defined standard data quality expectations. Second, we found indicators that could directly measure the data within datasets. Third, the indicators were aggregated into dimensions using factor analysis. Next, the indicators and dimensions were weighted by the effort required for the data preparation process and by usability. Finally, the dimensions were aggregated into the composite indicator. The results of these analyses showed that: (1) the development yielded ten useful indicators and measurements; (2) based on statistical characteristics, the ten indicators could be reduced to four dimensions;
(3) the resulting composite indicator, the SDQI, can describe the overall quality of each dataset and can separate datasets into three levels: Good Quality, Acceptable Quality, and Poor Quality. In conclusion, the SDQI provides an overall description of data quality within datasets and a meaningful composition. The SDQI can be used to assess all data in a data project, for effort estimation, and for prioritization. The SDQI also works well with the Agile method, by using the SDQI for assessment in the first sprint; after passing the initial evaluation, more specific data quality indicators can be added in the next sprint.
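The aggregation pipeline described above (indicators → weighted dimensions → composite index → quality level) can be sketched as follows. The indicator names, weights, and level cut-offs are illustrative assumptions, since the abstract does not publish the actual ones:

```python
# hypothetical per-dataset indicator scores, each normalized to [0, 1]
indicators = {
    "completeness": 0.95,
    "uniqueness":   0.90,
    "validity":     0.60,
    "consistency":  0.70,
    "timeliness":   0.40,
}
# weights reflecting preparation effort and usability (assumed equal here)
weights = {name: 1 / len(indicators) for name in indicators}

def composite(scores, weights):
    """Weighted linear aggregation into a single index."""
    return sum(scores[k] * weights[k] for k in scores)

def quality_level(index):
    """Map the composite index to the three levels (cut-offs assumed)."""
    if index >= 0.80:
        return "Good Quality"
    if index >= 0.50:
        return "Acceptable Quality"
    return "Poor Quality"

sdqi = composite(indicators, weights)
assert abs(sdqi - 0.71) < 1e-9
assert quality_level(sdqi) == "Acceptable Quality"
```

A single number per dataset is what makes the quick comparison and prioritization described in the abstract possible.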

Keywords: data quality, dataset quality, data quality management, composite indicator, factor analysis, principal component analysis

Procedia PDF Downloads 121
26836 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested using predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when faced with large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured), and its variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of calculation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and then a version of the extended algorithm is defined in order to make it applicable to huge quantities of data.
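The abstract does not list the concrete changes to CART, but the step that dominates its cost (and the natural target for parallelization) is the search for the best split, which for classification minimizes the size-weighted Gini impurity of the child nodes. A minimal sketch of that step on one numeric feature with invented data:

```python
def gini(labels):
    """Gini impurity of a set of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(xs, ys):
    """Best binary split on a single numeric feature: the threshold that
    minimizes the size-weighted Gini impurity of the two child nodes."""
    best_t, best_score = None, float("inf")
    n = len(ys)
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        score = len(left) / n * gini(left) + len(right) / n * gini(right)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

# toy data: the feature separates the two classes perfectly at x <= 3
xs = [1, 2, 3, 10, 11, 12]
ys = [0, 0, 0, 1, 1, 1]
t, score = best_split(xs, ys)
assert (t, score) == (3, 0.0)
```

Because each candidate threshold (and each feature) is scored independently, this loop is exactly the kind of work that can be distributed across workers, which is what a big-data extension of CART exploits.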

Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm

Procedia PDF Downloads 128
26835 Facial Pose Classification Using Hilbert Space Filling Curve and Multidimensional Scaling

Authors: Mekamı Hayet, Bounoua Nacer, Benabderrahmane Sidahmed, Taleb Ahmed

Abstract:

Pose estimation is an important task in computer vision. Though the majority of existing solutions provide good accuracy, they are often overly complex and computationally expensive. From this perspective, we propose the use of dimensionality reduction techniques to address the problem of facial pose estimation. First, a face image is converted into a one-dimensional time series using the Hilbert space filling curve; the approach then converts these time series data into a symbolic representation. Furthermore, a distance matrix is calculated between the symbolic series of an input learning dataset of images to generate classifiers of frontal vs. profile face pose. The proposed method is evaluated on three public datasets. Experimental results have shown that our approach is able to achieve a correct classification rate exceeding 97% with the K-NN algorithm.
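The first step above (flattening an image into a 1-D series along the Hilbert space filling curve) can be sketched with the classic iterative coordinate-to-distance mapping. The "image" here is just a small grid of indices, not real face data:

```python
def xy2d(n, x, y):
    """Map cell (x, y) of an n x n grid (n a power of two) to its position
    along the Hilbert curve (standard iterative algorithm)."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:          # rotate the quadrant so the recursion lines up
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        s //= 2
    return d

# flatten a 4x4 "image" by visiting its pixels in Hilbert order
order = sorted(((x, y) for x in range(4) for y in range(4)),
               key=lambda p: xy2d(4, *p))
# every pixel appears exactly once in the 1-D series
assert sorted(xy2d(4, x, y) for x in range(4) for y in range(4)) == list(range(16))
assert order[:4] == [(0, 0), (1, 0), (1, 1), (0, 1)]
```

Neighbouring pixels tend to stay close together in the resulting series, which is why the Hilbert curve is preferred over a plain row-by-row scan before the symbolic-representation step.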

Keywords: machine learning, pattern recognition, facial pose classification, time series

Procedia PDF Downloads 336
26834 Canopy Temperature Acquired from Daytime and Nighttime Aerial Data as an Indicator of Trees’ Health Status

Authors: Agata Zakrzewska, Dominik Kopeć, Adrian Ochtyra

Abstract:

The growing number of new cameras, sensors, and research methods allows for a broader application of thermal data in remote sensing vegetation studies. The aim of this research was to check whether thermal infrared data with a spectral range of 3.6-4.9 μm, obtained during the day and the night, can be used to assess the health condition of selected species of deciduous trees in an urban environment. For this purpose, research was carried out in the city center of Warsaw (Poland) in 2020. During the airborne data acquisition, thermal data, laser scanning data, and orthophoto map images were collected. Synchronously with the airborne data, ground reference data were obtained for 617 studied trees of five species (Acer platanoides, Acer pseudoplatanus, Aesculus hippocastanum, Tilia cordata, and Tilia × euchlora) in different health condition states. The results were as follows: (i) healthy trees are cooler than trees in poor condition and dying trees, in both the daytime and the nighttime data; (ii) the difference in mean canopy temperature between healthy and dying trees was 1.06 °C in the nighttime data and 3.28 °C in the daytime data; (iii) condition classes differed significantly on both daytime and nighttime thermal data, but only in the daytime data did all condition classes differ statistically significantly from each other. In conclusion, aerial thermal data can be considered an alternative to hyperspectral data as a method of assessing the health condition of trees in an urban environment, especially data obtained during the day, which differentiate condition classes better than data obtained at night. A method based on the fusion of thermal infrared and laser scanning data could be a quick and efficient solution for identifying trees in poor health that should be visually checked in the field.

Keywords: middle wave infrared, thermal imagery, tree discoloration, urban trees

Procedia PDF Downloads 100
26833 Frequency Decomposition Approach for Sub-Band Common Spatial Pattern Methods for Motor Imagery Based Brain-Computer Interface

Authors: Vitor M. Vilas Boas, Cleison D. Silva, Gustavo S. Mafra, Alexandre Trofino Neto

Abstract:

Motor imagery (MI) based brain-computer interfaces (BCI) use event-related (de)synchronization (ERD/ERS), typically recorded using electroencephalography (EEG), to translate brain electrical activity into control commands. To mitigate undesirable artifacts and measurement noise in EEG signals, methods based on band-pass filters defined over a specific frequency band (e.g., 8-30 Hz), such as Infinite Impulse Response (IIR) filters, are typically used. Spatial techniques, such as Common Spatial Patterns (CSP), are also used to estimate the variance of the filtered signal and extract features that characterize the imagined movement. The effectiveness of CSP depends on the subject's discriminative frequency band, and approaches based on decomposing the band of interest into sub-bands with smaller frequency ranges (SBCSP) have been suggested for EEG signal classification. However, despite providing good results, the SBCSP approach generally increases the computational cost of the filtering step in MI-based BCI systems. This paper proposes the use of the Fast Fourier Transform (FFT) algorithm in the filtering stage of MI-based BCIs that implement SBCSP. The goal is to apply the FFT algorithm to reduce the computational cost of the processing step of these systems and to make them more efficient without compromising classification accuracy. The proposal is based on representing the EEG signals as a matrix of coefficients resulting from the frequency decomposition performed by the FFT, which is then submitted to the SBCSP process. The SBCSP structure divides the band of interest, initially defined between 0 and 40 Hz, into a set of 33 sub-bands spanning specific frequency ranges, which are processed in parallel, each by a CSP filter and an LDA classifier. A Bayesian meta-classifier then represents the LDA outputs of each sub-band as scores, organizes them into a single vector, and uses this vector to train a global SVM classifier.
Initially, the public EEG dataset IIa of BCI Competition IV is used to validate the approach. The first contribution of the proposed method is that, in addition to being more compact (it has a 68% smaller dimension than the original signal), the resulting FFT matrix retains the signal information relevant to class discrimination. In addition, the results showed an average reduction of 31.6% in computational cost relative to filtering methods based on IIR filters, suggesting the efficiency of the FFT when applied in the filtering step. Finally, the frequency decomposition approach significantly improves the overall classification rate of the system compared to the commonly used filtering, going from 73.7% with IIR to 84.2% with FFT. The accuracy improvement of over 10% and the reduction in computational cost denote the potential of the FFT for EEG signal filtering in the context of MI-based BCIs implementing SBCSP. Tests with other datasets are currently being performed to reinforce these conclusions.
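The idea of replacing a bank of IIR band-pass filters with one frequency decomposition can be sketched as follows. A naive DFT stands in for a library FFT so the sketch is self-contained, and the signal, sampling rate, and sub-band edges are invented for illustration:

```python
import cmath
import math

def dft(x):
    """Naive O(n^2) discrete Fourier transform (a stand-in for an FFT here)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

fs = 128                               # assumed sampling rate, Hz
t = [i / fs for i in range(fs)]        # one 1-second epoch
# synthetic "EEG": a 10 Hz mu-band oscillation plus a weaker 35 Hz component
sig = [math.sin(2 * math.pi * 10 * ti) + 0.3 * math.sin(2 * math.pi * 35 * ti)
       for ti in t]

spec = dft(sig)
# with a 1 s window, bin k corresponds to k Hz; a sub-band is then just a
# slice of coefficients (e.g. 8-12 Hz) instead of a separate IIR filter pass
mu_band = [abs(spec[k]) for k in range(8, 13)]
peak_hz = max(range(len(spec) // 2), key=lambda k: abs(spec[k]))
assert peak_hz == 10
assert max(mu_band) == abs(spec[10])
```

Each sub-band slice then feeds its own CSP+LDA stage, so the cost of 33 filter passes is replaced by one transform plus array slicing, which is the saving the paper reports.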

Keywords: brain-computer interfaces, fast Fourier transform algorithm, motor imagery, sub-band common spatial patterns

Procedia PDF Downloads 110
26832 Hybrid Genetic Approach for Solving Economic Dispatch Problems with Valve-Point Effect

Authors: Mohamed I. Mahrous, Mohamed G. Ashmawy

Abstract:

A hybrid genetic algorithm (HGA) is proposed in this paper to determine the economic scheduling of electric power generation over a fixed time period under various system and operational constraints. The proposed technique can outperform conventional genetic algorithms (CGAs) in the sense that the HGA makes it possible both to improve the quality of the solution and to reduce the computing expense. In contrast, even a carefully designed CGA can only balance the exploration and the exploitation of the search effort, which means that an increase in the accuracy of a solution can only occur at the sacrifice of convergence speed, and vice versa; it is unlikely that both can be improved simultaneously. The proposed hybrid scheme is developed in such a way that a simple GA acts as a base-level search, which quickly directs the search towards the optimal region, and a local search method (a pattern search technique) is then employed for fine tuning. The aim of the strategy is to achieve the cost reduction within a reasonable computing time. The effectiveness of the proposed hybrid technique is verified on two real public electricity supply systems with 13 and 40 generator units, respectively. The simulation results obtained with the HGA for the two real systems are very encouraging with regard to the computational expense and the cost reduction of power generation.
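The two-level scheme (a GA for coarse global search, then a pattern search for fine tuning) can be sketched on a one-dimensional surrogate cost with a valve-point-style ripple. The cost function, bounds, and GA settings are invented for illustration and are not the paper's 13- or 40-unit systems:

```python
import math
import random

def cost(x):
    # surrogate cost: smooth quadratic plus a rectified-sine "valve-point" ripple
    return (x - 2.0) ** 2 + 0.5 * abs(math.sin(4.0 * x))

def ga(f, lo, hi, pop_size=30, gens=40, seed=1):
    """Simple real-coded GA: elitism, arithmetic crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)
        elite = pop[: pop_size // 2]          # keep the better half
        children = [(rng.choice(elite) + rng.choice(elite)) / 2 + rng.gauss(0, 0.1)
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=f)

def pattern_search(f, x, step=0.5, tol=1e-6):
    """Compass/pattern search: probe both directions, shrink the step on failure."""
    while step > tol:
        if f(x + step) < f(x):
            x += step
        elif f(x - step) < f(x):
            x -= step
        else:
            step /= 2
    return x

x_ga = ga(cost, -10.0, 10.0)           # base-level search toward the optimal region
x_hybrid = pattern_search(cost, x_ga)  # local fine tuning
# the pattern search only accepts improvements, so the hybrid never loses
assert cost(x_hybrid) <= cost(x_ga)
```

The division of labour mirrors the paper's design: the GA need not be run to full convergence, because the cheap local search recovers the remaining accuracy.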

Keywords: genetic algorithms, economic dispatch, pattern search

Procedia PDF Downloads 424
26831 Amino Acid Based Biodegradable Poly (Ester-Amide)s and Their Potential Biomedical Applications as Drug Delivery Containers and Antibacterial

Authors: Nino Kupatadze, Tamar Memanishvili, Natia Ochkhikidze, David Tugushi, Zaal Kokaia, Ramaz Katsarava

Abstract:

Amino acid-based biodegradable poly(ester-amide)s (PEAs) have gained considerable interest as promising materials for numerous biomedical applications. These polymers reveal high biocompatibility and easily form both small particles suitable for delivering various biologicals and elastic bio-erodible films serving as matrices for constructing antibacterial coatings. In the present work, we have demonstrated the potential of PEAs for two applications: (1) cell therapy for stroke, as vehicles for the delivery and sustained release of growth factors, and (2) bactericidal coatings that prevent biofilm formation, applicable in infected wound management. Stroke remains the main cause of adult disability, with limited treatment options. Although stem cell therapy is a promising strategy, it still requires improvement of cell survival, differentiation, and tissue modulation. Recently, microspheres (MPs) made of biodegradable polymers have gained significant attention for providing the necessary support for transplanted cells. To investigate this strategy in the cell therapy of stroke, MPs loaded with the transcription factors Wnt3A/BMP4 were prepared. These proteins have been shown to mediate the maturation of cortical neurons. We have suggested that implantation of these materials could create a suitable microenvironment for implanted cells. Particles with a spherical shape, a porous surface, and a size of 5-40 μm (monitored by scanning electron microscopy) were made on the basis of an original PEA composed of adipic acid, L-phenylalanine, and 1,4-butanediol. After 4 months of transplantation of the MPs in rodent brain, no inflammation was observed. Additionally, the factors were successfully released from the MPs and affected neuronal cell differentiation in vitro. The in vivo study using loaded MPs is in progress. Another severe problem in biomedicine is protecting surgical devices from biofilm formation.
Antimicrobial polymeric coatings are the most effective “shields” to protect surfaces/devices from biofilm formation. Among matrices for constructing such coatings, preference should be given to bio-erodible polymers. Coatings of this type play the role of an “unstable seating” that does not allow bacteria to occupy the surface. In other words, a bio-erodible coating is an uncomfortable shelter for bacteria that, along with releasing “killers of bacteria”, should prevent the formation of biofilm. For this purpose, we selected an original biodegradable PEA composed of L-leucine, 1,6-hexanediol, and sebacic acid as a bio-erodible matrix, and nanosilver (AgNPs) as a bactericidal agent (“killer of bacteria”). Such a nanocomposite material is also promising in the treatment of superficial wounds and ulcers. The solubility of the PEA in ethanol allows AgNO3 to be reduced to NPs directly in solution, where the solvent serves as a reducing agent and the PEA serves as an NP stabilizer. Photochemical reduction was selected as the basic method for forming the NPs. The obtained AgNPs were characterized by UV spectroscopy, transmission electron microscopy (TEM), and dynamic light scattering (DLS). According to the UV and TEM data, the photochemical reduction resulted in spherical AgNPs with a wide particle size distribution and a high contribution of particles below 10 nm, which are known to be responsible for the bactericidal activity of AgNPs. The DLS study showed that the average size of the nanoparticles formed after photo-reduction in ethanol solution was ca. 50 nm.

Keywords: biodegradable polymers, microparticles, nanocomposites, stem cell therapy, stroke

Procedia PDF Downloads 382
26830 Placebo Analgesia in Older Age: Evidence from Event-Related Potentials

Authors: Angelika Dierolf, K. Rischer, A. Gonzalez-Roldan, P. Montoya, F. Anton, M. Van der Meulen

Abstract:

Placebo analgesia is a powerful cognitive endogenous pain modulation mechanism with high relevance for pain treatment. Older people would especially benefit from non-pharmacological pain interventions, since this age group is disproportionately affected by acute and chronic pain, while pharmacological treatments are less suitable due to polypharmacy and age-related changes in drug metabolism. Although aging is known to affect neurobiological and physiological aspects of pain perception, for example through changes in pain threshold and pain tolerance, its effects on cognitive pain modulation strategies, including placebo analgesia, have hardly been investigated so far. In the present study, we are assessing placebo analgesia in 35 older adults (60 years and older) and 35 younger adults (between 18 and 35 years). Acute pain was induced with short transdermal electrical pulses to the inner forearm, using a concentric stimulating electrode. Stimulation intensities were individually adjusted to each participant’s threshold. Next to the stimulation site, we applied sham transcutaneous electrical nerve stimulation (TENS). Participants were informed that sometimes the TENS device would be switched on (placebo condition) and sometimes it would be switched off (control condition); in reality, it was always switched off. Participants received alternating blocks of painful stimuli in the placebo and control conditions and were asked to rate the intensity and unpleasantness of each stimulus on a visual analog scale (VAS). Pain-related evoked potentials were recorded with a 64-channel EEG. Preliminary results show a reduced placebo effect in older compared to younger adults in both the behavioral and the neurophysiological data. Older people experienced less subjective pain reduction under sham TENS treatment compared to younger adults, as evidenced by the VAS ratings. The N1 and P2 event-related potential components were generally reduced in the older group.
While younger adults showed a reduced N1 and P2 under sham TENS treatment, this reduction was considerably smaller in older people. This reduced placebo effect in the older group suggests that cognitive pain modulation is altered in aging and may at least partly explain why older adults experience more pain. Our results highlight the need for a better understanding of the efficacy of non-pharmacological pain treatments in older adults and how these can be optimized to meet the specific requirements of this population.

Keywords: placebo analgesia, aging, acute pain, TENS, EEG

Procedia PDF Downloads 130
26829 Water Efficiency: Greywater Recycling

Authors: Melissa Lubitz

Abstract:

Water scarcity is one of the crucial challenges of our time. There needs to be a focus on creating a society where people and nature flourish, regardless of climatic conditions. One of the solutions we can look to is decentralized greywater recycling. The vision is simple: every building has its own water source, namely greywater from the bath, shower, sink, and washing machine. By treating this in the home, you can save 25-45% of potable water use and wastewater production, with a reduction in energy consumption and CO2 emissions. This reusable water is clean and safe to be used for toilet flushing, the washing machine, and outdoor irrigation. Companies like Hydraloop have been committed to the greywater recycle-ready building concept for years. This means that drinking water conservation and water reuse are included as standards in the design of all new buildings. Sustainability and renewal go hand in hand. This vision includes not only optimizing water savings and waste reduction but also forging strong partnerships that bring this ambition to life. Together with regulators, municipalities, and builders, a sustainable and water-conscious future is pursued. This is an opportunity to be part of a movement that is making a difference. By pushing this initiative forward, we become part of a growing community that resists dehydration, believes in sustainability, and is committed to a living environment at the forefront of change: sustainable living, where saving water is the norm and where we shape the future together.

Keywords: greywater, wastewater treatment, water conservation, circular water society

Procedia PDF Downloads 46
26828 Tip-Apex Distance as a Long-Term Risk Factor for Hospital Readmission Following Intramedullary Fixation of Intertrochanteric Fractures

Authors: Brandon Knopp, Matthew Harris

Abstract:

Purpose: Tip-apex distance (TAD) has long been discussed as a metric for determining risk of failure in the fixation of peritrochanteric fractures. TAD measurements over 25 millimeters (mm) have been associated with higher rates of screw cut out and other complications in the first several months after surgery. However, there is limited evidence for the efficacy of this measurement in predicting the long-term risk of negative outcomes following hip fixation surgery. The purpose of our study was to investigate risk factors including TAD for hospital readmission, loss of pre-injury ambulation and development of complications within 1 year after hip fixation surgery. Methods: A retrospective review of proximal hip fractures treated with single screw intramedullary devices between 2016 and 2020 was performed at a 327-bed regional medical center. Patients included had a postoperative follow-up of at least 12 months or surgery-related complications developing within that time. Results: 44 of the 67 patients in this study met the inclusion criteria with adequate follow-up post-surgery. There was a total of 10 males (22.7%) and 34 females (77.3%) meeting inclusion criteria with a mean age of 82.1 (± 12.3) at the time of surgery. The average TAD in our study population was 19.57mm and the average 1-year readmission rate was 15.9%. 3 out of 6 patients (50%) with a TAD > 25mm were readmitted within one year due to surgery-related complications. In contrast, 3 out of 38 patients (7.9%) with a TAD < 25mm were readmitted within one year due to surgery-related complications (p=0.0254). Individual TAD measurements, averaging 22.05mm in patients readmitted within 1 year of surgery and 19.18mm in patients not readmitted within 1 year of surgery, were not significantly different between the two groups (p=0.2113). 
Conclusions: Our data indicate a significant improvement in hospital readmission rates up to one year after hip fixation surgery in patients with a TAD < 25 mm, with an absolute decrease in readmissions of over 40 percentage points (50% vs. 7.9%). This result builds upon past investigations by extending the follow-up time to 1 year after surgery and utilizing hospital readmission as a metric for surgical success. Given the well-documented physical and financial costs of hospital readmission after hip surgery, our study highlights keeping TAD below 25 mm as an effective method of improving patient outcomes and reducing financial costs to patients and medical institutions. No relationship was found between TAD measurements and secondary outcomes, including loss of pre-injury ambulation and development of complications.
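The reported group comparison is consistent with a two-sided Fisher's exact test on the 2×2 readmission table; the abstract does not name the test used, so that is an assumption. A minimal sketch using only the Python standard library and the counts quoted above:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].
    Sums the probabilities of all tables with the same margins that are
    no more likely than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def p_table(x):
        # hypergeometric probability of x "successes" in row 1
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p_table(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Readmitted vs. not, split by tip-apex distance:
#   TAD > 25 mm: 3 of 6 readmitted; TAD < 25 mm: 3 of 38 readmitted
p = fisher_exact_two_sided(3, 3, 3, 35)
print(round(p, 4))  # 0.0254, matching the reported p-value
```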

Keywords: hip fractures, hip reductions, readmission rates, open reduction internal fixation

Procedia PDF Downloads 132
26827 Hierarchical Clustering Algorithms in Data Mining

Authors: Z. Abdullah, A. R. Hamdan

Abstract:

Clustering is the process of grouping data objects into clusters so that objects in the same cluster are similar to each other. Clustering is one of the areas of data mining, and its algorithms can be classified into partitioning, hierarchical, density-based, and grid-based methods. In this paper, we survey and review four major hierarchical clustering algorithms: CURE, ROCK, CHAMELEON, and BIRCH. The obtained state of the art of these algorithms will help in eliminating their current problems, as well as in deriving more robust and scalable algorithms for clustering.
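As a minimal illustration of the agglomerative principle these algorithms share, the sketch below implements plain single-linkage merging on one-dimensional points; CURE, ROCK, CHAMELEON, and BIRCH each refine this basic merge loop with their own cluster representations and distance notions:

```python
def single_linkage(points, n_clusters):
    """Agglomerative clustering: repeatedly merge the two clusters whose
    closest members are nearest, until n_clusters remain."""
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i].extend(clusters.pop(j))
    return [sorted(c) for c in clusters]

print(single_linkage([1.0, 1.1, 5.0, 5.2, 9.9], 2))
# [[1.0, 1.1, 5.0, 5.2], [9.9]]
```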

Keywords: clustering, unsupervised learning, algorithms, hierarchical

Procedia PDF Downloads 863
26826 Enhancing the Rollability of Cu-Ge-Ni Alloy through Heat Treatment Methods

Authors: Morteza Hadi

Abstract:

This research investigates the potential enhancement of the rollability of Cu-Ge-Ni alloy through the mitigation of microstructural and compositional inhomogeneities via two distinct heat treatment methods: homogenization and solution treatment. To achieve this objective, the alloy with the desired composition was fabricated using a vacuum arc remelting furnace (VAR), followed by sample preparation for microstructural, compositional, and heat treatment analyses at varying temperatures and durations. Characterization was conducted employing optical and scanning electron microscopy (SEM), X-ray diffraction (XRD), and Vickers hardness testing. The results obtained indicate that a minimum duration of 10 hours is necessary for adequate homogenization of the alloy at 750°C. This heat treatment effectively removes coarse dendrites from the casting microstructure and significantly reduces elemental separations. However, despite these improvements, the presence of a second phase with markedly different hardness from the matrix results in poor rolling ability for the alloy. The optimal time for solution treatment at various temperatures was determined, with the most effective cycle identified as 750°C for 2 hours, followed by rapid quenching in water. This process induces the formation of a single-phase microstructure and complete elimination of the second  phase, as confirmed by X-ray diffraction analysis. Results demonstrate a reduction in hardness by 30 Vickers, and the elimination of microstructural unevenness enables successful thickness reduction by up to 50% through rolling without encountering cracking.

Keywords: Cu-Ge-Ni alloy, homogenization. solution treatment, rollability

Procedia PDF Downloads 34
26825 End to End Monitoring in Oracle Fusion Middleware for Data Verification

Authors: Syed Kashif Ali, Usman Javaid, Abdullah Chohan

Abstract:

In large enterprises, multiple departments use different sorts of information systems and databases according to their needs. These systems are independent and heterogeneous in nature, and sharing information/data between them is not an easy task. The usage of middleware technologies has made data sharing between systems much easier. However, monitoring the exchange of data/information for verification purposes between target and source systems is often complex or impossible for the maintenance department due to security/access privileges on the target and source systems. In this paper, we present our experience with an end-to-end data monitoring approach at the middleware level, implemented in Oracle BPEL for data verification without the help of any monitoring tool.
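The verification idea itself is middleware-agnostic. As a hypothetical sketch (not the Oracle BPEL implementation described here), one can digest each record on both sides of a transfer and compare the digests to flag missing or altered rows:

```python
import hashlib

def record_digest(record):
    """Stable digest of a record; field order is fixed by sorting keys."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode()).hexdigest()

def verify_transfer(source_rows, target_rows, key="id"):
    """Report records missing from or altered on the target system."""
    src = {r[key]: record_digest(r) for r in source_rows}
    tgt = {r[key]: record_digest(r) for r in target_rows}
    missing = sorted(set(src) - set(tgt))
    altered = sorted(k for k in src if k in tgt and src[k] != tgt[k])
    return {"missing": missing, "altered": altered}

source = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250},
          {"id": 3, "amount": 75}]
target = [{"id": 1, "amount": 100}, {"id": 2, "amount": 999}]
print(verify_transfer(source, target))
# {'missing': [3], 'altered': [2]}
```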

Keywords: service level agreement, SOA, BPEL, oracle fusion middleware, web service monitoring

Procedia PDF Downloads 462
26824 Sustainable Land Use Policy and Monitoring Urban Land Expansion in Kabul: A Case Study of Rapid Urbanization

Authors: Osama Hidayat, Yoshitaka Kajiat

Abstract:

Kabul is a city that is highly representative of Afghanistan’s rapid urbanization process. As the city rapidly expands, there are enormous challenges to the sustainable use of land resources. This paper evaluates land use change and urban spatial expansion in Kabul, the capital of Afghanistan, from 1950 to 2016, using satellite images, field observation, and socio-economic data. The discussion covers the reduction in rural-to-urban land conversion, the delineation of urban growth boundaries, arable land reclamation and the establishment of farmland protection areas, urban upgrading, and the investigation and prosecution of illegal construction. The paper considers aspects of urbanization and land management systems in Afghanistan, and outlines efficient frameworks for Kabul covering governmental self-restraint and policy modification. It concludes that Kabul’s sustainable land use practices can provide a reference for other cities in Afghanistan.

Keywords: urban land expansion, urbanization, land use policy, sustainable development

Procedia PDF Downloads 148
26823 Dissimilarity Measure for General Histogram Data and Its Application to Hierarchical Clustering

Authors: K. Umbleja, M. Ichino

Abstract:

Symbolic data mining has been developed to analyze data in very large datasets. It is also useful in cases when entry-specific details should remain hidden. Symbolic data mining is quickly gaining popularity as datasets in need of analysis become ever larger. One type of such symbolic data is the histogram, which packs huge amounts of information into a single variable with a high level of granularity. Other types of symbolic data can also be described as histograms, which makes the histogram a very important and general symbolic data type: a method developed for histograms can also be applied to other types of symbolic data. Due to its complex structure, analyzing histograms is complicated. This paper proposes a method for comparing two histogram-valued variables and thereby finding the dissimilarity between two histograms. The proposed method uses the Ichino-Yaguchi dissimilarity measure for mixed feature-type data analysis as a base and develops a dissimilarity measure specifically for histogram data, which makes it possible to compare histograms with different numbers of bins and bin widths (so-called general histograms). The proposed dissimilarity measure is then used as a measure for clustering. Furthermore, a linkage method based on weighted averages is proposed, with the concept of cluster compactness to measure the quality of clustering. The method is validated by application to real datasets. As a result, the proposed dissimilarity measure is found to produce adequate and comparable results with general histograms, without loss of detail or the need to transform the data.
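To illustrate the core difficulty, comparing general histograms requires mapping different bin grids onto a common one. The sketch below is a simplified stand-in, not the Ichino-Yaguchi-based measure proposed in the paper: it rebins both histograms onto the union of their bin edges (assuming counts are uniform within each bin) and takes an L1 distance between the normalized results:

```python
def rebin(edges, counts, common_edges):
    """Redistribute bin counts onto a common grid, assuming counts are
    spread uniformly inside each original bin."""
    out = [0.0] * (len(common_edges) - 1)
    for (lo, hi), c in zip(zip(edges, edges[1:]), counts):
        for j, (clo, chi) in enumerate(zip(common_edges, common_edges[1:])):
            overlap = max(0.0, min(hi, chi) - max(lo, clo))
            if hi > lo:
                out[j] += c * overlap / (hi - lo)
    return out

def histogram_dissimilarity(edges_a, counts_a, edges_b, counts_b):
    """L1 distance between two normalized histograms on a shared grid."""
    common = sorted(set(edges_a) | set(edges_b))
    a = rebin(edges_a, counts_a, common)
    b = rebin(edges_b, counts_b, common)
    ta, tb = sum(a), sum(b)
    return sum(abs(x / ta - y / tb) for x, y in zip(a, b))

# Two histograms over the same range but with different bin widths
d = histogram_dissimilarity([0, 5, 10], [4, 6],
                            [0, 2, 4, 6, 8, 10], [1, 1, 2, 3, 3])
print(round(d, 3))  # 0.28
```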

Keywords: dissimilarity measure, hierarchical clustering, histograms, symbolic data analysis

Procedia PDF Downloads 149
26822 A Smart CAD Program for Custom Hand Orthosis Generation Based on Anthropometric Relationships

Authors: Elissa D. Ledoux, Eric J. Barth

Abstract:

Producing custom orthotic devices is a time-consuming and iterative process. Efficiency could be increased with a smart CAD program to rapidly generate custom part files for 3D printing, reducing the need for a skilled orthosis technician as well as the hands-on time required. Anthropometric data for the hand was analyzed in order to determine dimensional relationships and reduce the number of measurements needed to parameterize the hand. Using these relationships, a smart CAD package was developed to produce custom sized hand orthosis parts downloadable for 3D printing. Results showed that the number of anatomical parameters required could be reduced from 8 to 3, and the relationships hold for 5th to 95th percentile male hands. CAD parts regenerate correctly for the same range. This package could significantly impact the orthotics industry in terms of expedited production and reduction of required human resources and patient contact.
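The idea of deriving a full dimension set from a reduced measurement set can be sketched as follows. The ratio coefficients and dimension names below are illustrative placeholders, not the anthropometric relationships fitted in the study:

```python
# Hypothetical sketch: derive remaining hand dimensions from three measured
# parameters via fixed linear ratios.  Coefficients are made up for
# illustration only.

def hand_parameters(hand_length_mm, hand_breadth_mm, wrist_circumference_mm):
    return {
        "palm_length_mm": 0.56 * hand_length_mm,
        "index_finger_length_mm": 0.40 * hand_length_mm,
        "thumb_length_mm": 0.33 * hand_length_mm,
        "palm_thickness_mm": 0.12 * wrist_circumference_mm,
        "knuckle_span_mm": 1.05 * hand_breadth_mm,
    }

dims = hand_parameters(190.0, 88.0, 170.0)
print(dims["palm_length_mm"])
```

Each derived value could then be fed into the parametric CAD model as a driving dimension, so that regenerating the part requires only the three measurements.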

Keywords: CAD, hand, orthosis, orthotic, rehabilitation robotics, upper limb

Procedia PDF Downloads 206
26821 Urban Heat Island Intensity Assessment through Comparative Study on Land Surface Temperature and Normalized Difference Vegetation Index: A Case Study of Chittagong, Bangladesh

Authors: Tausif A. Ishtiaque, Zarrin T. Tasin, Kazi S. Akter

Abstract:

The current trend of urban expansion, especially in developing countries, has caused significant changes in land cover, generating great concern due to widespread environmental degradation. Energy consumption of cities is also increasing with the aggravated heat island effect. The distribution of land surface temperature (LST) is one of the most significant climatic parameters affected by urban land cover change. The recent increasing trend of LST is elevating the temperature profile of built-up areas with less vegetative cover. Gradual change in land cover, especially the decrease in vegetative cover, is enhancing the urban heat island (UHI) effect in developing cities around the world. An increase in the amount of urban vegetation cover can be a useful solution for the reduction of UHI intensity. LST and the Normalized Difference Vegetation Index (NDVI) have been widely accepted as reliable indicators of UHI and vegetation abundance, respectively. Chittagong, the second largest city of Bangladesh, has been a growth center due to rapid urbanization over the last several decades. This study assesses the intensity of UHI in Chittagong city by analyzing the relationship between LST and NDVI based on the type of land use/land cover (LULC) in the study area, applying an integrated approach of Geographic Information Systems (GIS), remote sensing (RS), and regression analysis. A land cover map is prepared through an interactive supervised classification using remotely sensed data from a Landsat ETM+ image, along with NDVI differencing using ArcGIS. LST and NDVI values are extracted from the same image. The regression analysis between LST and NDVI indicates that within the study area, UHI intensity is directly correlated with LST and negatively correlated with NDVI. This implies that surface temperature, and with it UHI intensity, falls as vegetation cover increases.
Moreover, there are noticeable differences in the relationship between LST and NDVI based on the type of LULC. In other words, depending on the type of land usage, an increase in vegetation cover has a varying impact on UHI intensity. This analysis will contribute to the formulation of sustainable urban land use planning decisions, as well as suggest suitable actions for the mitigation of UHI intensity within the study area.
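The LST-NDVI relationship can be quantified with an ordinary least-squares fit; the sketch below uses synthetic values (not the study's data) to show how a negative slope expresses the inverse relationship:

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

ndvi = [0.10, 0.25, 0.40, 0.55, 0.70]   # vegetation abundance (synthetic)
lst = [41.2, 38.5, 35.9, 33.1, 30.6]    # surface temperature, deg C (synthetic)

slope, intercept = linear_fit(ndvi, lst)
print(round(slope, 2))  # negative: more vegetation, lower LST
```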

Keywords: land cover change, land surface temperature, normalized difference vegetation index, urban heat island

Procedia PDF Downloads 262
26818 Investigating Anti-Bacterial and Anti-COVID-19 Virus Properties and Mode of Action of Mg(OH)₂ and Copper-Infused Mg(OH)₂ Nanoparticles on Coated Polypropylene Surfaces

Authors: Saleh Alkarri, Melinda Frame, Dimple Sharma, John Cairney, Lee Maddan, Jin H. Kim, Jonathan O. Rayner, Teresa M. Bergholz, Muhammad Rabnawaz

Abstract:

Reported herein is an investigation of the anti-bacterial and anti-viral properties, and the mode of action, of Mg(OH)₂ and copper-infused Mg(OH)₂ nanoplatelets (NPs) on melt-compounded and thermally embossed polypropylene (PP) surfaces. The anti-viral activity of the NPs was studied in aqueous liquid suspensions against SARS-CoV-2, and the mode of action was investigated on neat NPs and on PP samples that were thermally embossed with NPs. Anti-bacterial studies of melt-compounded NPs in PP confirmed approximately a 1 log reduction of E. coli populations in 24 h, while for thermally embossed NPs, an 8 log reduction of E. coli populations was observed. In addition, the NPs exhibit anti-viral activity against SARS-CoV-2. Fluorescence microscopy revealed that reactive oxygen species (ROS) are the main mode of action through which Mg(OH)₂ and Cu-infused Mg(OH)₂ act against microbes. Plastics with anti-microbial surfaces from which biocides are non-leachable are highly desirable. This work provides a general fabrication strategy for developing anti-microbial plastic surfaces.
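The log-reduction figures quoted above follow the standard definition: the base-10 logarithm of the ratio of viable counts before and after treatment. A quick check of what an 8 log reduction means in cell counts:

```python
from math import log10

def log_reduction(initial_cfu, final_cfu):
    """Log10 reduction in viable cell count after treatment."""
    return log10(initial_cfu / final_cfu)

# An 8 log reduction corresponds to one surviving cell in 10**8
print(log_reduction(1e8, 1.0))  # 8.0
```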

Keywords: anti-microbial activity, E. coli K-12 MG1655, anti-viral activity, SARS-CoV-2, copper-infused magnesium hydroxide, non-leachable, ROS, compounding, surface embossing, dyes

Procedia PDF Downloads 53
26819 WiFi Data Offloading: Bundling Method in a Canvas Business Model

Authors: Majid Mokhtarnia, Alireza Amini

Abstract:

Mobile operators deal with increasing data traffic as a critical issue. As a result, a vital responsibility of the operators is to cope with this trend in order to create added value. This paper addresses a bundling method in a Canvas business model for a WiFi Data Offloading (WDO) strategy, by which some elements of the model may be affected. In the proposed method, a number of data packages are offered to subscribers, some of which include a complimentary volume of WiFi-offloaded data. The paper at hand analyses this method from the viewpoints of attractiveness and profitability. The results demonstrate that the quality of implementation of the WDO strongly affects the final result and helps the decision maker make the best choice.

Keywords: bundling, canvas business model, telecommunication, WiFi data offloading

Procedia PDF Downloads 179
26818 Performance Evaluation of Production Schedules Based on Process Mining

Authors: Kwan Hee Han

Abstract:

The external environment of the enterprise is rapidly changing, driven mainly by global competition, cost reduction pressures, and new technology. In these situations, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it must efficiently utilize resource capacity under careful consideration of many interacting constraints. At present, many computerized software solutions are utilized in enterprises to generate a realistic production schedule and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable the useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of a production scheduling software system using process mining techniques, since every software system generates event logs for further uses such as security investigation, auditing, and debugging. An application of the process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems.
By using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the workload balance of each machine over time are measured, and finally, the goodness of the production schedule is evaluated. By using the proposed process mining approach, the quality of production schedules in manufacturing enterprises can be improved.
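One of the criteria above, workstation utilization, can be computed directly from an event log; the log format here is a hypothetical illustration, not that of any particular scheduling system:

```python
def utilization(events, horizon):
    """Fraction of the scheduling horizon each workstation is busy,
    given events with start/end timestamps."""
    busy = {}
    for e in events:
        busy[e["station"]] = busy.get(e["station"], 0) + (e["end"] - e["start"])
    return {s: t / horizon for s, t in busy.items()}

log = [
    {"station": "M1", "start": 0, "end": 40},
    {"station": "M1", "start": 50, "end": 80},
    {"station": "M2", "start": 0, "end": 20},
]
print(utilization(log, horizon=100))
# {'M1': 0.7, 'M2': 0.2}
```

A lightly utilized station alongside a near-saturated one is exactly the kind of bottleneck signal the evaluation is after.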

Keywords: data mining, event log, process mining, production scheduling

Procedia PDF Downloads 264
26817 Environment-Friendly Biogas Technology: Comparative Analysis of Benefits as Perceived by Biogas Users and Non-User Livestock Farmers of Tehsil Jhang

Authors: Anees Raza, Liu Chunyan

Abstract:

Renewable energy technologies are the need of the time and are already making a big impact on the climatic outlook of the world. Biogas technology is one of those, and it has many benefits for its users. It is cost-effective because it is produced from raw material that is available free of cost to livestock farmers. Bio-slurry, a by-product of biogas, is used as fertilizer for crop production, increasing soil fertility. There are many other household benefits of the technology. This paper discusses the benefits of biogas as perceived by biogas users as well as non-users in Tehsil Jhang. Data were collected from 60 respondents (30 users and 30 non-users), selected purposively, through a validated and pre-tested interview schedule. Collected data were analyzed using the Statistical Package for the Social Sciences (SPSS). For household benefits like ‘makes cooking easy,’ ‘less breathing issues for women working in kitchens,’ and ‘use of bio-slurry as organic fertilizer,’ the differences between users and non-users were highly significant, with t-values of 3.24, 4.39, and 2.80 respectively. Responses about the environmental benefits of biogas technology showed a significant difference for ‘less air pollution,’ while ‘less temperature rise than from the burning of wood/dung’ showed a non-significant difference. It was clear from the research that biogas users are becoming influential in convincing non-users to adopt this technology due to its noticeable benefits. In the research area, where people depend on wood as fire fuel, the technology could help reduce the cutting of trees, which will help control deforestation and protect the environment. People should be encouraged to use biogas technology through subsidies and low-markup loans.

Keywords: biogas technology, deforestation, environmental benefits, renewable energy

Procedia PDF Downloads 241
26816 Juvenile Justice in Maryland: The Evidence Based Approach to Youth with History of Victimization and Trauma

Authors: Gabriela Wasileski, Debra L. Stanley

Abstract:

Maryland’s efforts to decrease juvenile criminality and recidivism are shifting towards evidence-based sentencing. While, in theory, evidence-based sentencing has an impact on the reduction of juvenile delinquency and drug abuse, the assessment of juveniles’ risks and needs usually lacks crucial information about the juvenile’s prior victimization. The Maryland Comprehensive Assessment and Service Planning (MCASP) Initiative is the primary tool for developing and delivering a treatment service plan for juveniles at risk. Even though it consists of evidence-based screening and assessment instruments, very little is currently known regarding the effectiveness and impact of the assessment in general. In keeping with Maryland’s priority to develop successful evidence-based recidivism reduction programs, this study examined assessment results based on MCASP using a representative sample of juveniles at risk. Specifically, it examined: (1) the results of the assessments in an electronic database, (2) areas of need that are more frequent among delinquent youth in a system/agency, (3) the overall progress of youth in an agency’s care, and (4) the impact of child victimization and trauma experiences reported in the assessment. The project will identify challenges regarding the use of MCASP in Maryland and will provide a knowledge base to support future research and practices.

Keywords: Juvenile Justice, assessment of risk and need, victimization and crime, recidivism

Procedia PDF Downloads 301
26815 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks

Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam

Abstract:

In recent years, convolutional neural networks (CNN) have demonstrated high performance in image analysis, but oftentimes, there is only structured data available regarding a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. In applying a single neural network for analyzing multimodal data, e.g., both structured and unstructured information, significant advantages in terms of time complexity and energy efficiency can be achieved. Converting structured data into images and merging them with existing visual material offers a promising solution for applying CNN in multimodal datasets, as they often occur in a medical context. By employing suitable preprocessing techniques, structured data is transformed into image representations, where the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with existing images to incorporate both types of information. This final image is finally analyzed using a CNN.
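The transformation step can be sketched minimally: min-max normalize each feature and lay the values out on a fixed pixel grid. Real feature-to-pixel assignments (e.g., placing correlated features near each other) are more elaborate; this layout is an illustrative assumption:

```python
def row_to_image(row, mins, maxs, side):
    """Turn one tabular record into a side x side grayscale 'image':
    each feature becomes one pixel intensity in [0, 1], padded with zeros."""
    pixels = [(v - lo) / (hi - lo) if hi > lo else 0.0
              for v, lo, hi in zip(row, mins, maxs)]
    pixels += [0.0] * (side * side - len(pixels))   # pad to a square grid
    return [pixels[i * side:(i + 1) * side] for i in range(side)]

img = row_to_image([5.0, 10.0, 1.0], mins=[0, 0, 0], maxs=[10, 10, 2], side=2)
print(img)  # [[0.5, 1.0], [0.5, 0.0]]
```

The resulting grid can then be stacked with, or placed alongside, the existing visual material before being passed to the CNN.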

Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion

Procedia PDF Downloads 102
26814 Fault Tree Analysis (FTA) of CNC Turning Center

Authors: R. B. Patil, B. S. Kothavale, L. Y. Waghmode

Abstract:

Today, the CNC turning center has become an important machine tool for the manufacturing industry worldwide. However, the breakdown of a single CNC turning center may result in the production of an entire plant being halted. For this reason, downtime for repairs and preventive maintenance has to be minimized to ensure availability of the system. Indeed, improving the availability of the CNC turning center as a whole objectively leads to a substantial reduction in production loss and in operating, maintenance, and support costs. In this paper, the fault tree analysis (FTA) method is used for reliability analysis of a CNC turning center. The major faults associated with the system and the causes of the faults are presented graphically. Boolean algebra is used for evaluating the fault tree (FT) diagram and for deriving the governing reliability model of the CNC turning center. Failure data over a period of six years has been collected and used for evaluating the model. Qualitative and quantitative analysis is also carried out to identify critical sub-systems and components of the CNC turning center. It is found that, at the end of the warranty period (one year), the reliability of the CNC turning center as a whole is around 0.61628.
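The Boolean evaluation of a fault tree reduces to propagating failure probabilities through OR and AND gates under an independence assumption; the gate structure and the numbers below are illustrative, not the paper's model:

```python
def or_gate(*p):
    """OR gate: the output fails if any input fails (independent events)."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):
    """AND gate: the output fails only if every input fails."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Hypothetical top event: spindle failure OR both redundant coolant
# pumps failing (failure probabilities over some mission time)
p_top = or_gate(0.05, and_gate(0.10, 0.10))
print(round(1.0 - p_top, 4))  # system reliability, here 0.9405
```

Evaluating the real tree bottom-up with the six years of failure data is what yields the system-level reliability figure quoted in the abstract.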

Keywords: fault tree analysis (FTA), reliability analysis, risk assessment, hazard analysis

Procedia PDF Downloads 391