Search results for: low data rate
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30346

28636 Hydro-Chemical Characterization of Glacial Melt Waters Draining from Shaune Garang Glacier, Himachal Himalaya

Authors: Ramesh Kumar, Rajesh Kumar, Shaktiman Singh, Atar Singh, Anshuman Bhardwaj, Ravindra Kumar Sinha, Anupma Kumari

Abstract:

A detailed study of the ion chemistry of the Shaune Garang glacier meltwater was carried out to assess the role of an active glacier in the chemical denudation rate. The chemical compositions of various ions in the meltwater were analyzed during the melting periods of 2015 and 2016. In total, 112 meltwater samples were collected, twice a day, during the ablation seasons of 2015 and 2016. To identify the factors controlling the dissolved ionic strength of the Shaune Garang glacier meltwater, statistical analyses such as the correlation matrix, Principal Component Analysis (PCA), and factor analysis were applied. Cation concentrations in the meltwater for both years follow the order Ca²⁺ > Mg²⁺ > Na⁺ > K⁺. The study showed that Ca²⁺ and HCO₃⁻ were dominant in both melting periods. Carbonate weathering was identified as the dominant process controlling the dissolved ion chemistry of the meltwater, owing to the high ratios of (Ca²⁺ + Mg²⁺) to TZ⁺ and of (Ca²⁺ + Mg²⁺) to (Na⁺ + K⁺) in the study area. The cation denudation rate of the Shaune Garang catchment is 3412.2 m⁻² a⁻¹, higher than that of other glacierised catchments in the Himalaya, indicating intense chemical erosion in this catchment.
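The cation ordering and the weathering-diagnostic ratios described above can be checked with a short calculation. The concentrations below are purely illustrative values (in µeq/L), not the study's measurements:

```python
# Hypothetical meltwater cation concentrations (micro-equivalents per litre);
# illustrative values only, not the study's measured data.
sample = {"Ca": 520.0, "Mg": 210.0, "Na": 95.0, "K": 40.0}

# Total cationic charge TZ+ is the sum of the four major cations.
tz_plus = sum(sample.values())

# Ratios used to diagnose carbonate weathering.
ca_mg_over_tz = (sample["Ca"] + sample["Mg"]) / tz_plus
ca_mg_over_na_k = (sample["Ca"] + sample["Mg"]) / (sample["Na"] + sample["K"])

# Dominance order of cations by concentration.
order = sorted(sample, key=sample.get, reverse=True)

print(order)                      # ['Ca', 'Mg', 'Na', 'K']
print(round(ca_mg_over_tz, 2))    # 0.84
print(round(ca_mg_over_na_k, 2))  # 5.41
```

A (Ca²⁺ + Mg²⁺)/TZ⁺ ratio near 1, as here, is what the abstract takes as the signature of carbonate weathering.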

Keywords: Shaune Garang glacier, Hydrochemistry, chemical composition, cation denudation rate, carbonate weathering

Procedia PDF Downloads 371
28635 Prediction of Remaining Life of Industrial Cutting Tools with Deep Learning-Assisted Image Processing Techniques

Authors: Gizem Eser Erdek

Abstract:

This study investigates the prediction of the remaining life of industrial cutting tools used in production processes with deep learning methods. As the life of a cutting tool decreases, it damages the raw material it is processing. The study aims to predict the remaining life of the cutting tool based on the damage it causes to the raw material. For this purpose, hole photos were collected from a hole-drilling machine for 8 months and labeled in 5 classes according to hole quality, transforming the problem into a classification problem. Using the prepared data set, a model was created with convolutional neural networks, a deep learning method. In addition, the VGGNet and ResNet architectures, which have been successful in the literature, were tested on the data set, and a hybrid model combining convolutional neural networks and support vector machines was used for comparison. When all models were compared, the convolutional neural network model gave the most successful results, with a 74% accuracy rate. In preliminary studies, when the data set was restricted to only the best and worst classes, the binary classification model achieved ~93% accuracy. The results showed that the remaining life of cutting tools can be predicted by deep learning methods based on the damage to the raw material, and the experiments demonstrated that deep learning methods can serve as an alternative for cutting tool life estimation.
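The effect of collapsing the five hole-quality classes into a binary best/worst problem, as in the preliminary study, can be sketched with a toy accuracy calculation. The labels below are hypothetical, not the study's data:

```python
# Hypothetical hole-quality labels: 1 (best) .. 5 (worst), with model predictions.
y_true = [1, 2, 3, 5, 4, 1, 5, 2, 3, 4]
y_pred = [1, 3, 3, 5, 4, 2, 5, 2, 2, 4]

def accuracy(truth, pred):
    return sum(t == p for t, p in zip(truth, pred)) / len(truth)

# Five-class accuracy over all samples.
acc5 = accuracy(y_true, y_pred)

# Binary setup from the preliminary study: keep only best (1) and worst (5) holes.
pairs = [(t, p) for t, p in zip(y_true, y_pred) if t in (1, 5)]
acc2 = accuracy([t for t, _ in pairs], [p for _, p in pairs])

print(acc5)  # 0.7
print(acc2)  # 0.75
```

Restricting to the extreme classes removes the hard-to-separate middle grades, which is consistent with the jump from ~74% to ~93% accuracy reported above.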

Keywords: classification, convolutional neural network, deep learning, remaining life of industrial cutting tools, ResNet, support vector machine, VGGNet

Procedia PDF Downloads 74
28634 AI-Driven Solutions for Optimizing Master Data Management

Authors: Srinivas Vangari

Abstract:

In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is crucial for data-driven enterprises, and Master Data Management (MDM) plays a central role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI’s predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.
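As one hedged illustration of the kind of duplicate-record detection credited above to AI-driven MDM (the paper does not specify its algorithms), here is a minimal fuzzy-matching sketch using Python's standard-library difflib; the records and the 0.8 threshold are assumptions for illustration:

```python
from difflib import SequenceMatcher

# Hypothetical customer master records from two source systems.
records = [
    "ACME Corporation, 12 Main St, Springfield",
    "Acme Corp., 12 Main Street, Springfield",
    "Globex Inc., 99 Oak Ave, Shelbyville",
]

def similarity(a, b):
    # Similarity ratio in [0, 1]; case-folded to ignore capitalisation.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Flag record pairs above a similarity threshold as duplicate candidates.
THRESHOLD = 0.8
duplicates = [
    (i, j)
    for i in range(len(records))
    for j in range(i + 1, len(records))
    if similarity(records[i], records[j]) >= THRESHOLD
]
print(duplicates)  # [(0, 1)]
```

Real MDM deduplication would add blocking, field-level matching, and learned similarity models; this only shows the pairwise-candidate idea.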

Keywords: artificial intelligence, master data management, data governance, data quality

Procedia PDF Downloads 9
28633 Autologous Blood for Conjunctival Autograft Fixation in Primary Pterygium Surgery: A Systematic Review and Meta-Analysis

Authors: Mohamed Abdelmongy

Abstract:

Full author list: Hossam Zein, Ammar Ismail, Mohamed Abdelmongy, Sherif Elsherif, Ahmad Hassanen, Basma Muhammad, Fathy Assaf, Ahmed Elsehili, Ahmed Negida, Shin Yamane, Mohamed M. Abdel-Daim, and Kazuaki Kadonosono (https://www.ncbi.nlm.nih.gov/pubmed/30277146). BACKGROUND: Pterygium is a benign ocular lesion characterized by triangular fibrovascular growth of conjunctival tissue over the cornea. Patients complain of the poor cosmetic appearance, ocular surface irritation, and decreased visual acuity if the pterygium is large enough to cause astigmatism or encroach on the pupil. The definitive treatment of pterygium is surgical removal; however, outcomes are compromised by recurrence. The aim of the current study is to systematically review the literature to explore the efficacy and safety of fibrin glue, sutures, and autologous blood coagulum for conjunctival autograft fixation in primary pterygium surgery. OBJECTIVES: To assess the effectiveness of fibrin glue compared with sutures and autologous blood coagulum in conjunctival autografting for the surgical treatment of pterygium. METHODS: In preparing this manuscript, we followed the steps illustrated in the Cochrane Handbook for Systematic Reviews of Interventions version 5.3 and reported it according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement guidelines. We searched PubMed, Ovid (both through Medline), ISI Web of Science, and the Cochrane Central Register of Controlled Trials (CENTRAL) through January 2017, using the keywords "Pterygium AND (blood OR glue OR suture)". SELECTION CRITERIA: We included all randomized controlled trials (RCTs) that met the following criteria: 1) comparing autologous blood vs fibrin glue for conjunctival autograft fixation in primary pterygium surgery; 2) comparing autologous blood vs sutures for conjunctival autograft fixation in primary pterygium surgery. DATA COLLECTION AND ANALYSIS: Two review authors independently screened the search results, assessed trial quality, and extracted data using standard methodological procedures expected by Cochrane. The extracted data included A) study design, sample size, and main findings; B) baseline characteristics of the included patients, including age, sex, pterygium site and grade, and graft size; and C) study outcomes, comprising 1) the primary outcome, recurrence rate, and 2) secondary outcomes: graft stability (graft retraction, graft displacement), operation time (min), and postoperative symptoms (pain, discomfort, foreign body sensation, tearing). MAIN RESULTS: We included 7 RCTs covering 662 eyes (blood: 293; glue: 198; sutures: 171) and assessed the primary and secondary outcomes listed above. CONCLUSIONS: Autologous blood for conjunctival autograft fixation in pterygium surgery is associated with lower graft stability than fibrin glue or sutures. It was not inferior to fibrin glue or sutures regarding recurrence rate. The overall quality of evidence is low. Further well-designed RCTs are needed to fully explore the efficacy of this new technique.
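A minimal sketch of the inverse-variance (fixed-effect) pooling step used in meta-analyses of this kind; the per-study recurrence counts below are hypothetical and are not the review's extracted data:

```python
import math

# Hypothetical per-study recurrence data (events, total) for two arms;
# illustrative only, not the review's extracted data.
studies = [
    {"blood": (4, 50), "glue": (3, 48)},
    {"blood": (6, 60), "glue": (5, 55)},
    {"blood": (2, 40), "glue": (2, 38)},
]

weights_sum = 0.0
weighted_log_rr = 0.0
for s in studies:
    e1, n1 = s["blood"]
    e2, n2 = s["glue"]
    log_rr = math.log((e1 / n1) / (e2 / n2))
    # Approximate variance of the log risk ratio for count data.
    var = 1 / e1 - 1 / n1 + 1 / e2 - 1 / n2
    w = 1 / var  # inverse-variance weight
    weights_sum += w
    weighted_log_rr += w * log_rr

pooled_rr = math.exp(weighted_log_rr / weights_sum)
se = math.sqrt(1 / weights_sum)
ci = (pooled_rr * math.exp(-1.96 * se), pooled_rr * math.exp(1.96 * se))
print(round(pooled_rr, 2))
```

A pooled risk ratio whose confidence interval spans 1, as with these toy numbers, corresponds to the review's "not inferior regarding recurrence rate" finding.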

Keywords: pterygium, autograft, ophthalmology, cornea

Procedia PDF Downloads 157
28632 Preliminary Study of Desiccant Cooling System under Algerian Climates

Authors: N. Hatraf, N. Moummi

Abstract:

The interest in air conditioning using renewable energies is increasing. The thermal energy produced from solar energy can be converted into useful cooling and heating through thermochemical or thermophysical processes by using thermally activated energy conversion systems. The ambient air contains so much water that very high dehumidification rates are required. For continuous dehumidification of the process air, the water adsorbed on the desiccant material has to be removed, which is done by allowing hot air to flow through the desiccant material (regeneration). A solid desiccant cooling system transfers moisture from the inlet air to the silica gel through two processes: the adsorption process and the regeneration process. The main aim of this paper is to study how the dehumidification rate, the regeneration temperature, and many other factors influence the efficiency of a solid desiccant system by using TRNSYS software. The results show that the desiccant system could be used to decrease the humidity rate of the entering air.

Keywords: dehumidification, efficiency, humidity, Trnsys

Procedia PDF Downloads 435
28631 Genetic Data of Deceased People: Solving the Gordian Knot

Authors: Inigo de Miguel Beriain

Abstract:

Genetic data of deceased persons are of great interest for both biomedical research and clinical use. This is due to several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. Therefore, it would be possible to considerably improve our response to these pathologies if we could use these data. Unfortunately, at present, the status of data on the deceased is far from being satisfactorily resolved by EU data protection regulation. Indeed, the General Data Protection Regulation has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework on this issue, and consequently each EU member state offers very different solutions. For instance, Denmark treats the data as personal data of the deceased person for a set period of time, while others, such as Spain, do not consider the data as such but have introduced specific regulations on this type of data and its access by relatives. This is an extremely dysfunctional scenario from multiple angles, not least scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that, in reality, health data are, in a sense, a rara avis within data in general because they do not refer to one person but to several. Hence, it is possible to think that all of them can be considered data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain personal data of his or her biological relatives, and the general regime provided for in the GDPR may apply to them.
As these are personal data, we could go back to thinking in terms of a general prohibition of data processing, with the exceptions provided for in Article 9.2 and the legal bases included in Article 6. This may be complicated in practice: since we are dealing with data that refer to several data subjects, relying on some of these bases, such as consent, may be complex. Furthermore, there are theoretical arguments that may oppose this hypothesis. In this contribution, it is shown, however, that none of these objections is of sufficient substance to delegitimize the argument presented. Therefore, the conclusion of this contribution is that we can indeed build a general framework on the processing of personal data of deceased persons in the context of the GDPR. This would constitute a considerable improvement over the current regulatory framework, although some clarifications will be necessary for its practical application.

Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people

Procedia PDF Downloads 152
28630 Electron Beam Effects on Kinetic Alfven Waves in the Cold Homogenous Plasma

Authors: Jaya Shrivastava

Abstract:

The particle aspect approach is adopted to investigate the trajectories of charged particles in the electromagnetic field of the kinetic Alfven wave. Expressions are found for the dispersion relation, growth/damping rate, and associated currents in the presence of an electron beam in homogeneous plasma. Kinetic effects of both electrons and ions are included in the study of the kinetic Alfven wave because both are important in the transition region. Plasma parameters appropriate to the plasma sheet boundary layer are used. It is found that a downward electron beam affects the dispersion relation, growth/damping rate, and associated currents in the cold electron limit.

Keywords: magnetospheric physics, plasma waves and instabilities, electron beam, space plasma physics, wave-particle interactions

Procedia PDF Downloads 388
28629 Influence of Channel Depth on the Performance of Wavy Fin Absorber Solar Air Heater

Authors: Abhishek Priyam, Prabha Chand

Abstract:

Channel depth is an important design parameter to be fixed in designing a solar air heater. In this paper, a mathematical model has been developed to study the influence of channel depth on the thermal performance of solar air heaters. The channel depth has been varied from 1.5 cm to 3.5 cm for the mass flow range 0.01 to 0.11 kg/s. Based on the first law of thermodynamics, a channel depth of 1.5 cm shows the best thermal performance over the entire mass flow range. Better thermohydraulic performance has been found up to 0.05 kg/s; beyond this, the thermohydraulic efficiency starts decreasing. It has been seen that, as the mass flow rate increases, the difference between thermal and thermohydraulic efficiency grows because of the increase in pressure drop. At the lowest mass flow rate, 0.01 kg/s, the thermal and thermohydraulic efficiencies for each channel depth are essentially the same.

Keywords: channel depth, thermal efficiency, wavy fin, thermohydraulic efficiency

Procedia PDF Downloads 370
28628 The External Debt in the Context of Economic Growth: The Sample of Turkey

Authors: Ayşen Edirneligil, Mehmet Mucuk

Abstract:

In developing countries, one of the most important constraints on economic growth is the lack of national savings to finance investment. To overcome this constraint and achieve a higher rate of economic growth by increasing the level of output, countries resort to external borrowing. However, there is a dispute in the literature over the relationship between external debt and economic growth. The aim of this study is to examine the effects of external debt on Turkish economic growth by using VAR analysis with quarterly data over the period 2002:01-2014:04. In this respect, the Johansen cointegration test, impulse-response functions, and variance decomposition are used for the analyses. Empirical findings show that there is no cointegration in the long run.

Keywords: external debt, economic growth, Turkish economy, time series analysis

Procedia PDF Downloads 395
28627 Aliasing Free and Additive Error in Spectra for Alpha Stable Signals

Authors: R. Sabre

Abstract:

This work focuses on the symmetric alpha-stable process with continuous time, frequently used to model signals with indefinitely growing variance, often observed with an unknown additive error. The objective of this paper is to estimate this error from discrete observations of the signal. To that end, we propose a method based on smoothing the observations via a Jackson polynomial kernel, taking into account the width of the interval where the spectral density is non-zero. This technique avoids the aliasing phenomenon encountered when the estimation is made from discrete observations of a continuous-time process. We have studied the convergence rate of the estimator and shown that it improves when the spectral density is zero at the origin. Thus, we construct an estimator of the additive error that can be subtracted to approach the original signal without error.
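The aliasing phenomenon that the method is designed to avoid can be shown in a few lines: a sinusoid above the Nyquist frequency is indistinguishable, at the sampling instants, from a lower-frequency one. The frequencies below are chosen purely for illustration:

```python
import math

fs = 10.0                   # sampling rate (Hz); Nyquist frequency is fs/2 = 5 Hz
f_true = 9.0                # signal frequency above Nyquist
f_alias = abs(f_true - fs)  # 9 Hz aliases to 1 Hz under 10 Hz sampling

# Samples of the two sinusoids coincide at every sampling instant.
n = range(20)
x_true = [math.cos(2 * math.pi * f_true * k / fs) for k in n]
x_alias = [math.cos(2 * math.pi * f_alias * k / fs) for k in n]

max_diff = max(abs(a - b) for a, b in zip(x_true, x_alias))
print(max_diff < 1e-9)  # True
```

This is why estimation from discrete observations of a continuous-time process needs care about the support of the spectral density, as the abstract notes.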

Keywords: spectral density, stable processes, aliasing, non parametric

Procedia PDF Downloads 126
28626 Uranium Adsorption Using a Composite Material Based on Platelet SBA-15 Supported Tin Salt Tungstomolybdophosphoric Acid

Authors: H. Aghayan, F. A. Hashemi, R. Yavari, S. Zolghadri

Abstract:

In this work, a new composite adsorbent based on mesoporous silica SBA-15 with platelet morphology and a tin salt of tungstomolybdophosphoric acid (TWMP) was synthesized and applied for uranium adsorption from aqueous solution. The sample was characterized by X-ray diffraction, Fourier transform infrared spectroscopy, and N2 adsorption-desorption analysis, and the effect of various parameters, such as metal ion concentration and contact time, on the adsorption behavior was examined. The experimental results showed that the adsorption process is well described by the Langmuir isotherm model and that the predominant mechanism is physisorption. Kinetic data suggest that the adsorption process can be described by the pseudo-second-order rate model.
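A small sketch of the two models named in the abstract, the Langmuir isotherm and the pseudo-second-order rate law; the parameter values are illustrative assumptions, not the study's fitted constants:

```python
# Langmuir isotherm: q = q_max * b * C / (1 + b * C)
# Hypothetical parameters, for illustration only (not the study's fitted values).
q_max = 120.0   # maximum adsorption capacity (mg/g)
b = 0.05        # Langmuir constant (L/mg)

def langmuir(c):
    """Adsorbed amount q (mg/g) at equilibrium concentration c (mg/L)."""
    return q_max * b * c / (1 + b * c)

# Pseudo-second-order kinetics in linearized form: t/q_t = 1/(k2*qe^2) + t/qe,
# which is linear in t; k2 and qe are again illustrative values.
k2, qe = 0.002, 95.0

def t_over_qt(t):
    return 1 / (k2 * qe ** 2) + t / qe

print(round(langmuir(40.0), 1))  # 80.0
```

Fitting experimental (C, q) pairs to the first form and (t, t/q_t) pairs to the second straight line is the standard way the abstract's isotherm and kinetic conclusions are reached.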

Keywords: platelet SBA-15, tungstomolybdophosphoric acid, adsorption, uranium ion

Procedia PDF Downloads 183
28625 Dispositional Loneliness and Mental Health of the Elderly in Cross River State, Nigeria

Authors: Peter Unoh Bassey

Abstract:

The study is motivated by the rising rate of dispositional loneliness experienced by the elderly in society today as a result of the breakdown of family attachment patterns, the loss of close associates, and interpersonal conflicts. The research adopted an ex-post facto design through a survey of 500 elderly people, comprising both retirees and community-based elders. Stratified and simple random sampling techniques were used to select the sample. Based on the findings, it was recommended that the elderly be trained in specific attachment styles and in appropriate social skills to counter loneliness.

Keywords: dispositional loneliness, mental health, elderly, cross river state

Procedia PDF Downloads 151
28624 Sustainable Water Supply: Rainwater Harvesting as Flood Reduction Measures in Ibadan, Nigeria

Authors: Omolara Lade, David Oloke

Abstract:

Ibadan City suffers serious water supply problems; dry taps are common in virtually every part of the city. The scarcity of piped water has made communities find alternative water sources, groundwater being a ready source. These wells are prone to pollution due to the proximity of septic tanks to wells and the disposal of solid or liquid wastes in pits, abandoned boreholes, stream channels, and landfills. Storms and floods in Ibadan have increased, with devastating effects claiming over 120 lives and displacing 600 people in August 2011 alone. In this study, an analysis of the water demand and sources of supply for the city was carried out through a questionnaire survey and collection of data from the city's main water supplier, the Water Corporation of Oyo State (WCOS); groundwater sources were explored, and 30 years of rainfall data were collected from the meteorological station in Ibadan. 1,067 questionnaires were administered at the household level, with a response rate of 86.7%. A descriptive analysis of the survey revealed that 77.1% of respondents did not receive water at all from WCOS, while 83.8% depend on groundwater sources. Analysis of data from WCOS revealed that the main water supply is inadequate, as less than 10% of the population's water demand was met. Rainfall intensity is highest in June, with a mean value of 188 mm, which can be harvested at the community level and used to complement the population's water demand. Rainwater harvesting, if planned and managed properly, will become a valuable alternative source for managing urban floods and alleviating water scarcity in the city.
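The harvestable volume implied by the 188 mm mean June rainfall can be sketched with the usual yield formula V = A × R × C; the roof area and runoff coefficient below are assumptions for illustration, not figures from the study:

```python
# Monthly harvestable volume: V = roof_area * rainfall_depth * runoff_coefficient.
# The 188 mm is the mean June rainfall reported in the abstract; the roof area
# and runoff coefficient are illustrative assumptions.
roof_area_m2 = 100.0   # assumed roof catchment area (m^2)
rainfall_m = 0.188     # 188 mm mean June rainfall, in metres
runoff_coeff = 0.8     # typical value assumed for hard roofing

volume_m3 = roof_area_m2 * rainfall_m * runoff_coeff
volume_litres = volume_m3 * 1000
print(round(volume_litres))  # 15040
```

Roughly 15 m³ per 100 m² roof in June alone illustrates why the abstract treats harvesting as both a supply supplement and a flood-reduction measure.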

Keywords: Ibadan, rainwater harvesting, sustainable water, urban flooding

Procedia PDF Downloads 177
28623 Estimation of Greenhouse Gas (GHG) Reductions from Solar Cell Technology Using Bottom-up Approach and Scenario Analysis in South Korea

Authors: Jaehyung Jung, Kiman Kim, Heesang Eum

Abstract:

Solar cells are one of the main technologies for reducing greenhouse gas (GHG) emissions. Accurate estimation of the GHG reduction achieved by solar cell technology is therefore crucial for its strategic application. The bottom-up approach, which uses operating data such as operation time and efficiency, is one methodology for improving the accuracy of the estimation. In this study, alternative GHG reductions from solar cell technology were estimated by a bottom-up approach for indirect emission sources (scope 2) in Korea in 2015. In addition, a scenario-based analysis was conducted to assess the effect of technological change with respect to efficiency improvement and rate of operation. To estimate GHG reductions from solar cell activities at the operating-condition level, methodologies were derived from the 2006 IPCC Guidelines for National Greenhouse Gas Inventories and the guidelines for local government greenhouse inventories published in Korea in 2016. Indirect emission factors for electricity were obtained from the Korea Power Exchange (KPX) for 2011. As a result, the annual alternative GHG reductions were estimated at 21,504 tonCO2eq, with an annual average of 1,536 tonCO2eq per solar cell installation. These estimates correspond to 91% of design capacity. Estimation of individual greenhouse gases showed that the largest contributor was carbon dioxide (CO2), accounting for up to 99% of the total. The annual average GHG reduction from solar cells per year and unit installed capacity (MW) was estimated at 556 tonCO2eq/yr•MW. In the scenario analysis, efficiency improvements of 5%, 10%, and 15% increased the annual GHG reductions by approximately 30%, 61%, and 91%, respectively, while raising the rate of operation to 100% increased them by 4%.
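A figure of the same order as the reported 556 tonCO2eq/yr•MW can be reproduced by the standard bottom-up arithmetic, generation multiplied by the grid emission factor. The capacity factor and emission factor below are illustrative assumptions, not the KPX values used in the study:

```python
# Avoided emissions = generation (MWh) * grid emission factor (tCO2eq/MWh).
# Both the capacity factor and the emission factor are assumed illustrative
# values, not the study's inputs.
capacity_mw = 1.0
capacity_factor = 0.14   # assumed average fraction of the year at rated output
hours_per_year = 8760
ef = 0.46                # assumed grid emission factor, tCO2eq per MWh

generation_mwh = capacity_mw * capacity_factor * hours_per_year
reduction = generation_mwh * ef
print(round(reduction))  # 564, the same order as the 556 tonCO2eq/yr.MW reported
```

The scenario analysis in the abstract amounts to rescaling `generation_mwh` (higher efficiency or rate of operation) and recomputing the same product.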

Keywords: bottom-up approach, greenhouse gas (GHG), reduction, scenario, solar cell

Procedia PDF Downloads 218
28622 Steps towards the Development of National Health Data Standards in Developing Countries

Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray

Abstract:

The proliferation of health data standards today is somewhat overlapping and conflicting, resulting in market confusion and growing proprietary interests. The government's role and support in standardization for health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has not explored the different steps the government must undertake towards the development of national health data standards. Based on the lessons learned from a qualitative study investigating the issues affecting the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, together with opinions and feedback from experts in data exchange, standards, and medical informatics in Saudi Arabia and the UK, a list of steps required for the development of national health data standards was constructed. The main steps are the existence of a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; more important still is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners in planning health data standards, particularly in developing countries.

Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia

Procedia PDF Downloads 336
28621 A Proposal for U-City (Smart City) Service Method Using Real-Time Digital Map

Authors: SangWon Han, MuWook Pyeon, Sujung Moon, DaeKyo Seo

Abstract:

Recently, technologies based on three-dimensional (3D) space information are being developed and quality of life is improving as a result. Research on real-time digital map (RDM) is being conducted now to provide 3D space information. RDM is a service that creates and supplies 3D space information in real time based on location/shape detection. Research subjects on RDM include the construction of 3D space information with matching image data, complementing the weaknesses of image acquisition using multi-source data, and data collection methods using big data. Using RDM will be effective for space analysis using 3D space information in a U-City and for other space information utilization technologies.

Keywords: RDM, multi-source data, big data, U-City

Procedia PDF Downloads 429
28620 Simulation-Based Parametric Study for the Hybrid Superplastic Forming of AZ31

Authors: Fatima Ghassan Al-Abtah, Naser Al-Huniti, Elsadig Mahdi

Abstract:

As the lightest constructional metal on earth, magnesium alloys offer excellent potential for weight reduction in the transportation industry, and some magnesium alloys exhibit superior ductility and superplastic behavior at high temperatures. The main limitation of superplastic forming (SPF) is its low production rate, since each part requires a long forming time. In this study, an SPF process that starts with a mechanical pre-forming stage is developed to promote formability and reduce forming time. A two-dimensional finite element model is used to simulate the process. The forming process consists of two steps: at the pre-forming step (deep drawing), the sheet is drawn into the die to a preselected level using a mechanical punch, and at the second step (SPF), pressurized gas is applied at a controlled rate. It is shown that a significant reduction in forming time and improved final thickness uniformity can be achieved with the hybrid forming technique, which produced a fully formed part at 400°C. The impact of different forming process parameters was investigated by comparing the forming time and the distribution of final thickness obtained from the simulation analysis. Maximum thinning decreased from over 67% to less than 55%, forming time decreased by more than 6 minutes, and the required gas pressure profile was predicted for optimum forming process parameters based on a target constant strain rate of 0.001/s within the sheet.

Keywords: magnesium, plasticity, superplastic forming, finite element analysis

Procedia PDF Downloads 151
28619 New Result for Optical OFDM in Code Division Multiple Access Systems Using Direct Detection

Authors: Cherifi Abdelhamid

Abstract:

In optical communication systems, OFDM has received increased attention as a means to overcome various limitations of optical transmission, such as modal dispersion, relative intensity noise, chromatic dispersion, polarization mode dispersion, and self-phase modulation. Multipath dispersion limits the maximum transmission data rate. In this paper, we investigate an OFDM system in which multipath-induced intersymbol interference (ISI) is reduced, and we increase the number of users by combining the OFDM system with an OCDMA system using direct detection, incorporating orthogonal optical codes (OOC) to minimize the bit error rate.
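Orthogonal optical codes work because their auto- and cross-correlations are bounded, which is what lets a direct-detection OCDMA receiver separate users and keep the bit error rate low. A small sketch with an illustrative (13, 3, 1) code pair (the specific codewords are an assumption for demonstration):

```python
# An illustrative (13, 3, 1) orthogonal optical code pair: length n = 13,
# weight w = 3, with auto- and cross-correlation bounded by lambda = 1.
n = 13
c1 = [1 if i in (0, 1, 4) else 0 for i in range(n)]
c2 = [1 if i in (0, 2, 7) else 0 for i in range(n)]

def correlation(a, b, shift):
    # Periodic correlation of two binary codewords at a given shift.
    return sum(a[i] * b[(i + shift) % n] for i in range(n))

# The autocorrelation peak equals the code weight, every autocorrelation
# sidelobe stays <= 1, and the cross-correlation stays <= 1 at every shift.
auto_peak = correlation(c1, c1, 0)
auto_side = max(correlation(c1, c1, s) for s in range(1, n))
cross_max = max(correlation(c1, c2, s) for s in range(n))
print(auto_peak, auto_side, cross_max)  # 3 1 1
```

The large peak-to-sidelobe gap (3 versus 1) is the margin the intended receiver uses to detect its own user's chips against interference from other users.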

Keywords: OFDM, OCDMA, OOC (orthogonal optical code), (ISI), prim codes (Pc)

Procedia PDF Downloads 649
28618 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-

Authors: Nieto Bernal Wilson, Carmona Suarez Edgar

Abstract:

Organizations hold structured and unstructured information in different formats, sources, and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level, these organizations present some deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as the absence of operational capabilities to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools of great interest to business intelligence, since they are the base repositories for creating models or patterns (behavior of customers, suppliers, products, social networks, and genomics) and facilitate corporate decision-making and research. This paper presents a structured, simple methodology inspired by agile development models such as Scrum, XP, and AUP. It also draws on object-relational and spatial data models and on a baseline of data modeling under UML and big data, thereby seeking to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization, and data mining, particularly for generating patterns and models derived from the structured fact objects.

Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse

Procedia PDF Downloads 401
28617 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis

Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee

Abstract:

In South Korea, it is difficult to obtain data for statistical pipe assessment. In this paper, to address this issue, we examine how the various statistical models presented in previous studies respond to data mixed with noise and whether they apply in South Korea. Three major types of model are studied; where data are reported in the original papers, we add noise to the data and observe how the model response changes. Moreover, we generate data from the models in those papers and analyse the effect of noise. From this, we can assess the robustness and the applicability in Korea of each model.
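The robustness check described, adding noise to data and observing how the model response changes, can be sketched on a toy linear deterioration model. This is illustrative only: the paper's models are survival and proportional-hazard models, and the data below are synthetic:

```python
import random

random.seed(0)  # fixed seed so the noise experiment is reproducible

# Synthetic "pipe deterioration" data: a failure score growing linearly with age.
ages = [float(a) for a in range(1, 41)]
true_slope, true_intercept = 0.5, 2.0
clean = [true_intercept + true_slope * a for a in ages]

def ols_slope(x, y):
    # Ordinary least-squares slope: cov(x, y) / var(x).
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

# Re-fit after adding Gaussian noise and compare against the clean fit.
noisy = [v + random.gauss(0, 1.0) for v in clean]
print(round(ols_slope(ages, clean), 3))  # 0.5
print(abs(ols_slope(ages, noisy) - true_slope))
```

A model whose estimated parameters move little under such perturbation is robust in the paper's sense; repeating the experiment at increasing noise levels traces out how quickly each model degrades.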

Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences

Procedia PDF Downloads 739
28616 Automated Testing to Detect Instance Data Loss in Android Applications

Authors: Anusha Konduru, Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai

Abstract:

The number of mobile applications is growing rapidly, each addressing the requirements of many users. However, quick development and enhancement cycles introduce many underlying defects. Android apps create and handle a large variety of 'instance' data that must persist across runs, such as the current navigation route, workout results, antivirus settings, or game state. Due to the nature of Android, an app can be paused, sent into the background, or killed at any time. If the instance data is not saved and restored between runs, then in addition to data loss, partially saved or corrupted data can crash the app upon resume or restart. It is difficult for a programmer to manually test for this issue across all activities, so data entered by the user may fail to be saved whenever the app is interrupted. This degrades the user experience, because the user must re-enter the information after every interruption. Automated testing to detect such data loss is therefore important. This research proposes DroidDL, a data loss detector for Android, which detects instance data loss in a given Android application. We tested 395 applications and found 12 with data loss issues. The approach proved highly accurate and reliable in finding apps with this defect, and can be used by Android developers to avoid such errors.
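The kind of check such a tool automates can be sketched language-neutrally (here in Python): populate activity state, simulate a kill/restart cycle, and verify the state survives. The `Activity` class below is a stand-in for the lifecycle callbacks, not the Android API:

```python
# Sketch of the save/kill/restore cycle that an instance-data-loss detector
# exercises; a hypothetical stand-in, not the Android framework itself.

class Activity:
    def __init__(self):
        self.state = {}          # user-entered "instance" data

    def on_save_instance_state(self, bundle):
        bundle.update(self.state)

    def on_restore_instance_state(self, bundle):
        self.state = dict(bundle)

def simulate_kill_and_restart(activity, saves_state=True):
    bundle = {}
    if saves_state:
        activity.on_save_instance_state(bundle)
    fresh = Activity()           # process killed, activity recreated
    fresh.on_restore_instance_state(bundle)
    return fresh

def has_data_loss(activity, saves_state):
    restored = simulate_kill_and_restart(activity, saves_state)
    return restored.state != activity.state

a = Activity()
a.state = {"route": "A->B", "workout_km": 5}
print(has_data_loss(a, saves_state=True))   # False: a correct app loses nothing
print(has_data_loss(a, saves_state=False))  # True: a buggy app loses the data
```

An automated detector runs this cycle against every activity of an app under test instead of relying on the programmer to do it by hand.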

Keywords: Android, automated testing, activity, data loss

Procedia PDF Downloads 233
28615 Big Data: Appearance and Disappearance

Authors: James Moir

Abstract:

The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data, ranging from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and show their appearance through causal mechanisms, while twentieth-century science attempted to save the appearances and relinquish causal explanations. Now twenty-first-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back towards a rule- or law-governed model of reality and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes any context-specific conceptual sensitivity disappear.

Keywords: big data, appearance, disappearance, surface, epistemology

Procedia PDF Downloads 416
28614 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images

Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann

Abstract:

FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique for determining the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data processing part is still under development. In this paper, we formulate and solve a data selection problem that enhances the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., data that barely reduce the confidence interval of the estimated parameters and can thus be neglected. Based on sensitivity analysis, we both solve the problem of optimal data space selection and find specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, a theorem stating that the integrated data approach is less precise than the full data case is proven; i.e., we claim that the data set represented by the FRAP recovery curve leads to a larger confidence interval than the spatio-temporal (full) data.
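The integrated-versus-full-data claim can be illustrated numerically with a sensitivity (Fisher information) calculation on a toy recovery model y_ij = g_i · exp(-k·t_j) + noise; the model, the weights g_i, and all numbers are illustrative stand-ins for the actual FRAP model:

```python
import numpy as np

# Toy FRAP-like model: pixel i at frame j observes g_i * exp(-k * t_j) + noise.
k = 0.3                                   # mobility parameter to identify
t = np.linspace(0.1, 10.0, 40)            # frame times
g = np.linspace(0.2, 1.0, 25)             # illustrative spatial bleach profile
sigma = 0.05                              # per-pixel noise s.d.

# Sensitivities dy/dk at each (pixel, frame): -g_i * t_j * exp(-k * t_j).
S = -np.outer(g, t * np.exp(-k * t))      # shape (25, 40)

# Fisher information with full spatio-temporal data: sum of squared sensitivities.
info_full = (S**2).sum() / sigma**2

# Integrated recovery curve: pixels are summed per frame, so the sensitivities
# add up, but so do the noise variances (n independent pixels per frame).
info_curve = (S.sum(axis=0)**2).sum() / (len(g) * sigma**2)

ci_full = 1.96 / np.sqrt(info_full)       # asymptotic 95% CI half-widths
ci_curve = 1.96 / np.sqrt(info_curve)
print(ci_full < ci_curve)                 # True: full data is more precise
```

By the Cauchy-Schwarz inequality, (Σ_i s_i)² ≤ n Σ_i s_i², so the curve's information never exceeds the full data's, with equality only when all pixels are equally sensitive; this is the mechanism behind the theorem stated in the abstract.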

Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design

Procedia PDF Downloads 273
28613 Exploring the Feasibility of Utilizing Blockchain in Cloud Computing and AI-Enabled BIM for Enhancing Data Exchange in Construction Supply Chain Management

Authors: Tran Duong Nguyen, Marwan Shagar, Qinghao Zeng, Aras Maqsoodi, Pardis Pishdad, Eunhwa Yang

Abstract:

Construction supply chain management (CSCM) involves the collaboration of many disciplines and actors, which generates vast amounts of data. However, inefficient, fragmented, and non-standardized data storage often hinders this data exchange. The industry has adopted building information modeling (BIM), a digital representation of a facility's physical and functional characteristics, to improve collaboration, enhance transmission security, and provide a common data exchange platform. Still, the volume and complexity of the data require tailored information categorization, aligned with stakeholders' preferences and demands. To address this, artificial intelligence (AI) can be integrated to handle the data's magnitude and complexity. This research aims to develop an integrated and efficient approach for data exchange in CSCM by utilizing AI. The paper covers five main objectives: (1) investigate existing frameworks and BIM adoption; (2) identify challenges in data exchange; (3) propose an integrated framework; (4) enhance data transmission security; and (5) develop data exchange in CSCM. The proposed framework demonstrates how integrating BIM with other technologies, such as cloud computing, blockchain, and AI applications, can significantly improve the efficiency and accuracy of data exchange in CSCM.
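The role blockchain plays for transmission security can be sketched as a minimal hash chain over data-exchange records: each event is linked to the previous one, so tampering with any record invalidates every later link. The record fields below are hypothetical, not from the paper's framework:

```python
import hashlib
import json

# Minimal hash-chain sketch of tamper-evident BIM data-exchange records.

def block_hash(payload):
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_block(chain, record):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"record": record, "prev": prev}
    block["hash"] = block_hash({"record": record, "prev": prev})
    chain.append(block)

def verify(chain):
    prev = "0" * 64
    for b in chain:
        expected = block_hash({"record": b["record"], "prev": b["prev"]})
        if b["prev"] != prev or b["hash"] != expected:
            return False
        prev = b["hash"]
    return True

chain = []
append_block(chain, {"actor": "architect", "doc": "model_v1.ifc"})
append_block(chain, {"actor": "contractor", "doc": "model_v2.ifc"})
print(verify(chain))                       # True: chain intact
chain[0]["record"]["doc"] = "tampered.ifc"
print(verify(chain))                       # False: tampering detected
```

A production system would add signatures and distributed consensus on top of this linking; the chain structure is what makes exchanged BIM records auditable.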

Keywords: construction supply chain management, BIM, data exchange, artificial intelligence

Procedia PDF Downloads 15
28612 Representation Data without Lost Compression Properties in Time Series: A Review

Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan

Abstract:

Uncertain data is believed to be an important issue in building a prediction model. The main objective of time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and fit a low-dimensional model prior to a prediction task. This paper discusses the performance of a number of techniques for dealing with uncertain data, specifically those that handle the uncertain data condition while minimizing the loss of compression properties.
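One widely used representation with the low-dimensional, compression-preserving character discussed above is Piecewise Aggregate Approximation (PAA), shown here as an illustrative example rather than the specific technique the review evaluates:

```python
import numpy as np

# Piecewise Aggregate Approximation (PAA): compress a time series to
# n_segments values by averaging equal-length windows, preserving gross shape.

def paa(series, n_segments):
    series = np.asarray(series, dtype=float)
    chunks = np.array_split(series, n_segments)
    return np.array([c.mean() for c in chunks])

signal = np.sin(np.linspace(0, 2 * np.pi, 120))   # 120-point series
compressed = paa(signal, 8)                       # -> 8 segment means
print(compressed.shape)                           # (8,)
```

For uncertain series, the same windowed averaging also damps independent per-point noise, which is one reason aggregate representations feature in uncertainty-aware prediction pipelines.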

Keywords: compression properties, uncertainty, uncertain time series, mining technique, weather prediction

Procedia PDF Downloads 423
28611 The Neoliberal Social-Economic Development and Values in the Baltic States

Authors: Daiva Skuciene

Abstract:

The Baltic States turned to the free market and capitalism after independence. A new socioeconomic system formed, along with democracy and priorities concerning the welfare of citizens. Research shows that the Baltic States chose a neoliberal path of development. Related to this neoliberal path, a few questions arise: how do people evaluate the results of such policy and socioeconomic development? What are their priorities? And what are the values of the Baltic societies that support neoliberal policy? The purpose of this research is to analyze the socioeconomic context and the priorities and values of the Baltic societies related to the neoliberal regime. The main objectives are: first, to analyze the neoliberal socioeconomic features and results; second, to analyze people's opinions and priorities regarding the results of neoliberal development; third, to analyze the values of the Baltic societies related to neoliberal policy. To fulfil this purpose and these objectives, comparative analyses among European countries are used. The neoliberal regime was defined through two indicators: taxes on capital income and expenditure on social protection. The socioeconomic outcomes of the neoliberal welfare regime are defined through Gini inequality and the at-risk-of-poverty rate. For this analysis, Eurostat data for 2002-2013 were used. For the analyses of opinions about inequality, preferences regarding the kind of society people want to live in, and preferences for the distribution between capital and wages in enterprises, Eurobarometer data for 2010-2014 and data from a representative survey in the Baltic States in 2016 were used. The justice variable was selected as a variable reflecting the evaluation of the socioeconomic context and was analyzed using Eurobarometer data for 2006-2015. For the analysis of values, solidarity, equality, and individual responsibility were selected. Solidarity and equality were analyzed using Eurobarometer data for 2006-2015. The value of individual responsibility was examined through opinions about the reasons for inequality and poverty; the 2016 survey of the population in the Baltic States and Eurobarometer data were used for this aim. The data are ranked in descending order to situate the opinions of people in the Baltic States among European countries. The dynamics of the indicators are also provided to examine the stability of values. The main findings are that people in the Baltics are dissatisfied with the results of neoliberal socioeconomic development and give priority to equality and justice, but they have internalized the main neoliberal narrative of individual responsibility. The impact of the socioeconomic context on values is large, resulting in a change in otherwise quite stable opinions and values during the period of the financial crisis.

Keywords: neoliberal, inequality and poverty, solidarity, individual responsibility

Procedia PDF Downloads 254
28610 Monitoring and Improving Performance of Soil Aquifer Treatment System and Infiltration Basins of North Gaza Emergency Sewage Treatment Plant as Case Study

Authors: Sadi Ali, Yaser Kishawi

Abstract:

As part of Palestine, the Gaza Strip (365 km², 1.8 million inhabitants) is a semi-arid zone that relies solely on the Coastal Aquifer. The coastal aquifer is the only source of water, with only 5-10% of it suitable for human use; this barely covers the domestic and agricultural needs of the Gaza Strip. The Palestinian Water Authority's strategy is to develop a non-conventional water resource from treated wastewater that irrigates 1,500 hectares and serves over 100,000 inhabitants. A new WWTP project is to replace the old, overloaded Biet Lahia WWTP. The project consists of three parts: phase A (a pressure line and 9 infiltration basins, IBs), phase B (a new WWTP), and phase C (a Recovery and Reuse Scheme, RRS, to capture the spreading plume). Phase A has been functioning since April 2009, and since then a monitoring plan has been in place to track the infiltration rate (I.R.) of the 9 basins. Nearly 23 million m³ of partially treated wastewater had been infiltrated by June 2014. It is important to maintain an acceptable rate so that the basins can handle the incoming quantities (currently 10,000 m³ are pumped and infiltrated daily). The methodology applied was to review and analyze the collected data, including the I.R.s, the wastewater quality, and the drying-wetting schedule of the basins. One of the main findings is the relation between total suspended solids (TSS) at BLWWTP and the I.R. at the basins. From April 2009, the basins achieved an average I.R. of about 2.5 m/day. The records then showed a decreasing pattern in the average rate until it reached a low of 0.42 m/day in June 2013, accompanied by an increase in TSS concentration at the source to above 200 mg/l. Reducing the TSS concentration (by cleaning the wastewater source ponds at the Biet Lahia WWTP site) directly improved the I.R., which recovered over the following 6 months from 0.42 m/day to 0.66 m/day and then to nearly 1.0 m/day as the average of the last 3 months of 2013. The wetting-drying scheme of the basins (3 days wetting and 7 days drying) was observed, alongside rainfall rates. Although this scheme is difficult to apply exactly, the flow to each basin was controlled to improve the I.R. The drying-wetting system affected the I.R. of individual basins and thus the overall system rate, which was recorded and assessed. Ploughing of the infiltration basins was also recommended at certain times to maintain a given infiltration level, as it breaks up the confining clogging layer that impedes infiltration. It is recommended to maintain a proper quality of infiltrated wastewater to ensure acceptable IB performance. Continual maintenance of the settling ponds at BLWWTP, continual ploughing of the basins, and soil treatment techniques at the IBs will improve the I.R.s. When the new WWTP comes online, effluent of high standard quality (TSS 20 mg/l, BOD 20 mg/l, and TN 15 mg/l) will be infiltrated, which will further enhance the I.R.s of the IBs due to the lower organic load.

Keywords: soil aquifer treatment, recovery and reuse scheme, infiltration basins, North Gaza

Procedia PDF Downloads 240
28609 Data Mining As A Tool For Knowledge Management: A Review

Authors: Maram Saleh

Abstract:

Knowledge has become an essential resource in today's economy and the most important asset for maintaining competitive advantage in organizations. The importance of knowledge has led organizations to manage their knowledge assets and resources through all the stages of knowledge management: knowledge creation, knowledge storage, knowledge sharing, and knowledge use. Research on data mining has continued to grow in recent years in both business and educational fields. Data mining is one of the most important steps of the knowledge discovery in databases process, aiming to extract implicit, unknown, but useful knowledge, and it is considered a significant subfield of knowledge management. Data mining has great potential to help organizations focus on extracting the most important information from their data warehouses. Data mining tools and techniques can predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. This review paper explores the applications of data mining techniques in supporting the knowledge management process as an effective knowledge discovery technique. In this paper, we identify the relationship between data mining and knowledge management and then introduce some applications of data mining techniques to knowledge management in real-life domains.
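A concrete instance of the pattern extraction that data mining contributes to knowledge discovery is the counting step of frequent-itemset mining (the core of Apriori), sketched here on a toy transaction set; the transactions and support threshold are illustrative:

```python
from collections import Counter
from itertools import combinations

# Toy frequent-pair mining: which pairs of items co-occur in at least
# min_support of the transactions? This is the counting core of Apriori.

transactions = [
    {"report", "dashboard"},
    {"report", "dashboard", "export"},
    {"report", "export"},
    {"dashboard", "export"},
]

def frequent_pairs(transactions, min_support=0.5):
    counts = Counter()
    for t in transactions:
        for pair in combinations(sorted(t), 2):
            counts[pair] += 1
    n = len(transactions)
    return {p: c / n for p, c in counts.items() if c / n >= min_support}

print(frequent_pairs(transactions))
# {('dashboard', 'report'): 0.5, ('dashboard', 'export'): 0.5, ('export', 'report'): 0.5}
```

Discovered co-occurrence patterns like these become explicit, shareable knowledge, which is exactly the bridge from data mining to the knowledge creation and sharing stages discussed above.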

Keywords: data mining, knowledge management, knowledge discovery, knowledge creation

Procedia PDF Downloads 202
28608 Comparative Study of Dermal Regeneration Template Made by Bovine Collagen with and without Silicone Layer in the Treatment of Post-Burn Contracture

Authors: Elia Caldini, Cláudia N. Battlehner, Marcelo A. Ferreira, Rolf Gemperli, Nivaldo Alonso, Luiz P. Vana

Abstract:

The advent of dermal regeneration templates has fostered major advances in the treatment of acute burns and their sequelae over the last two decades. However, both data on the morphological aspects of the newly formed tissue and clinical trials comparing different templates are still lacking. The goal of this study was to prospectively analyze the outcome of patients treated with two of the existing templates, followed by a thin skin autograft. Both are made of bovine collagen; one includes a superficial silicone layer. Surgery was performed on patients with impaired mobility resulting from burn sequelae (n = 12 per template). Negative pressure therapy was applied post-surgically, and patients were monitored for 12 months. Data on scar skin quality (Vancouver and POSAS evaluation scales), the rate of joint mobility recovery, and graft contraction were recorded. Improvement in mobility and skin quality was demonstrated, along with graft contraction, in all patients. The silicone-coupled template performed best in all aspects.

Keywords: dermal regeneration template, artificial skin, skin quality, scar contracture

Procedia PDF Downloads 145
28607 Analysis of Delamination in Drilling of Composite Materials

Authors: Navid Zarif Karimi, Hossein Heidary, Giangiacomo Minak, Mehdi Ahmadi

Abstract:

In this paper, an analytical model based on the mechanics of oblique cutting, linear elastic fracture mechanics (LEFM), and bending plate theory is presented to determine the critical feed rate causing delamination in the drilling of composite materials. Most models in this area use only LEFM and bending plate theory; hence, they can determine only the critical thrust force, which is not a directly controllable process parameter. In this model, by adding oblique cutting mechanics to previous models, the critical feed rate is determined. Also, instead of a simplified loading condition, the actual thrust force induced by the chisel edge and cutting lips on the composite plate is modeled.
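The classical LEFM-plus-plate-bending result that such models build on is the Ho-Cheng and Dharan critical thrust force at delamination onset, F_crit = π·sqrt(8·G_IC·E·h³ / (3(1-ν²))). A quick evaluation with illustrative (not paper-specific) carbon/epoxy values:

```python
import math

# Classical critical thrust force at delamination onset (Ho-Cheng & Dharan):
# F_crit = pi * sqrt(8 * G_IC * E * h^3 / (3 * (1 - nu^2)))   [N]
def critical_thrust_force(G_IC, E, h, nu):
    return math.pi * math.sqrt(8.0 * G_IC * E * h**3 / (3.0 * (1.0 - nu**2)))

# Illustrative carbon/epoxy laminate properties:
G_IC = 250.0       # mode-I interlaminar fracture toughness, J/m^2
E = 70e9           # effective flexural modulus, Pa
h = 0.4e-3         # uncut laminate thickness under the drill point, m
nu = 0.3           # Poisson's ratio

F = critical_thrust_force(G_IC, E, h, nu)
print(round(F, 1), "N")   # on the order of 180 N for these values
```

Because thrust force depends on feed rate through the cutting mechanics, inverting a thrust-force-versus-feed relation at F_crit is what yields the directly controllable critical feed rate that the abstract's model targets.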

Keywords: composite material, delamination, drilling, thrust force

Procedia PDF Downloads 512