Search results for: data reconstruction

24827 Comparing the Knee Kinetics and Kinematics during Non-Steady Movements in Recovered Anterior Cruciate Ligament Injured Badminton Players against an Uninjured Cohort: Case-Control Study

Authors: Anuj Pathare, Aleksandra Birn-Jeffery

Abstract:

Background: The anterior cruciate ligament (ACL) helps stabilize the knee joint by minimizing anterior tibial translation. ACL injury is common in racquet sports and often occurs due to sudden acceleration, deceleration or changes of direction. In badminton, this mechanism most commonly occurs during landing after an overhead stroke. Knee biomechanics during dynamic movements such as walking, running and stair negotiation do not return to normal for more than a year after ACL reconstruction. This change in biomechanics may lead to re-injury during the non-steady movements in sport, where these injuries are most prevalent. Aims: To assess whether knee kinetics and kinematics in athletes recovered from ACL injury return to the same level as those of an uninjured cohort during standard movements used for clinical assessment and during badminton shots. Objectives: The objectives of the study were to determine: knee valgus during the single leg squat, vertical drop jump, net shot and drop shot; the degree of internal or external rotation during the single leg squat, vertical drop jump, net shot and drop shot; and maximum knee flexion during the single leg squat, vertical drop jump and net shot. Methods: This case-control study included 14 participants: three athletes recovered from ACL injury and 11 uninjured participants. The participants performed various functional tasks, including the vertical drop jump, single leg squat, forehand net shot and forehand drop shot. The data were analysed using a two-way ANOVA, and reliability was evaluated using the intraclass correlation coefficient. Results: The data showed a significant decrease in the range of knee rotation in ACL-injured participants compared to the uninjured cohort (F₇,₅₅₆=2.37; p=0.021). There was also a decrease in maximum knee flexion angles and an increase in knee valgus angles in ACL-injured participants, although these were not statistically significant. Conclusion: There was a significant decrease in knee rotation angles in the ACL-injured participants, which could be a potential cause of re-injury in these athletes in the future. Although the decreases in maximum knee flexion angles and increases in knee valgus angles were not significant, this may be due to the limited sample of ACL-injured participants; these measures remain potential variables of interest in the rehabilitation of ACL injuries. These changes in knee biomechanics could be vital in the rehabilitation of ACL-injured athletes, and the inclusion of sport-specific tasks, e.g., the net shot, alongside standard protocol movements for ACL assessment would provide a better measure of an athlete's rehabilitation.
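The statistical pipeline is only named in the abstract; a minimal sketch of how such a group-by-task two-way ANOVA could be run in Python (file and column names are hypothetical, not from the study) is:

```python
# Minimal two-way ANOVA sketch (injury group x task) for a knee-angle outcome.
# Column names (group, task, rotation_range) are hypothetical stand-ins.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("knee_kinematics.csv")  # hypothetical file: one row per trial

# Model knee rotation range as a function of group, task, and their interaction
model = ols("rotation_range ~ C(group) * C(task)", data=df).fit()
anova_table = sm.stats.anova_lm(model, typ=2)
print(anova_table)  # F statistics and p-values per factor
```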

Keywords: ACL, biomechanics, knee injury, racquet sport

Procedia PDF Downloads 170
24826 Spectroscopic Autoradiography of Alpha Particles on Geologic Samples at the Thin Section Scale Using a Parallel Ionization Multiplier Gaseous Detector

Authors: Hugo Lefeuvre, Jerôme Donnard, Michael Descostes, Sophie Billon, Samuel Duval, Tugdual Oger, Herve Toubon, Paul Sardini

Abstract:

Spectroscopic autoradiography is a method of interest for geological sample analysis. Indeed, researchers may face issues such as radioelement identification and quantification in the field of environmental studies. Imaging gaseous ionization detectors find their place in geosciences for conducting specific measurements of radioactivity to improve the monitoring of natural processes using naturally occurring radioactive tracers, but also in the nuclear industry linked to the mining sector. In geological samples, the location and identification of the radioactive-bearing minerals at the thin-section scale remains a major challenge, as the detection limit of the usual elementary microprobe techniques is far higher than the concentration of most of the natural radioactive decay products. The spatial distribution of each decay product, in the case of uranium in a geomaterial, is of interest for relating radionuclide concentrations to the mineralogy. The present study aims to provide a spectroscopic autoradiography method for measuring the initial energy of alpha particles with a parallel ionization multiplier gaseous detector. The analysis method has been developed using Geant4 modelling of the detector. The tracks of alpha particles recorded in the gas detector allow the simultaneous measurement of the initial point of emission and the reconstruction of the initial particle energy via a selection based on the linear energy distribution. This spectroscopic autoradiography method was successfully used to reproduce the alpha spectra of the 238U decay chain on a geological sample at the thin-section scale. The characteristics of this measurement are an energy spectrum resolution of 17.2% (FWHM) at 4647 keV and a spatial resolution of at least 50 µm. Even if the efficiency of energy spectrum reconstruction is low (4.4%) compared to that of a simple autoradiograph (50%), this novel approach offers the opportunity to select areas on an autoradiograph and perform an energy spectrum analysis within them. This opens up possibilities for the detailed analysis of heterogeneous geological samples containing natural alpha emitters such as uranium-238 and radium-226. This measurement will allow the study of the spatial distribution of uranium and its descendants in geomaterials by coupling it with scanning electron microscope characterizations. The direct application of this dual (energy-position) analysis modality will be the subject of future developments. The measurement of the radioactive equilibrium state of heterogeneous geological structures and the quantitative mapping of 226Ra radioactivity are now being actively studied.
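As a quick worked example of the quoted figure: a 17.2% FWHM resolution at 4647 keV corresponds to an absolute peak width of roughly 799 keV, i.e. a Gaussian sigma of about 339 keV (using FWHM = 2√(2 ln 2) σ ≈ 2.355 σ; the Gaussian peak shape is an assumption here):

```python
# Convert the quoted relative FWHM resolution into absolute peak width.
import math

E = 4647.0          # peak energy in keV, as quoted in the abstract
rel_fwhm = 0.172    # 17.2% FWHM resolution quoted at this energy

fwhm = rel_fwhm * E                               # ~799 keV full width at half maximum
sigma = fwhm / (2 * math.sqrt(2 * math.log(2)))   # ~339 keV, assuming a Gaussian peak
print(f"FWHM = {fwhm:.0f} keV, sigma = {sigma:.0f} keV")
```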

Keywords: alpha spectroscopy, digital autoradiography, mining activities, natural decay products

Procedia PDF Downloads 146
24825 Metrology in Egyptian Architecture, Interrelation with Archaeology

Authors: Monica M. Marcos

Abstract:

In the framework of archaeological research and heritage conservation and restoration, the object of study is the metrology applied in the composition of religious architecture in ancient Egypt and its usefulness in archaeology. The objective is the determination of the geometric and metrological relations in architectural models and of the module used in the initial project of the buildings. The study and data collection covers religious buildings, tombs and temples of ancient Egypt, completed with plans. The systematization of measurements and the modulation of the buildings make it possible to establish common compositional parameters, with a module determined by the unit of measurement used. The measurement system used throughout the main periods of Egyptian history was the Egyptian royal cubit. Analysis of the units of measurement used in architectural design provides exact figures for the dimensions of buildable spaces. It allows proportional relationships to be established between them and a geometric composition module to be found, on which the original project was based. This responds to a philosophical and functional concept of the projected spaces. In the field of heritage rehabilitation and restoration, knowledge of metrology helps in the excavation, reconstruction and restoration of construction elements. The correct use of metrology contributes to the identification of possible work areas, helping to locate damaged or missing areas. Also in restoration projects, metrology is useful for reordering and locating decontextualized parts of buildings. Converting measurements taken in the current International System into ancient Egyptian units makes their conceptual purpose and functionality understandable, which makes archaeological intervention easier. In work carried out in archaeological excavations, metrology is an essential tool for locating sites and establishing work zones.
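The abstract does not fix a numeric value for the royal cubit; assuming the commonly cited figure of roughly 0.5236 m (7 palms = 28 digits), a minimal SI-to-cubit conversion of the kind described might look like this:

```python
# SI-to-royal-cubit conversion sketch. The cubit length is an assumption:
# a commonly cited value of ~0.5236 m; the paper itself does not fix one.
ROYAL_CUBIT_M = 0.5236
DIGITS_PER_CUBIT = 28

def metres_to_cubits(length_m: float) -> float:
    """Express a measured length in royal cubits."""
    return length_m / ROYAL_CUBIT_M

# Example: a 10.47 m wall comes out very close to a round 20 cubits,
# hinting at the module used in the original project.
wall = 10.47
print(f"{wall} m = {metres_to_cubits(wall):.2f} royal cubits")
```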

Keywords: egyptology, metrology, archaeology, measurements, Egyptian cubit

Procedia PDF Downloads 17
24824 Introducing Future Smart Transport Solution for Women with Disabilities: A Review with Chongqing as the Focal Example

Authors: Xinyi Gao, Xiaoyun Feng, Ruijie Liu, Yumin Xia, Min Shao, Xinqing Wang

Abstract:

This paper outlines the travel challenges faced by women with disabilities and the lack of attention society and research pay to them, and it chooses the Chongqing area as a case study to explore how terrain characteristics and city construction influence our subjects' travel choices. It also highlights future transport options and the necessity of addressing the difficult travel situation of women with disabilities. The study focuses on the travel demands of women with disabilities, illustrating what their ideal method of travel would be. An analysis of related smart cities, such as Hong Kong, illustrates the aspects to consider in the reconstruction of Chongqing. Finally, drawing on current smart city modelling approaches, several design ideas for assistive tools are suggested for the safety of women with disabilities during travel.

Keywords: future smart city, disabled women, Chongqing, inclusive design, human-computer interaction

Procedia PDF Downloads 113
24823 The Influence of Modern Islamic Thought Liberalization to the Improvement of Science

Authors: Muhammad Ilham Agus Salim

Abstract:

The liberalization of Islamic thought has had an impact not only on the Muslim community's worldview but also on the reconstruction of contemporary general science. This can be seen in the emergence of Western and Eastern intellectual movements that try to reconstruct contemporary science, arguing that today's scientific culture is unable to lead its audience toward a better order of society. This liberalization of Islamic thought has had a huge influence on the multidimensional crisis in sectors such as the economy, culture, politics and ecology. Therefore, this paper examines the effects of the liberalization of contemporary Islamic thought on the development of modern science. The method used in this paper is a textual study of the Qur'an, the Hadith (prophetic tradition) and the history of contemporary Islamic thought, comparing them with the actual development of science today. The study finds that the liberalization of Islamic thought has created a crisis and a stagnation in the development of scientific disciplines.

Keywords: liberalization, science, Islam, al-Qur’an textual studies

Procedia PDF Downloads 393
24822 Effect of Preloading on Long-Term Settlement of Closed Landfills: A Numerical Analysis

Authors: Mehrnaz Alibeikloo, Hajar Share Isfahani, Hadi Khabbaz

Abstract:

In recent years, with developing cities and increasing populations, reconstruction on closed landfill sites has become unavoidable in some regions. Long-term settlement is one of the major concerns associated with construction on landfills after closure. The purpose of this research is to evaluate the effect of preloading, in various combinations of height and duration, on the long-term settlement of closed landfills. In this regard, five surcharge scenarios, from 1 to 3 m high over 3, 4.5 and 6 months of preloading time, have been modeled using the PLAXIS 2D software. Moreover, the numerical results have been compared to those obtained from analytical methods, and good agreement has been achieved. The findings indicate a linear relationship between settlement and surcharge height. Although long-term settlement decreased with longer and higher preloading, preloading time was found to be a more effective factor than preloading height.

Keywords: preloading, long-term settlement, landfill, PLAXIS 2D

Procedia PDF Downloads 190
24821 Cryptographic Protocol for Secure Cloud Storage

Authors: Luvisa Kusuma, Panji Yudha Prakasa

Abstract:

Cloud storage, as a sub-service of infrastructure as a service (IaaS) in cloud computing, is a model of networked storage where data can be stored on remote servers. In this paper, we propose a secure cloud storage system consisting of two main components: the client, a user of the cloud storage service, and the server, which provides the cloud storage service. For this system, we propose protocol schemes to guard against security attacks on data in transit. The protocols are a login protocol, an upload protocol, a download protocol, and a push data protocol, which implement a hybrid cryptographic mechanism in which data are encrypted before being sent to the cloud, so that the cloud storage provider does not know the users' data and cannot analyse them, because there is no correspondence between the data and the user.
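The protocol internals are not published in the abstract; a minimal sketch of the kind of hybrid client-side scheme described (a fresh symmetric key per object, wrapped with the data owner's public key), using the Python cryptography package with illustrative names, could look like this:

```python
# Hybrid client-side encryption sketch: AES-GCM for the payload, RSA-OAEP to
# wrap the data key. Names and flow are illustrative, not the paper's protocol.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

owner_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

def encrypt_for_upload(plaintext: bytes) -> dict:
    data_key = AESGCM.generate_key(bit_length=256)   # fresh key per object
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key).encrypt(nonce, plaintext, None)
    wrapped_key = owner_key.public_key().encrypt(
        data_key,
        padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))
    # Only ciphertext, nonce and the wrapped key ever reach the provider.
    return {"ct": ciphertext, "nonce": nonce, "wrapped_key": wrapped_key}
```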

Keywords: cloud storage, security, cryptographic protocol, artificial intelligence

Procedia PDF Downloads 351
24820 Decentralized Data Marketplace Framework Using Blockchain-Based Smart Contract

Authors: Meshari Aljohani, Stephan Olariu, Ravi Mukkamala

Abstract:

Data is essential for enhancing the quality of life, and its value creates opportunities for users to profit from data sales and purchases. Users in data marketplaces, however, must share and trade data in a secure and trusted environment while maintaining their privacy. The first main contribution of this paper is to identify the enabling technologies and the challenges facing the development of decentralized data marketplaces. The second is to propose a decentralized data marketplace framework based on blockchain technology. The proposed framework enables sellers and buyers to transact with more confidence. Using a security deposit, the system implements a unique approach for enforcing honesty in data exchange among anonymous individuals. The system also imposes a time window before a transaction is considered complete, during which users can submit disputes to arbitrators, who review them and respond with their decision. Use cases are presented to demonstrate how these technologies help data marketplaces handle issues and challenges.
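The smart contract itself is not reproduced in the abstract; as a language-neutral illustration, the deposit-plus-dispute-window logic it describes could be sketched as the following state machine (all names, amounts and the window length are assumptions):

```python
# Sketch of the described escrow logic: parties post a security deposit, and a
# trade only finalizes after a dispute window elapses. All names and the 24 h
# window are illustrative assumptions, not the paper's contract.
import time

DISPUTE_WINDOW_S = 24 * 3600

class DataTrade:
    def __init__(self, price: int, deposit: int):
        self.price, self.deposit = price, deposit
        self.created = time.time()
        self.state = "PENDING"            # PENDING -> DISPUTED | COMPLETE

    def raise_dispute(self):
        if self.state == "PENDING" and time.time() - self.created < DISPUTE_WINDOW_S:
            self.state = "DISPUTED"       # arbitrators now decide who is paid

    def finalize(self):
        # After the window, the seller receives the price; deposits are returned.
        if self.state == "PENDING" and time.time() - self.created >= DISPUTE_WINDOW_S:
            self.state = "COMPLETE"
```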

Keywords: blockchain, data, data marketplace, smart contract, reputation system

Procedia PDF Downloads 154
24819 A Runge Kutta Discontinuous Galerkin Method for Lagrangian Compressible Euler Equations in Two-Dimensions

Authors: Xijun Yu, Zhenzhen Li, Zupeng Jia

Abstract:

This paper presents a new cell-centered Lagrangian scheme for two-dimensional compressible flow. The new scheme uses a semi-Lagrangian form of the Euler equations. The system of equations is discretized by the discontinuous Galerkin (DG) method using a Taylor basis in Eulerian space. The vertex velocities and the numerical fluxes through cell interfaces are computed consistently by a nodal solver. The mesh moves with the fluid flow. The time marching is implemented by a class of Runge-Kutta (RK) methods. A WENO reconstruction is used as a limiter for the RKDG method. The scheme conserves mass, momentum and total energy. It maintains second-order accuracy and is free of problem-dependent parameters. Results of numerical tests are presented to demonstrate the accuracy and robustness of the scheme.
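For orientation, a second-order member of the RK class named here is typically the two-stage strong-stability-preserving scheme applied to the semi-discrete DG system, with the WENO limiter applied after each stage; in standard notation (an illustration of the method class, not necessarily the authors' exact discretization):

```latex
% Semi-discrete DG system and a two-stage SSP Runge-Kutta step (illustrative)
\frac{d\mathbf{u}_h}{dt} = \mathcal{L}(\mathbf{u}_h), \qquad
\begin{aligned}
\mathbf{u}^{(1)} &= \mathbf{u}^n + \Delta t\,\mathcal{L}(\mathbf{u}^n), \\
\mathbf{u}^{n+1} &= \tfrac{1}{2}\,\mathbf{u}^n
  + \tfrac{1}{2}\left(\mathbf{u}^{(1)} + \Delta t\,\mathcal{L}(\mathbf{u}^{(1)})\right).
\end{aligned}
```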

Keywords: cell-centered Lagrangian scheme, compressible Euler equations, RKDG method

Procedia PDF Downloads 544
24818 Data Mining Approach for Commercial Data Classification and Migration in Hybrid Storage Systems

Authors: Mais Haj Qasem, Maen M. Al Assaf, Ali Rodan

Abstract:

Parallel hybrid storage systems consist of a hierarchy of different storage devices that vary in data-reading speed. As we ascend the hierarchy, data reading becomes faster. Thus, migrating the application's important data, that which will be accessed in the near future, to the uppermost level reduces the application's I/O waiting time and hence its execution elapsed time. In this research, we implement a trace-driven, two-level parallel hybrid storage system prototype consisting of HDDs and SSDs. The prototype uses data mining techniques to classify the application's data in order to determine its near-future data accesses, in parallel with serving its on-demand requests. The important data (i.e., the data that the application will access in the near future) are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that our data migration approach, integrated with data mining techniques, reduces the application execution elapsed time by at least 22% across a variety of traces.
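The keywords name support vector machines among the classifiers; one way such a classifier could drive tier placement, sketched with hypothetical trace features, is shown below (the paper's exact pipeline, including its recurrent-network variant, is not reproduced):

```python
# Hot/cold block classification sketch driving migration to the SSD tier.
# Features, training labels and threshold logic are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

# Hypothetical training rows: [accesses in last window, seconds since last access]
X = np.array([[50, 5], [40, 12], [2, 900], [1, 3600], [30, 8], [0, 7200]])
y = np.array([1, 1, 0, 0, 1, 0])          # 1 = hot (migrate to SSD), 0 = cold

clf = SVC(kernel="rbf").fit(X, y)

def place(block_features):
    """Return the tier a block should live on, given its access features."""
    return "SSD" if clf.predict([block_features])[0] == 1 else "HDD"

print(place([45, 10]))   # likely "SSD": frequently and recently accessed
```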

Keywords: hybrid storage system, data mining, recurrent neural network, support vector machine

Procedia PDF Downloads 304
24817 Speech Enhancement Using Kalman Filter in Communication

Authors: Eng. Alaa K. Satti Salih

Abstract:

In applications such as telecommunications, hands-free communication and recording, which need at least one microphone, the signal is usually corrupted by noise and echo. An important application is speech enhancement, which removes the noise and echo picked up by the microphone alongside the desired speech. Accordingly, the microphone signal has to be cleaned using digital signal processing (DSP) tools before it is played out, transmitted, or stored. Engineers have tried different approaches to improving speech by recovering the desired speech signal from noisy observations, especially in mobile communication. This paper therefore reconstructs a speech signal observed in additive background noise, using the Kalman filter technique to estimate the parameters of an autoregressive (AR) process in a state-space model, with the output speech signal obtained in MATLAB. Accurate Kalman filter estimation enhances the speech and reduces the noise; the results comparing actual and estimated values, which produce the reconstructed signals, are then discussed.
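The paper's MATLAB code is not included here; a compact Python sketch of the same idea, Kalman filtering of a noisy speech frame under a known AR(p) model, is given below. In practice the AR coefficients and noise variances are re-estimated frame by frame (e.g. via Levinson-Durbin), which is omitted for brevity:

```python
# Kalman filtering of speech in additive white noise under an AR(p) model.
# State = the last p speech samples; F is the AR companion matrix. The AR
# coefficients `a` and variances q, r are assumed known here for brevity.
import numpy as np

def kalman_ar_denoise(y, a, q, r):
    p = len(a)
    F = np.vstack([a, np.eye(p - 1, p)])       # companion form of the AR model
    H = np.zeros((1, p)); H[0, 0] = 1.0        # we observe newest sample + noise
    x = np.zeros((p, 1)); P = np.eye(p)
    out = np.empty_like(y)
    for k, yk in enumerate(y):
        x = F @ x                              # predict
        P = F @ P @ F.T
        P[0, 0] += q                           # process noise drives s_k only
        S = H @ P @ H.T + r                    # innovation variance
        K = P @ H.T / S                        # Kalman gain
        x = x + K * (yk - (H @ x).item())      # update with the new sample
        P = (np.eye(p) - K @ H) @ P
        out[k] = x[0, 0]                       # filtered speech estimate
    return out

# Usage: clean = kalman_ar_denoise(noisy, a=np.array([1.3, -0.6]), q=1e-3, r=1e-2)
```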

Keywords: autoregressive process, Kalman filter, Matlab, noise speech

Procedia PDF Downloads 340
24816 Discussion on Big Data and One of Its Early Training Application

Authors: Fulya Gokalp Yavuz, Mark Daniel Ward

Abstract:

This study focuses on a contemporary and inevitable topic of data science and an exemplary application for early career building: Big Data and the Living Learning Community (LLC). Academia and industry share a common sense of the importance of Big Data; however, both are at risk of missing out on training in this interdisciplinary area. Some traditional teaching doctrines are far from effective for data science. Practitioners need intuition and real-life examples of how to apply new methods to data of terabyte scale. We outline the scope of data science training and exemplify its early-stage application with the LLC, a National Science Foundation (NSF) funded project under the supervision of Prof. Ward since 2014. Essentially, we aim to give professors, researchers and practitioners some intuition for combining data science tools in comprehensive real-life examples, guided by mentees' feedback. By discussing mentoring methods and the computational challenges of Big Data, we intend to underline its potential with further concrete examples.

Keywords: Big Data, computation, mentoring, training

Procedia PDF Downloads 354
24815 An Exploratory Study on 'Sub-Region Life Circle' in Chinese Big Cities Based on Human High-Probability Daily Activity: Characteristic and Formation Mechanism as a Case of Wuhan

Authors: Zhuoran Shan, Li Wan, Xianchun Zhang

Abstract:

With the increasing trend toward regionalization and polycentricity in contemporary Chinese big cities, the "sub-region life circle" is proving an effective method for the rational organization of urban functions and spatial structure. Using questionnaires, network big data, route inversion on internet maps, GIS spatial analysis and logistic regression, this article studies the characteristics and formation mechanism of the "sub-region life circle" based on high-probability human daily activity in Chinese big cities. First, it shows that the "sub-region life circle" has become a new general spatial sphere of residents' high-probability daily activity and mobility in China. Unlike earlier analyses of the whole metropolitan area or the micro community, the "sub-region life circle" has its own characteristics in geographical extent, functional elements, spatial morphology and land distribution. Second, based on analysis with a binary logistic regression model, the research shows that seven factors, including the degree of land-use mix and bus station density, most influence the formation of the "sub-region life circle", and it derives the critical value of each factor's index. Finally, to establish a smarter "sub-region life circle", the paper indicates that strategies including jobs-housing fit, service cohesion and spatial reconstruction are the keys to optimizing its spatial organization. This study expands our understanding of the inner sub-regional spatial structure of cities based on human daily activity and contributes to the theory of the "life circle" at the urban meso-scale.
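The regression setup is only named, not specified; a minimal sketch of such a binary logistic model, with two of the reported factors as hypothetical column names, is:

```python
# Binary logistic regression sketch: does a location fall inside a resident's
# "sub-region life circle"? Column names are hypothetical stand-ins for two
# of the seven reported factors.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("activity_points.csv")   # hypothetical survey/big-data table

model = smf.logit("in_life_circle ~ landuse_mix + bus_station_density",
                  data=df).fit()
print(model.summary())                     # coefficients and p-values per factor
```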

Keywords: sub-region life circle, characteristic, formation mechanism, human activity, spatial structure

Procedia PDF Downloads 295
24814 Towards a Secure Storage in Cloud Computing

Authors: Mohamed Elkholy, Ahmed Elfatatry

Abstract:

Cloud computing has emerged as a flexible computing paradigm that has reshaped the information technology map. However, it has brought a number of security challenges as a result of the physical distribution of computational resources and the limited control users have over physical storage. This situation raises many security challenges for data integrity and confidentiality as well as authentication and access control. This work proposes a security mechanism for data integrity that allows a data owner to be aware of any modification made to his data. The data integrity mechanism is integrated with an extended Kerberos authentication that ensures authorized access control. The proposed mechanism protects data confidentiality even if the data are stored on untrusted storage. The mechanism has been evaluated against different types of attacks and proved its efficiency in protecting cloud data storage from malicious attacks.
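The integrity mechanism itself is not detailed in the abstract; a minimal illustration of the general idea, a keyed tag computed by the owner and re-verified after retrieval (standard HMAC, not the authors' scheme, which additionally couples integrity to extended Kerberos authentication), is:

```python
# Owner-side integrity tagging sketch using HMAC-SHA256. This illustrates the
# generic "detect any modification" idea only; it is not the paper's mechanism.
import hmac, hashlib, os

owner_key = os.urandom(32)           # kept by the data owner, never uploaded

def tag(data: bytes) -> bytes:
    return hmac.new(owner_key, data, hashlib.sha256).digest()

def verify(data: bytes, stored_tag: bytes) -> bool:
    return hmac.compare_digest(tag(data), stored_tag)

blob = b"ledger-2024.csv contents"   # illustrative payload
t = tag(blob)                        # uploaded alongside the blob
assert verify(blob, t)               # retrieval check passes
assert not verify(blob + b"x", t)    # any modification is detected
```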

Keywords: access control, data integrity, data confidentiality, Kerberos authentication, cloud security

Procedia PDF Downloads 330
24813 Enhancing Financial Security: Real-Time Anomaly Detection in Financial Transactions Using Machine Learning

Authors: Ali Kazemi

Abstract:

The digital evolution of financial services, while offering unprecedented convenience and accessibility, has also escalated vulnerability to fraudulent activities. In this study, we introduce a distinct approach to real-time anomaly detection in financial transactions, aiming to fortify the defenses of banking and financial institutions against such threats. Utilizing unsupervised machine learning algorithms, specifically autoencoders and isolation forests, our research focuses on identifying irregular patterns indicative of fraud within transactional data, thus enabling immediate action to prevent financial loss. The data used in this study included the monetary value of each transaction, a crucial feature since fraudulent transactions may follow different amount distributions than legitimate ones; timestamps indicating when transactions occurred, whose temporal patterns can reveal anomalies (e.g., unusual activity in the middle of the night); the sector or category of the merchant, such as retail, groceries, or online services, since specific categories may be more prone to fraud; and the type of payment used (e.g., credit, debit, online payment systems), as different payment methods carry different levels of fraud risk. This dataset, anonymized to ensure privacy, reflects a wide array of transactions typical of a global banking institution, ranging from small-scale retail purchases to large wire transfers, embodying the diverse nature of potentially fraudulent activities. By engineering features that capture the essence of transactions, including normalized amounts and encoded categorical variables, we tailor our data to enhance model sensitivity to anomalies. The autoencoder model leverages its reconstruction error to flag transactions that deviate significantly from the learned normal pattern, while the isolation forest identifies anomalies by their susceptibility to isolation from the dataset's majority. Our experimental results, validated through techniques such as k-fold cross-validation, are evaluated using precision, recall, and the F1 score alongside the area under the receiver operating characteristic (ROC) curve. Our models achieved an F1 score of 0.85 and a ROC AUC of 0.93, indicating high accuracy in detecting fraudulent transactions without excessive false positives. This study contributes to the academic discourse on financial fraud detection and provides a practical framework for banking institutions seeking to implement real-time anomaly detection systems. By demonstrating the effectiveness of unsupervised learning techniques in a real-world context, our research offers a pathway to significantly reducing the incidence of financial fraud, thereby enhancing the security and trustworthiness of digital financial services.
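Neither model's configuration is given in the abstract; a minimal isolation-forest sketch over engineered features of the kind listed (column names and the contamination rate are illustrative assumptions) would be:

```python
# Isolation-forest anomaly scoring sketch over engineered transaction features.
# Feature names and the contamination rate are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import IsolationForest

df = pd.read_csv("transactions.csv")       # hypothetical anonymized export
X = pd.get_dummies(df[["amount_norm", "hour", "merchant_cat", "pay_type"]],
                   columns=["merchant_cat", "pay_type"])

iforest = IsolationForest(n_estimators=200, contamination=0.01, random_state=0)
iforest.fit(X)

df["anomaly_score"] = -iforest.score_samples(X)   # higher = more anomalous
flagged = df.sort_values("anomaly_score", ascending=False).head(20)
print(flagged[["amount_norm", "hour", "anomaly_score"]])
```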

Keywords: anomaly detection, financial fraud, machine learning, autoencoders, isolation forest, transactional data analysis

Procedia PDF Downloads 54
24812 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data

Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah

Abstract:

National statistical institutes hold large volumes of data, generally in formats that constrain how the information they contain can be published. Each household or business data collection project includes its own dissemination platform. These dissemination methods do not promote rapid access to information and, above all, do not offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data in order to publish them in a format intended for the Semantic Web. Our objective is to publish all these data on a single platform and to offer the option of linking them with other external data sources. The approach is applied to data from major national surveys, such as those on employment, poverty and child labor, and the general population census of Senegal.
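The abstract does not show the target vocabulary; statistical observations are commonly published on the Semantic Web with the RDF Data Cube vocabulary, so a minimal rdflib sketch along those lines (the example namespace and figure are hypothetical) is:

```python
# Publishing one statistical observation as Linked Open Data with rdflib.
# The RDF Data Cube vocabulary (qb:) is a common choice for statistical
# tables; the example namespace and the figure itself are hypothetical.
from rdflib import Graph, Namespace, Literal, RDF
from rdflib.namespace import XSD

QB = Namespace("http://purl.org/linked-data/cube#")
EX = Namespace("http://stats.example.sn/")

g = Graph()
g.bind("qb", QB)
obs = EX["obs/employment-2023-dakar"]
g.add((obs, RDF.type, QB.Observation))
g.add((obs, EX.region, EX["region/dakar"]))
g.add((obs, EX.employmentRate, Literal("42.5", datatype=XSD.decimal)))
print(g.serialize(format="turtle"))   # linkable triples, queryable via SPARQL
```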

Keywords: Semantic Web, linked open data, database, statistic

Procedia PDF Downloads 173
24811 The Role of Data Protection Officer in Managing Individual Data: Issues and Challenges

Authors: Nazura Abdul Manap, Siti Nur Farah Atiqah Salleh

Abstract:

For decades, the misuse of personal data has been a critical issue. Malaysia addressed it by implementing the Malaysian Personal Data Protection Act 2010 (PDPA 2010) to secure personal data. After more than a decade, this legislation is set to be revised by the PDPA 2023 Amendment Bill to align it with the world's key personal data protection regulations, such as the European Union's General Data Protection Regulation (GDPR). Among the suggested adjustments is the data user's appointment of a Data Protection Officer (DPO) to ensure the commercial entity's compliance with the PDPA 2010 criteria. The change is expected to be enacted in parliament fairly soon; nevertheless, based on the experience of the Personal Data Protection Department (PDPD) in implementing the Act, it is projected that there will be a slew of additional concerns associated with the DPO mandate. Consequently, the goal of this article is to highlight the issues that DPOs will encounter and how the Personal Data Protection Department should respond. The results were produced using a qualitative technique based on an examination of the current literature. This research reveals probable obstacles faced by DPOs, and thus there should be a definite, clear guideline in place to aid DPOs in executing their tasks. It is argued that appointing a DPO is a wise measure for ensuring that legal data security requirements are met.

Keywords: guideline, law, data protection officer, personal data

Procedia PDF Downloads 75
24810 Data Collection Based on the Questionnaire Survey In-Hospital Emergencies

Authors: Nouha Mhimdi, Wahiba Ben Abdessalem Karaa, Henda Ben Ghezala

Abstract:

The methods identified for data collection are diverse: electronic media, focus group interviews and short-answer questionnaires [1]. Poor-quality data resulting, for example, from poorly designed questionnaires, the absence of good translators or interpreters, or the incorrect recording of data allows conclusions to be drawn that are not supported by the data, or focuses attention only on the average effect of a program or policy. There are several solutions to avoid or minimize the most frequent errors, including obtaining expert advice on the design or adaptation of data collection instruments, or using technologies allowing better anonymity in the responses [2]. In this context, we opted to collect good-quality data through a sizeable questionnaire-based survey on hospital emergencies, in order to improve emergency services and alleviate the problems encountered. In this paper, we present our study and detail the steps followed to collect relevant, consistent and practical data.

Keywords: data collection, survey, questionnaire, database, data analysis, hospital emergencies

Procedia PDF Downloads 103
24809 Federated Learning in Healthcare

Authors: Ananya Gangavarapu

Abstract:

Convolutional neural network (CNN) based models are providing diagnostic capabilities on par with medical specialists in many specialty areas. However, collecting medical data for training purposes is very challenging because of increased regulation around data collection and privacy concerns around personal health data. Gathering the data becomes even more difficult if the capture devices are edge-based mobile devices (like smartphones) with feeble wireless connectivity in rural or remote areas. In this paper, I highlight the federated learning approach as a way to mitigate data privacy and security issues.
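Federated learning's core loop, in which the server averages locally trained weights instead of collecting raw records, can be sketched in a few lines (a generic FedAvg illustration, not tied to any specific medical dataset or framework):

```python
# Federated averaging (FedAvg) sketch: each site trains on its private data
# and ships only weights; the server averages them. Generic illustration.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    w = weights.copy()
    for _ in range(epochs):                     # plain logistic-regression SGD
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def fed_avg(global_w, client_data, n_rounds=10):
    for _ in range(n_rounds):
        updates = [local_update(global_w, X, y) for X, y in client_data]
        global_w = np.mean(updates, axis=0)     # raw data never leaves clients
    return global_w

# Usage: clients = [(X_hospital_a, y_a), (X_clinic_b, y_b)]
#        w = fed_avg(np.zeros(n_features), clients)
```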

Keywords: deep learning in healthcare, data privacy, federated learning, training in distributed environment

Procedia PDF Downloads 138
24808 The Early Stages of the Standardisation of Finnish Building Sector

Authors: Anu Soikkeli

Abstract:

Early 20th-century functionalism aimed at generalising living and rationalising construction, thus laying the foundation for the standardisation of construction components and products. From the 1930s onwards, all measurement and quality instructions for building products, different types of building components, descriptions of working methods complying with advisable building practice, planning, measurement and calculation guidelines, terminology, etc. were called standards. Standardisation was regarded as a necessary prerequisite for the mass production of housing. This article examines the early stages of standardisation in Finland in the 1940s and 1950s, as reflected in the working history of an individual architect, Erkki Koiso-Kanttila (1914-2006). In 1950 Koiso-Kanttila was appointed Head of Design of the Finnish Association of Architects' Building Standards Committee, a position he held until 1958. His main responsibilities were the development of the RT Building Information File and the compilation of its files.

Keywords: architecture, post WWII period, reconstruction, standardisation

Procedia PDF Downloads 412
24807 The Utilization of Big Data in Knowledge Management Creation

Authors: Daniel Brian Thompson, Subarmaniam Kannan

Abstract:

The amount of knowledge in the world, and within the repositories of organizations, is already immense and constantly increasing. To work within these constraints, Big Data implementations and algorithms are utilized to obtain new or enhanced knowledge for decision-making. The transition from data to knowledge provides transformational changes with tangible benefits for those implementing these practices. Today, organizations derive knowledge from observations and intuitions, translating this information or data into best practices for knowledge acquisition, generation and sharing. Through the widespread usage of Big Data, the main intention is to provide information that has been cleaned and analyzed to nurture tangible insights that an organization can apply to its knowledge-creation practices based on facts and figures. The translation of data into knowledge generates value for an organization, allowing it to make decisive decisions and adopt best practices. Without a strong foundation of knowledge and Big Data, businesses cannot grow or improve within a competitive environment.

Keywords: big data, knowledge management, data driven, knowledge creation

Procedia PDF Downloads 112
24806 Survey on Data Security Issues Through Cloud Computing Amongst Sme’s in Nairobi County, Kenya

Authors: Masese Chuma Benard, Martin Onsiro Ronald

Abstract:

Businesses have recently been using cloud computing more frequently because they wish to take advantage of its benefits. However, employing cloud computing also introduces new security concerns, particularly with regard to data security, the potential risks and weaknesses that could be exploited by attackers, and the tactics and strategies that could be used to lessen these risks. This study examines data security issues in cloud computing among SMEs in Nairobi County, Kenya. The study used a sample size of 48 and a mixed-methods research approach. The findings show that the data owner has no control over the cloud merchant's data management procedures and no way to ensure that data is handled legally, implying a loss of control over data stored in the cloud. Data and information stored in the cloud may also face a range of availability issues due to internet outages, which can represent a significant risk to data kept in shared clouds. Integrity, availability, and secrecy concerns are all identified.

Keywords: data security, cloud computing, information, information security, small and medium-sized firms (SMEs)

Procedia PDF Downloads 81
24805 Cloud Design for Storing Large Amount of Data

Authors: M. Strémy, P. Závacký, P. Cuninka, M. Juhás

Abstract:

The main goal of this paper is to introduce our design of a private cloud for storing large amounts of data, especially pictures, and to provide a good technological backend for data analysis based on parallel processing and business intelligence. We have tested hypervisors, cloud management tools, storage for all data, and Hadoop for analysis of unstructured data. Providing high availability, virtual network management, logical separation of projects, and rapid deployment of physical servers to our environment was also needed.

Keywords: cloud, glusterfs, hadoop, juju, kvm, maas, openstack, virtualization

Procedia PDF Downloads 350
24804 Estimation of Missing Values in Aggregate Level Spatial Data

Authors: Amitha Puranik, V. S. Binu, Seena Biju

Abstract:

Missing data is a common problem in spatial analysis, especially at the aggregate level. Missingness can occur in the covariates, in the response variable, or in both at a given location. Many missing data techniques are available to estimate missing values, but not all of these methods can be applied to spatial data, since such data are autocorrelated. Hence there is a need for a method that estimates the missing values of both the response variable and the covariates in spatial data while taking account of spatial autocorrelation. The present study aims to develop a model to estimate missing data points at the aggregate level in spatial data by accounting for (a) the spatial autocorrelation of the response variable, (b) the spatial autocorrelation of the covariates, and (c) the correlation between the covariates and the response variable. Estimating the missing values of spatial data requires a model that explicitly accounts for spatial autocorrelation. The proposed model not only accounts for spatial autocorrelation but also utilizes the correlations that exist between covariates, within covariates, and between the response variable and covariates. Precise estimation of the missing data points in spatial data will increase the precision of the estimated effects of independent variables on the response variable in spatial regression analysis.
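The spatial autocorrelation that motivates the method is conventionally quantified with Moran's I; a small self-contained computation (the 4-region adjacency matrix and values are hypothetical) is:

```python
# Moran's I sketch: quantify the spatial autocorrelation an imputation model
# must respect. The 4-region adjacency weights and values are hypothetical.
import numpy as np

def morans_i(x, W):
    n = len(x)
    z = x - x.mean()
    return n * (z @ W @ z) / (W.sum() * (z @ z))

x = np.array([3.0, 3.5, 8.0, 8.4])           # aggregate-level values per region
W = np.array([[0, 1, 0, 0],                   # entry (i, j) = 1 if regions adjacent
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(morans_i(x, W))                         # ~0.38 > 0: positive clustering
```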

Keywords: spatial regression, missing data estimation, spatial autocorrelation, simulation analysis

Procedia PDF Downloads 375
24803 Association Rules Mining and NOSQL Oriented Document in Big Data

Authors: Sarra Senhadji, Imene Benzeguimi, Zohra Yagoub

Abstract:

Big Data refers to recent technologies for manipulating voluminous and unstructured data sets drawn from multiple sources; NoSQL databases, in turn, emerged to handle the problem of unstructured data. Association rule mining is one of the popular data mining techniques for extracting hidden relationships from transactional databases, and algorithms for finding association dependencies fit well onto MapReduce. The goal of our work is to reduce the time needed to generate frequent itemsets by using MapReduce and a document-oriented NoSQL database. A comparative study is given to evaluate the performance of our algorithm against the classical Apriori algorithm.
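The core MapReduce step of such an approach, emitting (itemset, 1) pairs from each transaction and summing them in the reduce phase, can be sketched as follows (a pure-Python stand-in for a real Hadoop/MongoDB deployment, with toy transactions):

```python
# MapReduce-style counting of frequent 2-itemsets, the kernel of a distributed
# Apriori pass. Pure-Python stand-in; transactions and support are illustrative.
from itertools import combinations
from collections import Counter

transactions = [{"bread", "milk"}, {"bread", "beer"}, {"bread", "milk", "beer"}]
MIN_SUPPORT = 2

# Map: each transaction emits (sorted pair, 1) for every candidate 2-itemset
mapped = [(pair, 1) for t in transactions
          for pair in combinations(sorted(t), 2)]

# Reduce: sum counts per itemset, then filter by minimum support
counts = Counter()
for pair, one in mapped:
    counts[pair] += one
frequent = {p: c for p, c in counts.items() if c >= MIN_SUPPORT}
print(frequent)    # {('bread', 'milk'): 2, ('beer', 'bread'): 2}
```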

Keywords: Apriori, Association rules mining, Big Data, Data Mining, Hadoop, MapReduce, MongoDB, NoSQL

Procedia PDF Downloads 156
24802 Immunization-Data-Quality in Public Health Facilities in the Pastoralist Communities: A Comparative Study Evidence from Afar and Somali Regional States, Ethiopia

Authors: Melaku Tsehay

Abstract:

The Consortium of Christian Relief and Development Associations (CCRDA) and the CORE Group Polio Partners (CGPP) Secretariat have been working with the Global Alliance for Vaccines and Immunization (GAVI) to improve immunization data quality in the Afar and Somali Regional States. The main aim of this study was to compare the quality of immunization data before and after these interventions in health facilities in the pastoralist communities in Ethiopia. To this end, a comparative cross-sectional study was conducted on 51 health facilities. The baseline data were collected in May 2019 and the endline data in August 2021. The WHO data quality self-assessment tool (DQS) was used to collect data. A significant improvement was seen in the accuracy of pentavalent vaccine (PT)1 data (p = 0.012) at the health posts (HP), and of PT3 (p = 0.010) and measles (p = 0.020) data at the health centers (HC). Besides, a highly significant improvement was observed in the accuracy of tetanus toxoid (TT)2 data at the HPs (p < 0.001). The level of over- or under-reporting was found to be < 8% at the HPs and < 10% at the HCs for PT3. Data completeness also increased from 72.09% to 88.89% at the HCs. Nearly 74% of the health facilities reported their immunization data on time, much better than at baseline (7.1%) (p < 0.001). These findings may provide some hints for policies and programs targeting improved immunization data quality in the pastoralist communities.

Keywords: data quality, immunization, verification factor, pastoralist region

Procedia PDF Downloads 109
24801 Reconstruction of Signal in Plastic Scintillator of PET Using Tikhonov Regularization

Authors: L. Raczynski, P. Moskal, P. Kowalski, W. Wislicki, T. Bednarski, P. Bialas, E. Czerwinski, A. Gajos, L. Kaplon, A. Kochanowski, G. Korcyl, J. Kowal, T. Kozik, W. Krzemien, E. Kubicz, Sz. Niedzwiecki, M. Palka, Z. Rudy, O. Rundel, P. Salabura, N.G. Sharma, M. Silarski, A. Slomski, J. Smyrski, A. Strzelecki, A. Wieczorek, M. Zielinski, N. Zon

Abstract:

The J-PET scanner, which allows for single-bed imaging of the whole human body, is currently under development at the Jagiellonian University. The J-PET detector improves the TOF resolution due to the use of fast plastic scintillators. Since registration of the waveform of signals with duration times of a few nanoseconds is not feasible, a novel front-end electronics allowing for sampling in the voltage domain at four thresholds was developed. To take full advantage of these fast signals, a novel scheme for recovering the waveform of the signal, based on ideas from Tikhonov regularization (TR) and Compressive Sensing methods, is presented. The prior distribution of the sparse representation is evaluated based on a linear transformation of the training set of signal waveforms, using the Principal Component Analysis (PCA) decomposition. Besides the advantage of including additional information from training signals, a further benefit of the TR approach is that the signal recovery problem has an optimal solution which can be determined explicitly. Moreover, from Bayes theory the properties of the regularized solution, especially its covariance matrix, may be easily derived. This step is crucial for introducing and proving the formula for calculating the signal recovery error. It has been proven that the average recovery error is approximately inversely proportional to the number of samples at voltage levels. The method is tested using signals registered by the single detection module of the J-PET detector, built from a 30 cm long BC-420 plastic scintillator strip. It is demonstrated that the experimental and theoretical functions describing the recovery errors in the J-PET scenario are largely consistent. The specificity and limitations of the signal recovery method in this application are discussed. It is shown that the PCA basis offers a high level of information compression and an accurate recovery with just eight samples, from four voltage levels, for each signal waveform. Moreover, it is demonstrated that using the recovered waveform of the signals, instead of the samples at four voltage levels alone, improves the spatial resolution of the hit position reconstruction. The experiment shows that the spatial resolution evaluated based on information from four voltage levels, without recovery of the signal waveform, is equal to 1.05 cm. After applying the information from the four voltage levels to the recovery of the signal waveform, the spatial resolution improves to 0.94 cm. Moreover, this result is only slightly worse than the one evaluated using the original raw signal, for which the spatial resolution is 0.93 cm. This is very important since limiting the number of threshold levels in the electronic devices to four leads to a significant reduction in the overall cost of the scanner. The developed recovery scheme is general and may be incorporated in any other investigation where prior knowledge about the signals of interest can be utilized.
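The explicit optimal solution is the property the abstract exploits; a generic numpy sketch of Tikhonov-regularized recovery in a PCA basis learned from training waveforms (dimensions, data and the regularization weight are illustrative, not the J-PET calibration) is:

```python
# Tikhonov-regularized waveform recovery sketch: y = A x + noise, where x are
# coefficients in a PCA basis learned from training waveforms and A restricts
# the basis to the few sampled time points. Dimensions are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 200, 8, 8          # waveform length, PCA components, measurements

train = rng.standard_normal((1000, n)).cumsum(axis=1)   # stand-in training set
mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
B = Vt[:k].T                  # n x k PCA basis (eight components, as in the paper)

S = rng.choice(n, size=m, replace=False)   # indices of the eight sampled times
A = B[S, :]                   # measurement operator restricted to those samples
y = (train[0] - mean)[S]      # the eight measured values of one waveform

lam = 1e-2                    # regularization weight (illustrative)
x = np.linalg.solve(A.T @ A + lam * np.eye(k), A.T @ y)   # closed-form TR solution
recovered = mean + B @ x      # full waveform estimate from eight samples
```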

Keywords: plastic scintillators, positron emission tomography, statistical analysis, tikhonov regularization

Procedia PDF Downloads 444
24800 Identifying Critical Success Factors for Data Quality Management through a Delphi Study

Authors: Maria Paula Santos, Ana Lucas

Abstract:

Organizations support their operations and decision-making with the data at their disposal, so the quality of these data is remarkably important, and data quality (DQ) is currently a relevant issue, with the literature unanimous in pointing out that poor DQ can result in large costs for organizations. The literature review identified and described 24 critical success factors (CSF) for data quality management (DQM), which were presented to a panel of experts who ordered them by degree of importance, using the Delphi method with the Q-sort technique based on an online questionnaire. The study shows that the five most important CSFs for DQM are: definition of appropriate policies and standards, control of inputs, definition of a strategic plan for DQ, an organizational culture focused on data quality, and obtaining top management commitment and support.

Keywords: critical success factors, data quality, data quality management, Delphi, Q-Sort

Procedia PDF Downloads 211
24799 Adaptive Dehazing Using Fusion Strategy

Authors: M. Ramesh Kanthan, S. Naga Nandini Sujatha

Abstract:

The goal of haze removal algorithms is to enhance and recover the details of a scene from a foggy image. For enhancement, the proposed method focuses on two main components: (i) image enhancement based on adaptive contrast histogram equalization, and (ii) an image edge-strengthening gradient model. Accurate haze removal algorithms are needed in many circumstances. The de-fog feature works through an algorithm that first determines the fog density of the scene, then analyses the obscured image before applying contrast and sharpness adjustments in real time. The fusion strategy is driven by the intrinsic properties of the original image and is highly dependent on the choice of the inputs and the weights; the output haze-free image is then reconstructed using this fusion methodology. In order to increase accuracy, an interpolation method is used in the output reconstruction. Promising retrieval performance is achieved, especially on particular examples.
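The contrast branch named here, adaptive contrast histogram equalization, corresponds to OpenCV's CLAHE operator; one hedged way to realize that branch (clip limit and tile size are illustrative, as the paper's settings are not given) is:

```python
# Contrast branch sketch: contrast-limited adaptive histogram equalization
# (CLAHE) applied to the luminance channel of a hazy frame. Parameters are
# illustrative; the paper's exact enhancement settings are not specified.
import cv2

img = cv2.imread("hazy.png")                       # hypothetical input frame
lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)         # equalize luminance only
l, a, b = cv2.split(lab)

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
l_eq = clahe.apply(l)

enhanced = cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)
cv2.imwrite("dehazed_contrast_branch.png", enhanced)
```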

Keywords: single image, fusion, dehazing, multi-scale fusion, per-pixel, weight map

Procedia PDF Downloads 461
24798 Design and Development of a Bi-Leaflet Pulmonary Valve

Authors: Munirah Ismail, Joon Hock Yeo

Abstract:

Paediatric patients who require ventricular outflow tract reconstruction usually need valve construction to prevent valvular regurgitation. They face problems such as a lack of suitable, affordable conduits and the need to undergo several operations in their lifetime due to the short lifespan of existing valves. Their natural growth and development are also of concern, even if they receive suitable conduits. Current prostheses, including homografts, bioprosthetic valves, mechanical valves and bovine jugular veins, either lack long-term durability or lack the ability to adapt to the growth of such patients. We have developed a new design of bi-leaflet valve; this new technique accommodates growth in the patient's annular size while maintaining valvular patency. A mock circulatory system was set up to assess the hemodynamic performance of the bi-leaflet pulmonary valve. The percentage regurgitation was found to be acceptable, validating this novel concept.

Keywords: bi-leaflet pulmonary valve, pulmonary heart valve, tetralogy of fallot, mock circulatory system

Procedia PDF Downloads 158