Search results for: data block
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26031

25311 Survival Data with Incomplete Missing Categorical Covariates

Authors: Madaki Umar Yusuf, Mohd Rizam B. Abubakar

Abstract:

Censored survival data with incomplete covariate information are a common occurrence in many studies in which the outcome is survival time. When the missing covariates are categorical, a useful technique for obtaining parameter estimates is the EM algorithm by the method of weights. The survival outcome is modelled within the class of generalized linear models, and the method requires estimating the parameters of the distribution of the covariates. In this paper, we apply the approach to clinical trial data with five covariates, four of which have missing values, in data that are subject to censoring.
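
As an illustration of the EM by the method of weights, here is a minimal sketch for a single binary covariate that is missing at random, substituting a logistic outcome model for the paper's Weibull survival model; the simulated data, scikit-learn fitting, and iteration count are assumptions for the sketch, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Minimal sketch: EM by the method of weights for one binary covariate
# missing at random (MAR). All data and settings are simulated.
rng = np.random.default_rng(5)
n = 1000
x = rng.binomial(1, 0.4, n).astype(float)
y = rng.binomial(1, 1 / (1 + np.exp(-(-0.5 + 1.2 * x))))
x_obs = np.where(rng.random(n) < 0.3, np.nan, x)    # 30% of x missing

beta, pi = np.zeros(2), 0.5
for _ in range(30):                                  # EM iterations
    X_aug, y_aug, w_aug = [], [], []
    for xi, yi in zip(x_obs, y):
        # E-step: expand each incomplete case into one pseudo-case per
        # category, weighted by its posterior probability.
        cats = [xi] if not np.isnan(xi) else [0.0, 1.0]
        w = []
        for c in cats:
            p = 1 / (1 + np.exp(-(beta[0] + beta[1] * c)))
            w.append((p if yi == 1 else 1 - p) * (pi if c == 1 else 1 - pi))
        w = np.array(w) / sum(w)
        X_aug += [[c] for c in cats]; y_aug += [yi] * len(cats); w_aug += list(w)
    # M-step: weighted fits for the outcome model and covariate distribution.
    fit = LogisticRegression(C=1e6).fit(np.array(X_aug), y_aug, sample_weight=w_aug)
    beta = np.array([fit.intercept_[0], fit.coef_[0, 0]])
    pi = np.average([r[0] for r in X_aug], weights=w_aug)

print(beta, pi)   # should approach (-0.5, 1.2) and 0.4
```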

Keywords: EM algorithm, incomplete categorical covariates, ignorable missing data, missing at random (MAR), Weibull Distribution

Procedia PDF Downloads 406
25310 Dynamic Foot Pressure Measurement System Using Optical Sensors

Authors: Tanapon Keatsamarn, Chuchart Pintavirooj

Abstract:

Foot pressure measurement provides necessary information for diagnosing diseases, designing foot insoles, preventing disorders, and other applications. In this paper, a dynamic foot pressure measurement system with high resolution and accuracy is presented. The system consists of hardware and software. The hardware uses a transparent acrylic plate on a steel base. Glossy white paper is placed on top of the transparent acrylic plate, and the system is covered with black acrylic to block external light. Light from an LED strip enters around the edges of the transparent acrylic plate. The optical sensors, digital cameras, are placed underneath the acrylic plate facing upwards. They are connected to the software, which processes and records the foot pressure video as an AVI file. The software is developed in Visual Studio 2017 using the OpenCV library.
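
A minimal sketch of the capture-and-record stage is shown below, using OpenCV's Python bindings rather than the Visual Studio 2017 setup described above; the camera index, frame size, frame rate, intensity threshold, and frame count are hypothetical placeholders.

```python
import cv2

# Capture frames from the camera under the acrylic plate and record an AVI.
cap = cv2.VideoCapture(0)
fourcc = cv2.VideoWriter_fourcc(*"MJPG")
out = cv2.VideoWriter("foot_pressure.avi", fourcc, 30.0, (640, 480))

for _ in range(300):                      # record ~10 s at 30 fps
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (640, 480))
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Brighter regions of the paper correspond to higher contact pressure
    # under the LED edge illumination, so keep only bright pixels.
    _, pressure = cv2.threshold(gray, 60, 255, cv2.THRESH_TOZERO)
    out.write(cv2.cvtColor(pressure, cv2.COLOR_GRAY2BGR))

cap.release()
out.release()
```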

Keywords: foot, foot pressure, image processing, optical sensors

Procedia PDF Downloads 248
25309 Effect of Organic Manure on Production of Potato (Solanum tuberosum L.)

Authors: R. Behrooz, D. Jahanfar, D. Reza

Abstract:

Organic farming is a fundamental principle of sustainable agriculture. Preventing excessive contamination of water and soil with pesticides and chemical fertilizers is important in order to produce healthy food. For this purpose, two potato cultivars (Sante and Marfona) and seven fertilizer levels (no fertilizer, chemical fertilizer, granulated chicken manure, common manure, compost, vermicompost, and tea compost) were evaluated in a factorial experiment based on a randomized complete block design (RCBD) with three replications. According to the results, the effect of the different manures was significant on the number of tubers per plant, tuber weight per plant, and tuber yield. The highest values of these traits were obtained with chicken manure, which was significantly superior to the other treatments. However, there was no significant difference between the two varieties. The use of chicken manure produced the highest potato yield, even in comparison with chemical fertilizer.
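
For readers reproducing this kind of analysis, a minimal sketch of the factorial RCBD ANOVA follows, assuming pandas and statsmodels; the file name and column names are hypothetical placeholders.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Factorial RCBD: two cultivars x seven fertilizer levels, three blocks.
df = pd.read_csv("potato_trial.csv")  # columns: cultivar, fertilizer, block, yield_t_ha

# Blocks enter as an additive term; cultivar x fertilizer is the factorial part.
model = smf.ols("yield_t_ha ~ C(cultivar) * C(fertilizer) + C(block)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```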

Keywords: organic farming, organic manure, potato, tuber yield

Procedia PDF Downloads 157
25308 A Study of Blockchain Oracles

Authors: Abdeljalil Beniiche

Abstract:

A limitation of smart contracts is that they cannot access external data that might be required to control the execution of business logic. Oracles can be used to provide external data to smart contracts. An oracle is an interface that delivers data from sources outside the blockchain for a smart contract to consume. Oracles can deliver different types of data depending on the industry and requirements. In this paper, we study and describe the widely used blockchain oracles. We then elaborate on their potential role, technical architecture, and design patterns. Finally, we discuss the human oracle and its key role in solving the truth problem by reaching a consensus about a given inquiry or task.
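
A minimal sketch of the basic oracle pattern follows: an off-chain service signs a data report so that consuming logic (emulated here in Python) can verify provenance before acting on it. The HMAC shared key and feed name are hypothetical; production oracles typically use asymmetric signatures and on-chain verification.

```python
import hashlib, hmac, json, time

ORACLE_KEY = b"demo-oracle-key"   # hypothetical shared secret

def oracle_report(feed: str, value: float) -> dict:
    """Off-chain oracle service: fetch external data and sign it."""
    payload = json.dumps({"feed": feed, "value": value, "ts": int(time.time())})
    sig = hmac.new(ORACLE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def contract_consume(report: dict) -> float:
    """Consumer side: verify the report's origin before using the value."""
    expected = hmac.new(ORACLE_KEY, report["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, report["sig"]):
        raise ValueError("untrusted oracle report")
    return json.loads(report["payload"])["value"]

print(contract_consume(oracle_report("ETH/USD", 3120.55)))
```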

Keywords: blockchain, oracles, oracles design, human oracles

Procedia PDF Downloads 138
25307 Valorisation of Polyethylene and Plastic Bottle Wastes as Pavement Blocks

Authors: Babagana Mohammed, Fidelis Patrick Afangide

Abstract:

This research investigated the possibility of using waste low-density polyethylene and waste plastic bottles for the production of interlocking pavement blocks. In many parts of the world, the interlocking pavement block is widely used as a modern solution for outdoor flooring applications, and the blocks come in different shapes, sizes, and colours suiting the imagination of landscape architects. The interlocking blocks were produced using a suitable conventional mould of 220 x 135 x 50 mm. The material constituents of the produced blocks were waste low-density polyethylene and waste plastic bottles mixed in varying respective percentage-weight proportions of 100%+0%, 75%+25%, 50%+50%, and 25%+75%. The blocks were then tested for unconfined compressive strength and water absorption properties. The test results compared well with those of conventional concrete interlocking blocks, and the research demonstrates the possibility of value recovery from waste streams that are currently dumped in open spaces, thereby harming the environment.

Keywords: pavement blocks, polyethylene, plastic bottle, wastes, valorization

Procedia PDF Downloads 405
25306 Multi Data Management Systems in a Cluster Randomized Trial in Poor Resource Setting: The Pneumococcal Vaccine Schedules Trial

Authors: Abdoullah Nyassi, Golam Sarwar, Sarra Baldeh, Mamadou S. K. Jallow, Bai Lamin Dondeh, Isaac Osei, Grant A. Mackenzie

Abstract:

A randomized controlled trial is the "gold standard" for evaluating the efficacy of an intervention. Large-scale, cluster-randomized trials, however, are expensive and difficult to conduct. To guarantee the validity and generalizability of findings, high-quality, dependable, and accurate data management systems are necessary. Robust data management systems are crucial for optimizing and validating the quality, accuracy, and dependability of trial data. Literature on the difficulties of data collection in clinical trials in low-resource settings is scarce, which may raise concerns. Effective data management systems and implementation goals should be part of trial procedures. Publicizing the creative clinical data management techniques used in clinical trials should boost public confidence in a study's conclusions and encourage replication. This report details the development and deployment of multiple data management systems and methodologies in the ongoing pneumococcal vaccine schedules trial in rural Gambia. We implemented six different data management, synchronization, and reporting systems using Microsoft Access, REDCap, SQL, Visual Basic, Ruby, and ASP.NET. Additionally, data synchronization tools were developed to integrate data from these systems into the central server for the reporting systems. This report focuses on the clinician, laboratory, and field data validation systems and methodologies. Our process development efforts across all domains were driven by the complexity of research data collected in real time, online reporting, data synchronization, and procedures for cleaning and verifying data. We effectively used multiple data management systems, demonstrating the value of creative approaches in enhancing the consistency, accuracy, and reporting of trial data in a poor-resource setting.
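
As an illustration of one such synchronization tool, below is a minimal sketch that pulls records from a REDCap-style API into a central SQLite store; the URL, API token, table, and field names are hypothetical placeholders, and the trial's actual tools targeted a central SQL server.

```python
import json, sqlite3, urllib.parse, urllib.request

# Hypothetical REDCap-style export request (POST with an API token).
REDCAP_URL = "https://redcap.example.org/api/"
params = urllib.parse.urlencode({
    "token": "HYPOTHETICAL_TOKEN", "content": "record",
    "format": "json", "type": "flat",
}).encode()

with urllib.request.urlopen(REDCAP_URL, data=params) as resp:
    records = json.loads(resp.read())

# Upsert into the central store so repeated sync runs stay idempotent.
db = sqlite3.connect("central.db")
db.execute("CREATE TABLE IF NOT EXISTS visits (record_id TEXT PRIMARY KEY, payload TEXT)")
for rec in records:
    db.execute("INSERT OR REPLACE INTO visits VALUES (?, ?)",
               (rec["record_id"], json.dumps(rec)))
db.commit()
```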

Keywords: data management, data collection, data cleaning, cluster-randomized trial

Procedia PDF Downloads 28
25305 Expression Level of Dehydration-Responsive Element Binding/DREB Gene of Some Local Corn Cultivars from Kisar Island-Maluku Indonesia Using Quantitative Real-Time PCR

Authors: Hermalina Sinay, Estri L. Arumingtyas

Abstract:

The research objective was to determine the expression level of the dehydration-responsive element binding (DREB) gene in local corn cultivars from Kisar Island, Maluku. The study design was a randomized block design with a single factor consisting of six local corn cultivars obtained from farmers on Kisar Island and one reference variety, released by the government as a drought-tolerant variety and obtained from the Cereal Crops Research Institute (ICERI), Maros, South Sulawesi. Leaf samples were taken from the second leaf after the flag leaf at 65 days after planting. Isolation of total RNA from the leaf samples was carried out according to the protocol of the R & A-BlueTM Total RNA Extraction Kit, and the RNA was used as a template for cDNA synthesis. cDNA synthesis from total RNA was carried out according to the protocol of the One-Step Reverse Transcriptase PCR Premix Kit. Real-time PCR was performed on the cDNA from reverse transcription following the procedures of the Real MODTM Green Real-Time PCR Master Mix Kit. Data obtained from the real-time PCR were analyzed using the relative quantification method based on the critical point/cycle threshold (CP/CT). The analysis of DREB gene expression showed that the highest expression level was obtained in the Deep Yellow local corn cultivar and the lowest in the Rubby Brown Cob cultivar. It can be concluded that the expression level of the DREB gene in the Deep Yellow local corn cultivar was higher than in the other local corn cultivars and in the Srikandi reference variety.
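
A minimal sketch of relative quantification by the 2^-ΔΔCt method follows, assuming a housekeeping reference gene was run alongside DREB; the Ct values in the example are hypothetical illustrations, not the study's measurements.

```python
# Relative quantification by the 2^-ddCt method.
def relative_expression(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    """Fold change of the target gene relative to a calibrator sample."""
    delta_ct_sample = ct_target - ct_ref              # normalise to reference gene
    delta_ct_calibrator = ct_target_cal - ct_ref_cal
    delta_delta_ct = delta_ct_sample - delta_ct_calibrator
    return 2 ** (-delta_delta_ct)

# e.g. a local cultivar vs. the Srikandi reference variety as calibrator
print(relative_expression(22.1, 18.0, 25.3, 18.2))    # ~8-fold higher
```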

Keywords: expression, level, DREB gene, local corn cultivars, Kisar Island, Maluku

Procedia PDF Downloads 299
25304 Finding Bicluster on Gene Expression Data of Lymphoma Based on Singular Value Decomposition and Hierarchical Clustering

Authors: Alhadi Bustaman, Soeganda Formalidin, Titin Siswantining

Abstract:

DNA microarray technology is used to analyze thousands of gene expression profiles simultaneously, a very important task for drug development and testing, function annotation, and cancer diagnosis. Various clustering methods have been used for analyzing gene expression data. However, when analyzing very large and heterogeneous collections of gene expression data, conventional clustering methods often cannot produce a satisfactory solution. Biclustering algorithms have been used as an alternative approach to identifying structures in gene expression data. In this paper, we introduce a transform technique based on singular value decomposition to obtain a normalized matrix of gene expression data, followed by a Mixed-Clustering algorithm and the Lift algorithm, inspired by the node-deletion and node-addition phases proposed by Cheng and Church, based on Agglomerative Hierarchical Clustering (AHC). An experimental study on standard datasets demonstrated the effectiveness of the algorithm on gene expression data.
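
The following is a minimal sketch of the SVD-then-cluster idea, pairing an SVD projection with agglomerative hierarchical clustering of genes and conditions; the random matrix, component count, and cluster counts are placeholders, and the Mixed-Clustering and Lift refinement phases are omitted.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Stand-in for a genes x conditions expression matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))                  # 200 genes, 40 conditions

# Normalise via SVD: keep the first k singular components.
U, s, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
k = 3
gene_coords = U[:, :k] * s[:k]                  # gene projection
cond_coords = Vt[:k, :].T * s[:k]               # condition projection

# AHC on each projection; a bicluster is a (gene cluster, condition
# cluster) pair, to be refined by node deletion/addition.
gene_labels = fcluster(linkage(gene_coords, method="ward"), t=4, criterion="maxclust")
cond_labels = fcluster(linkage(cond_coords, method="ward"), t=3, criterion="maxclust")
print(np.bincount(gene_labels), np.bincount(cond_labels))
```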

Keywords: agglomerative hierarchical clustering (AHC), biclustering, gene expression data, lymphoma, singular value decomposition (SVD)

Procedia PDF Downloads 279
25303 Securing Health Monitoring in Internet of Things with Blockchain-Based Proxy Re-Encryption

Authors: Jerlin George, R. Chitra

Abstract:

Internet of Things (IoT) devices with sensors that monitor temperature, heart rate, and other vital signs and link to the internet have completely transformed the way we manage health. By providing real-time health data, these sensors improve diagnostics and treatment outcomes. Security and privacy matter when IoT comes into play in healthcare, and cyberattacks on centralized database systems are also a problem. To address these challenges, the study uses blockchain technology coupled with proxy re-encryption to secure health data. The ThingSpeak IoT cloud analyzes the collected data and turns it into blockchain transactions, which are safely kept on the DriveHQ cloud. Blockchain ensures transparency and data integrity, and proxy re-encryption makes secure data sharing among authorized users possible. The result is a health monitoring system that preserves the accuracy and confidentiality of data while reducing the safety risks of IoT-driven healthcare applications.
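
To illustrate the delegation idea behind proxy re-encryption, here is a toy sketch assuming the Python `cryptography` package. It is not a real PRE scheme: a genuine scheme uses asymmetric re-encryption keys so the proxy never sees plaintext, whereas this symmetric stand-in decrypts internally and is for illustration only.

```python
from cryptography.fernet import Fernet

patient_key = Fernet.generate_key()
doctor_key = Fernet.generate_key()

ciphertext = Fernet(patient_key).encrypt(b"HR=72bpm; SpO2=98%")

def proxy_reencrypt(ct: bytes, re_key=(patient_key, doctor_key)) -> bytes:
    # The proxy re-wraps the ciphertext for the delegatee. In a true PRE
    # scheme this step never exposes plaintext; this symmetric stand-in
    # necessarily decrypts internally, which is why it is only a toy.
    src, dst = re_key
    return Fernet(dst).encrypt(Fernet(src).decrypt(ct))

# The doctor can now decrypt with their own key.
print(Fernet(doctor_key).decrypt(proxy_reencrypt(ciphertext)))
```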

Keywords: internet of things, healthcare, sensors, electronic health records, blockchain, proxy re-encryption, data privacy, data security

Procedia PDF Downloads 19
25302 Karst Detection Using Multichannel Analysis of Surface Waves and Electric Resistivity Tomography

Authors: Nathainail Bashir, Neil Anderson

Abstract:

The objective of this study was to investigate the current state of practice with regard to karst detection methods and to recommend the best method and pattern of arrays to acquire the desired results. Proper site investigation in karst-prone regions is extremely valuable in determining the location of possible voids. Two geophysical techniques were employed: multichannel analysis of surface waves (MASW) and electric resistivity tomography (ERT). The MASW data were acquired at each test location using different array lengths and different array orientations (to increase the probability of acquiring interpretable data in karst terrain). The ERT data were acquired using a dipole-dipole array consisting of 168 electrodes. The MASW data were interpreted (re: estimated depth to the physical top of rock) and used to constrain and verify the interpretation of the ERT data. Poorer-quality MASW data were acquired in areas where the ERT data indicated significant local variation in the depth to the top of rock.
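
For reference, a minimal sketch of the dipole-dipole array geometry used in ERT follows, converting a voltage/current reading into apparent resistivity via the geometric factor K = πan(n+1)(n+2); the electrode spacing and readings are hypothetical.

```python
import math

def apparent_resistivity(a: float, n: int, delta_v: float, current: float) -> float:
    """Dipole-dipole array: geometric factor K = pi * a * n * (n+1) * (n+2)."""
    k = math.pi * a * n * (n + 1) * (n + 2)
    return k * delta_v / current

# a = 5 m electrode spacing, n = 4 dipole separations, 12 mV at 200 mA
print(apparent_resistivity(5.0, 4, 0.012, 0.2))  # ~113 ohm-metres
```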

Keywords: dipole-dipole, ERT, Karst terrains, MASW

Procedia PDF Downloads 315
25301 Assessing the Ways of Improving the Power Saving Modes in the Ore-Grinding Technological Process

Authors: Baghdasaryan Marinka

Abstract:

The distribution of electric power consumption in the technological process of ore grinding is monitored. As a result, the impacts of the mill filling rate, the productivity of the ore supply, the volumetric density of the grinding balls, the specific density of the ground ore, and the relative speed of mill rotation on the specific consumption of electric power have been studied. The power and technological factors affecting the reactive power generated by the synchronous motors operating within the technological scheme are studied. A block diagram for evaluating the power consumption modes of the technological process is presented, which includes the analysis of the technological scheme, the determination of the location and volumetric density of the ore-grinding mill, the evaluation of the technological and power factors affecting the energy-saving process, as well as the assessment of the electric power standards.

Keywords: electric power standard, factor, ore grinding, power consumption, reactive power, technological

Procedia PDF Downloads 555
25300 Data Science in Military Decision-Making: A Semi-Systematic Literature Review

Authors: H. W. Meerveld, R. H. A. Lindelauf

Abstract:

In contemporary warfare, data science is crucial for the military in achieving information superiority. Yet, to the authors' knowledge, no extensive literature survey on data science in military decision-making has been conducted so far. In this study, 156 peer-reviewed articles were analysed through an integrative, semi-systematic literature review to gain an overview of the topic. The study examined to what extent the literature focusses on the opportunities or the risks of data science in military decision-making, differentiated per level of war (i.e. the strategic, operational, and tactical levels). A relatively large focus on the risks of data science was observed in the social science literature, implying that political and military policymakers are disproportionately influenced by a pessimistic view of the application of data science in the military domain. The perceived risks of data science are, however, hardly addressed in the formal science literature. This means that the concerns about the military application of data science are not addressed to the audience that can actually develop and enhance data science models and algorithms. Cross-disciplinary research on both the opportunities and the risks of military data science can address the observed research gaps. Considering the levels of war, relatively little attention to the operational level compared to the other two levels was observed, suggesting a research gap with respect to military operational data science. Opportunities for military data science mostly arise at the tactical level. In contrast, studies examining strategic issues mostly emphasise the risks of military data science. Consequently, domain-specific requirements for military strategic data science applications are hardly expressed. The lack of such applications may ultimately lead to suboptimal strategic decisions in today's warfare.

Keywords: data science, decision-making, information superiority, literature review, military

Procedia PDF Downloads 169
25299 In Exile but Not at Peace: An Ethnography among Rwandan Army Deserters in South Africa

Authors: Florence Ncube

Abstract:

This paper examines the military and post-military experiences of soldiers who deserted from the Rwanda Defence Force (RDF) and tried to make a living in South Africa. Because they are deserters, they try to hide their military identity, yet it is simultaneously, and somewhat coercively, ascribed to them by the Rwandan state and can put them in potential danger. The paper attends to the constructions, experiences, practices, and subjective understandings of the deserters' being in exile to examine how, under circumstances of perceived threat, these men navigate real or perceived state-sponsored surveillance and threat in non-military settings in South Africa, where they have become potential political and disciplinary targets. To make sense of the deserters' experiences in these circumstances, the paper stitches together a number of useful theoretical concepts, including Bourdieu's (1992) theory of practice and Vigh's (2009; 2018) concept of social navigation, because no single approach can coherently analyze the specificity of this study. Conventional post-military literature privileges an understanding of army desertion as a malignancy and as somewhat problematic. Little is known about the military and post-military experiences of deserters who believe that army desertion is in fact a building block towards achieving subjective peace, even in the context of exile. The paper argues that the presence of Rwandan state agents in South Africa strips the exile context of its capacity to provide the deserters with peace, safety, and security. This paper recenters army desertion in analyses of militarism, soldiering, and transition in African contexts and complicates commonsense understandings of army desertion that assume it is entirely problematic. The paper draws on an ethnography conducted among 30 junior-rank Rwandan army deserters exiled in Johannesburg and Cape Town, in which the researcher employed life histories, in-depth interviews, and deep hangouts to collect data.

Keywords: army deserter, military, identity, exile, peacebuilding, South Africa

Procedia PDF Downloads 71
25298 A New Design Methodology for Partially Reconfigurable Systems-on-Chip

Authors: Roukaya Dalbouchi, Abdelkrin Zitouni

Abstract:

In this paper, we propose a novel design methodology for Dynamic Partial Reconfiguration (DPR) systems. This type of system has the property of being modifiable after its design and during its execution. The suggested design methodology is generic in terms of granularity, number of modules, and reconfigurable regions, and it is suitable for any type of modern application. It is based on the interconnection of several design stages. The recommended methodology represents a guide for the design of DPR architectures that achieve a reconfiguration/performance compromise. To validate the proposed methodology, we use a video watermarking application. The comparison results show that the proposed methodology supports all stages of DPR architecture design and is characterized by a high abstraction level. It provides a dynamically, partially reconfigurable architecture and guarantees hardware efficiency, reconfiguration flexibility, and superior performance in terms of frequency and power consumption.

Keywords: dynamically reconfigurable system, block matching algorithm, partial reconfiguration, motion vectors, video watermarking

Procedia PDF Downloads 95
25297 Legal Regulation of Personal Information Data Transmission Risk Assessment: A Case Study of the EU’s DPIA

Authors: Cai Qianyi

Abstract:

In the midst of the global digital revolution, the flow of data poses security threats that call China's existing legislative framework for protecting personal information into question. As a preliminary procedure for risk analysis and prevention, the risk assessment of personal data transmission lacks detailed supporting guidelines. Existing provisions reveal unclear responsibilities for network operators and weakened rights for data subjects. Furthermore, the regulatory system's weak operability and a lack of industry self-regulation heighten data transmission hazards. This paper compares the regulatory pathways for data transmission risks in China and Europe from the perspectives of legal framework and content. It draws on the "Data Protection Impact Assessment Guidelines" to empower multiple stakeholders, including data processors, controllers, and subjects, while also defining obligations. In conclusion, this paper intends to address China's digital security shortcomings by developing a more mature regulatory framework and industry self-regulation mechanisms, resulting in a win-win situation for personal data protection and the development of the digital economy.

Keywords: personal information data transmission, risk assessment, DPIA, internet service provider

Procedia PDF Downloads 62
25296 Wavelets Contribution on Textual Data Analysis

Authors: Habiba Ben Abdessalem

Abstract:

The emergence of giant sets of textual data has encouraged researchers to invest in this field. The purpose of textual data analysis methods is to facilitate access to such data by providing various graphical visualizations. Applying these methods requires a corpus pretreatment step, whose standards are set according to the objective of the problem studied. This step determines the list of forms contained in the contingency table, keeping only those that carry information. This step may, however, lead to noisy contingency tables, hence the use of a wavelet denoising function. The validity of the proposed approach is tested on a text database covering economic and political events in Tunisia over a well-defined period.
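
A minimal sketch of wavelet denoising applied to a contingency table follows, assuming the PyWavelets package; the simulated counts, wavelet, decomposition level, and threshold rule are assumptions, not the paper's exact settings.

```python
import numpy as np
import pywt

# Stand-in for a forms x documents contingency table.
rng = np.random.default_rng(1)
table = rng.poisson(5, size=(64, 64)).astype(float)

coeffs = pywt.wavedec2(table, "db2", level=2)
# Universal threshold estimated from the finest-scale detail coefficients.
sigma = np.median(np.abs(coeffs[-1][0])) / 0.6745
thresh = sigma * np.sqrt(2 * np.log(table.size))
denoised_coeffs = [coeffs[0]] + [
    tuple(pywt.threshold(d, thresh, mode="soft") for d in level)
    for level in coeffs[1:]
]
denoised = pywt.waverec2(denoised_coeffs, "db2")
print(denoised.shape)
```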

Keywords: textual data, wavelet, denoising, contingency table

Procedia PDF Downloads 278
25295 Influence of Surface Fault Rupture on Dynamic Behavior of Cantilever Retaining Wall: A Numerical Study

Authors: Partha Sarathi Nayek, Abhiparna Dasgupta, Maheshreddy Gade

Abstract:

Earth retaining structures play a vital role in stabilizing unstable road cuts and slopes in mountainous regions. Retaining structures located in seismically active regions like the Himalayas may experience moderate to severe earthquakes. An earthquake produces two kinds of ground motion: permanent quasi-static displacement (fault rupture) on the fault rupture plane, and transient vibration that travels long distances. There has been extensive research to understand the dynamic behavior of retaining structures subjected to transient ground motions. However, understanding of the effect of the fault rupture phenomenon on retaining structures is limited. The presence of shallow crustal active faults and natural slopes in the Himalayan region further highlights the need to study the response of retaining structures subjected to fault rupture. In this paper, an attempt has been made to understand the dynamic response of a cantilever retaining wall subjected to surface fault rupture. For this purpose, a 2D finite element model consisting of a retaining wall, backfill, and foundation has been developed using Abaqus 6.14 software. The backfill and foundation materials are modeled with the Mohr-Coulomb failure criterion, and the wall is modeled as linear elastic. In the present study, the interaction between backfill and wall is modeled as 'surface-surface contact.' The entire simulation process is divided into three steps: the initial step, the gravity load step, and the fault rupture step. In the initial step, the wall-soil interaction property is defined and fixed boundary conditions are applied to all boundary elements. In the next step, gravity load is applied, and the boundary elements are allowed to move in the vertical direction to incorporate the settlement of soil due to the gravity load. In the final step, surface fault rupture is applied to the wall-backfill system. For this purpose, the foundation is divided into two blocks: the hanging-wall block and the footwall block. A finite fault rupture displacement is applied to the hanging-wall part, while the bottom boundary of the footwall is kept fixed. Initially, a numerical analysis is performed considering a reverse fault mechanism with a dip angle of 45°. The simulated results are presented as contour maps of the permanent displacements of the wall-backfill system. These maps highlight that surface fault rupture can induce permanent displacement in both horizontal and vertical directions, which can significantly influence the dynamic behavior of the wall-backfill system. Further, the influence of the fault mechanism, dip angle, and surface fault rupture position is also investigated in this work.

Keywords: surface fault rupture, retaining wall, dynamic response, finite element analysis

Procedia PDF Downloads 106
25294 Customer Churn Analysis in Telecommunication Industry Using Data Mining Approach

Authors: Burcu Oralhan, Zeki Oralhan, Nilsun Sariyer, Kumru Uyar

Abstract:

Data mining has become more and more important and has found a wide range of applications in recent years. Data mining is the process of finding hidden and unknown patterns in big data. One of the applied fields of data mining is Customer Relationship Management, since understanding the relationships between products and customers is crucial for every business. Customer Relationship Management is an approach focused on developing customer relationships, retaining customers, and increasing customer satisfaction. In this study, we apply data mining methods to the customer relationship management side of the telecommunication industry. This study aims to determine the profiles of customers who are likely to leave the system, develop marketing strategies, and design customized campaigns for these customers. Data are clustered, and classification techniques are then applied to determine the churners. As a result of this study, we obtain knowledge from the international telecommunication industry and contribute to the understanding and development of this subject in Customer Relationship Management.
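
A minimal sketch of the classification step follows, training a random forest to flag likely churners; the file name, feature columns, and model choice are hypothetical placeholders rather than the study's actual pipeline.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical table of usage features with a binary `churned` label.
df = pd.read_csv("telecom_customers.csv")
X = df[["monthly_minutes", "contract_months", "complaints", "monthly_bill"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```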

Keywords: customer churn analysis, customer relationship management, data mining, telecommunication industry

Procedia PDF Downloads 318
25293 On Pooling Different Levels of Data in Estimating Parameters of Continuous Meta-Analysis

Authors: N. R. N. Idris, S. Baharom

Abstract:

A meta-analysis may be performed using aggregate data (AD) or individual patient data (IPD). In practice, studies may be available at both the IPD and AD levels. In this situation, both the IPD and AD should be utilised in order to maximize the available information. The statistical advantages of combining studies from different levels have not been fully explored. This study aims to quantify the statistical benefits of including available IPD when conducting a conventional summary-level meta-analysis. Simulated meta-analyses were used to assess the influence of the levels of data on overall meta-analysis estimates based on IPD only, AD only, and the combination of IPD and AD (mixed data, MD), under different study scenarios. The percentage relative bias (PRB), root mean square error (RMSE), and coverage probability were used to assess the efficiency of the overall estimates. The results demonstrate that available IPD should always be included in a conventional meta-analysis using summary-level data, as it significantly increases the accuracy of the estimates. On the other hand, if more than 80% of the available data are at the IPD level, including the AD does not provide significant differences in the accuracy of the estimates. Additionally, combining the IPD and AD moderates the bias of the treatment effect estimates, as the IPD tends to overestimate the treatment effects, while the AD tends to produce underestimated effect estimates. These results may provide some guidance in deciding whether a significant benefit is gained by pooling the two levels of data when conducting a meta-analysis.
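
For intuition, a minimal sketch of fixed-effect inverse-variance pooling follows, treating each study's contribution the same whether its estimate came from AD or from an IPD analysis reduced to a summary; the effect sizes and standard errors are hypothetical.

```python
import numpy as np

effects = np.array([0.42, 0.35, 0.51, 0.28])   # per-study treatment effects
ses = np.array([0.10, 0.08, 0.15, 0.12])       # their standard errors

weights = 1.0 / ses**2                          # inverse-variance weights
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
print(f"pooled effect = {pooled:.3f} +/- {1.96 * pooled_se:.3f}")
```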

Keywords: aggregate data, combined-level data, individual patient data, meta-analysis

Procedia PDF Downloads 375
25292 Analyzing On-Line Process Data for Industrial Production Quality Control

Authors: Hyun-Woo Cho

Abstract:

The monitoring of industrial production quality has to be implemented to give early warning of unusual operating conditions. Furthermore, identification of the assignable causes is necessary for quality control purposes. For such tasks, many multivariate statistical techniques have been applied and shown to be quite effective tools. This work presents a process data-based monitoring scheme for production processes. For more reliable results, additional steps of noise filtering and preprocessing are considered, which may enhance performance by eliminating unwanted variation in the data. The performance evaluation is executed using data sets from test processes. The proposed method is shown to provide reliable quality control results and is thus more effective for quality monitoring in the example. For practical implementation of the method, an on-line data system must be available to gather historical and on-line data. Large amounts of data are now collected on-line in most processes, so implementation of the current scheme is feasible and does not place additional burdens on users.
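
A minimal sketch of a multivariate monitoring scheme of this kind follows, using PCA with Hotelling's T² and SPE (Q) statistics after simple autoscaling as preprocessing; the simulated data and retained component count are placeholders, and the paper's specific filtering steps are not reproduced.

```python
import numpy as np

# Simulated historical normal-operation data: samples x process variables.
rng = np.random.default_rng(2)
X = rng.normal(size=(500, 8))

mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd                                 # autoscale (preprocessing)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
k = 3                                             # retained components
P = Vt[:k].T                                      # loadings
lam = s[:k]**2 / (len(Z) - 1)                     # component variances

def monitor(x_new):
    """Return (T^2, SPE) for one new observation."""
    z = (x_new - mu) / sd
    t = z @ P                                     # scores
    t2 = np.sum(t**2 / lam)                       # Hotelling's T^2
    residual = z - t @ P.T
    spe = residual @ residual                     # squared prediction error
    return t2, spe

print(monitor(rng.normal(size=8)))
```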

Keywords: detection, filtering, monitoring, process data

Procedia PDF Downloads 559
25291 A Review of Travel Data Collection Methods

Authors: Muhammad Awais Shafique, Eiji Hato

Abstract:

Household trip data are of crucial importance for managing present transportation infrastructure as well as for planning and designing future facilities. They also provide the basis for new policies implemented under Transportation Demand Management. The methods used for household trip data collection have changed over time, starting with conventional face-to-face or paper-and-pencil interviews and reaching the recent approach of employing smartphones. This study summarizes the step-wise evolution of travel data collection methods. It provides a comprehensive review of the topic for readers interested in the changing trends in the data collection field.

Keywords: computer, smartphone, telephone, travel survey

Procedia PDF Downloads 314
25290 A Novel Search Pattern for Motion Estimation in High Efficiency Video Coding

Authors: Phong Nguyen, Phap Nguyen, Thang Nguyen

Abstract:

The High Efficiency Video Coding (HEVC) or H.265 standard fulfills the demand for high-resolution video storage and transmission, since it achieves a high compression ratio. However, it requires a huge amount of calculation. Since the Motion Estimation (ME) block accounts for about 80% of the computational load of HEVC, there has been a lot of research on reducing the computation cost. In this paper, we propose a new algorithm to lower the number of search points in motion estimation. The number of computed points in the search pattern drops from 77 for the diamond pattern and 81 for the square pattern to only 31. Meanwhile, the Peak Signal to Noise Ratio (PSNR) and bit rate are almost equal to those of the conventional patterns. The motion estimation time of the new algorithm is reduced by 68.23% and 65.83% compared to the recommended diamond and square search patterns, respectively.
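
To illustrate how a reduced search pattern trades search points for speed, below is a minimal sketch of SAD-based block matching with a generic small-diamond search; it is not the 31-point pattern proposed in the paper, and the smooth synthetic frames and parameters are placeholders.

```python
import numpy as np

# Smooth synthetic reference frame and a shifted current frame.
yy, xx = np.mgrid[0:64, 0:64]
ref = (128 + 100 * np.exp(-((xx - 32)**2 + (yy - 32)**2) / 200)).astype(np.uint8)
cur = np.roll(ref, (2, 1), axis=(0, 1))          # shift by 2 rows, 1 column

def diamond_search(ref, cur, bx, by, bsize=16, max_range=7):
    """Return the motion vector (mvx, mvy) minimising SAD around (bx, by)."""
    block = cur[by:by + bsize, bx:bx + bsize].astype(int)
    mvx = mvy = 0
    best = np.abs(block - ref[by:by + bsize, bx:bx + bsize].astype(int)).sum()
    improved = True
    while improved:                               # walk until the centre wins
        improved = False
        for dx, dy in ((-1, 0), (1, 0), (0, -1), (0, 1)):   # small diamond
            x, y = mvx + dx, mvy + dy
            if max(abs(x), abs(y)) > max_range:
                continue
            cost = np.abs(block - ref[by + y:by + y + bsize,
                                      bx + x:bx + x + bsize].astype(int)).sum()
            if cost < best:
                best, mvx, mvy, improved = cost, x, y, True
    return mvx, mvy

print(diamond_search(ref, cur, 24, 24))           # (-1, -2): offset back to ref
```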

Keywords: motion estimation, wide diamond, search pattern, H.265, test zone search, HM software

Procedia PDF Downloads 613
25289 A Sociological Investigation on the Population and Public Spaces of Nguyen Cong Tru, a Soviet-Style Collective Housing Complex in Hanoi in Regards to Its New Community-Focused Architectural Design

Authors: Duy Nguyen Do, Bart Julien Dewancker

Abstract:

Many Soviet-style collective housing complexes (also known as KTT) have been built since the 1960s in Hanoi to support post-war population growth. These low-rise buildings have created well-knit, robust communities, so much so that in most complexes, all families in one housing block would know each other, occasionally interact, and provide support when needed. To understand how the communities of collective housing complexes have developed and been maintained, in order to adapt their advantages to modern housing designs, this study was carried out at the Nguyen Cong Tru KTT. This is one of the oldest KTT in Hanoi, completed in 1954. The complex also has a unique characteristic that is closely related to its community: a symbiotic relationship with Hom, a flea market that has been co-developing with the Nguyen Cong Tru KTT since its beginning. The research consists of three phases: the first phase is a sociological investigation of the Nguyen Cong Tru KTT's current residents and a site survey of the complex's economic and architectural characteristics. In the second phase, the collected data are analyzed to find out residents' opinions concerning their satisfaction with the current housing status, the floor plan organization, the community, the relationship between the KTT's dedicated public spaces and the flea market, and their usage. Simultaneously, the master plan and the gathered information on the current architectural characteristics of the complex are inspected. In the third phase, the results of the analyses provide information on the issues, positive trends, and significant historical features of the complex's architecture in order to generate suitable proposals for the redesign of the Nguyen Cong Tru KTT, a design focused on vitalizing the communities of modern apartments.

Keywords: collective house community, collective house public space, community-focused, redesigning Nguyen Cong Tru KTT, sociological investigation

Procedia PDF Downloads 364
25288 A Business-to-Business Collaboration System That Promotes Data Utilization While Encrypting Information on the Blockchain

Authors: Hiroaki Nasu, Ryota Miyamoto, Yuta Kodera, Yasuyuki Nogami

Abstract:

To promote initiatives such as Industry 4.0 and Society 5.0, it is important to connect and share data so that every member can trust them. Blockchain (BC) technology is currently attracting attention as the most advanced tool and has been used in the financial field, among others. However, data collaboration using BC has not progressed sufficiently among companies in the manufacturing supply chain that handle sensitive data such as product quality and manufacturing conditions. There are two main reasons why data utilization is not sufficiently advanced in the industrial supply chain. The first reason is that manufacturing information is top secret and a source of profit for companies. It is difficult to disclose data even between companies with transactions in the supply chain. In blockchain mechanisms such as Bitcoin that use PKI (Public Key Infrastructure), the plaintext must be shared between the companies in order to confirm the identity of the company that sent the data. The other reason is that the merits (scenarios) of data collaboration between companies have not been concretely specified for the industrial supply chain. To address these problems, this paper proposes a Business-to-Business (B2B) collaboration system using homomorphic encryption and BC techniques. Using the proposed system, each company in the supply chain can exchange confidential information as encrypted data and utilize the data for its own business. In addition, this paper considers a scenario focusing on quality data, which were difficult to share because they are top secret. In this scenario, we show an implementation scheme and the benefit of concrete data collaboration by proposing a comparison protocol that can capture changes in quality while hiding the numerical values of the quality data.
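
A minimal sketch of the homomorphic-comparison idea follows, assuming the `phe` (python-paillier) package; the quality scores are hypothetical, and the paper's full comparison protocol and blockchain integration are omitted.

```python
from phe import paillier  # python-paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Supplier-side encryption of monthly quality scores.
q_last_month = public_key.encrypt(93)
q_this_month = public_key.encrypt(88)

# Paillier is additively homomorphic: subtract on ciphertexts directly.
enc_change = q_this_month - q_last_month

# Only the key holder learns the (signed) change, not the absolute scores.
print(private_key.decrypt(enc_change))  # -5: quality dropped
```

Because the subtraction happens on ciphertexts, a partner or intermediary can compute the encrypted quality change without any decryption key, which is the property the comparison protocol above relies on.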

Keywords: business to business data collaboration, industrial supply chain, blockchain, homomorphic encryption

Procedia PDF Downloads 138
25287 Multivariate Assessment of Mathematics Test Scores of Students in Qatar

Authors: Ali Rashash Alzahrani, Elizabeth Stojanovski

Abstract:

Data on various aspects of education are collected regularly at the institutional and government levels. In Australia, for example, students at various levels of schooling undertake examinations in numeracy and literacy as part of NAPLAN testing, enabling longitudinal assessment of such data as well as comparisons between schools and states within Australia. Another source of educational data collected internationally is the PISA study, which collects data from several countries when students are approximately 15 years of age and enables comparisons of performance in science, mathematics, and English between countries, as well as ranking of countries based on performance in these standardised tests. Beyond student and school outcomes based on the tests taken as part of the PISA study, a wealth of other data is collected in the study, including parental demographics and data related to the teaching strategies used by educators. Overall, an abundance of educational data is available that has the potential to help improve educational attainment and the teaching of content in order to improve learning outcomes. A multivariate assessment of such data enables multiple variables to be considered simultaneously and will be used in the present study to help develop profiles of students based on performance in mathematics, using data obtained from the PISA study.

Keywords: cluster analysis, education, mathematics, profiles

Procedia PDF Downloads 127
25284 Dataset Quality Index: Development of a Composite Indicator Based on Standard Data Quality Indicators

Authors: Sakda Loetpiparwanich, Preecha Vichitthamaros

Abstract:

Nowadays, poor data quality is considered one of the major costs of a data project. A data project with data quality awareness devotes almost as much time to data quality processes, while a data project without data quality awareness suffers negative impacts on financial resources, efficiency, productivity, and credibility. One of the processes that takes a long time is defining the expectations and measurements of data quality, because expectations differ according to the purpose of each data project. This is especially true for big data projects, which may involve many datasets and stakeholders and take a long time to discuss and define quality expectations and measurements. Therefore, this study aimed at developing meaningful indicators that describe the overall data quality of each dataset for quick comparison and prioritization. The objectives of this study were to: (1) develop practical data quality indicators and measurements, (2) develop data quality dimensions based on statistical characteristics, and (3) develop a composite indicator that can describe the overall data quality of each dataset. The sample consisted of more than 500 datasets from public sources obtained by random sampling. After the datasets were collected, five steps were followed to develop the Dataset Quality Index (SDQI). First, we defined standard data quality expectations. Second, we identified indicators that can be measured directly on the data within datasets. Third, the indicators were aggregated into dimensions using factor analysis. Next, the indicators and dimensions were weighted by the effort required for data preparation and by usability. Finally, the dimensions were aggregated into the composite indicator. The results showed that: (1) the developed set contained ten useful indicators and measurements; (2) based on statistical characteristics, the ten indicators could be reduced to four dimensions; and (3) the resulting composite indicator, the SDQI, can describe the overall quality of each dataset and separates datasets into three levels: Good Quality, Acceptable Quality, and Poor Quality. In conclusion, the SDQI provides an overall description of data quality within datasets and a meaningful composition. The SDQI can be used to assess all data in a data project, to estimate effort, and to set priorities. The SDQI also works well with agile methods, using the SDQI for assessment in the first sprint; after passing the initial evaluation, more specific data quality indicators can be added in subsequent sprints.
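
A minimal sketch of the indicator-to-composite aggregation follows; the indicator names, groupings, weights, and level thresholds are hypothetical placeholders standing in for the factor-analysis and effort-based weighting steps described above.

```python
import numpy as np

# Indicator scores on [0, 1] for one dataset (hypothetical).
indicators = {
    "completeness": 0.95, "uniqueness": 0.88, "validity": 0.70,
    "consistency": 0.82, "timeliness": 0.60,
}
# Grouping of indicators into dimensions, e.g. from factor analysis.
dimensions = {
    "intrinsic": ["completeness", "uniqueness", "validity"],
    "contextual": ["consistency", "timeliness"],
}
dim_weights = {"intrinsic": 0.6, "contextual": 0.4}

dim_scores = {d: np.mean([indicators[i] for i in names])
              for d, names in dimensions.items()}
sdqi = sum(dim_weights[d] * s for d, s in dim_scores.items())

level = "Good" if sdqi >= 0.8 else "Acceptable" if sdqi >= 0.6 else "Poor"
print(f"SDQI = {sdqi:.2f} -> {level} Quality")
```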

Keywords: data quality, dataset quality, data quality management, composite indicator, factor analysis, principal component analysis

Procedia PDF Downloads 142
25285 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between the explanatory variables and the predicted variables. Past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when faced with large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured), and its variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to the algorithm are presented, and a version of the extended algorithm is then defined to make it applicable to huge quantities of data.
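
To illustrate the parallelization idea in miniature, the sketch below fits an independent CART tree (via scikit-learn) on each data partition and combines predictions by majority vote; this shows the distribution principle only, not the authors' exact extension of CART.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Synthetic data split across 4 hypothetical workers.
X, y = make_classification(n_samples=6000, n_features=10, random_state=0)
partitions = np.array_split(np.arange(len(X)), 4)

# Each worker fits a CART tree on its own partition, independently.
trees = [DecisionTreeClassifier(max_depth=8, random_state=0).fit(X[idx], y[idx])
         for idx in partitions]

def predict(x_batch):
    votes = np.stack([t.predict(x_batch) for t in trees])
    return (votes.mean(axis=0) > 0.5).astype(int)   # majority vote

print((predict(X) == y).mean())                     # training accuracy
```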

Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm

Procedia PDF Downloads 142
25284 Application of Blockchain on Manufacturing Process Control and Pricing Policy

Authors: Chieh Lee

Abstract:

Today, supply chain managers face extensive disruptions in raw material pricing, transportation blockages, and quality issues due to product complexity. While digitalization might help managers mitigate disruption risk and increase supply chain resilience by sharing information between sellers and buyers through the supply chain, entities are reluctant to build such a system. The main reason is that it is not clear what information should be shared and who has access to the stored information. In this research, we propose a smart contract built with blockchain technology. This contract helps both buyers and sellers identify the type of information shared, the access rights to the information, and how to trace the information. The contract helps managers control their orders through the supply chain and address any disruption as they see fit. Furthermore, with the same smart contract, the supplier can track the production process of an order and increase production efficiency by eliminating waste.

Keywords: blockchain, production process, smart contract, supply chain resilience

Procedia PDF Downloads 80
25283 Canopy Temperature Acquired from Daytime and Nighttime Aerial Data as an Indicator of Trees’ Health Status

Authors: Agata Zakrzewska, Dominik Kopeć, Adrian Ochtyra

Abstract:

The growing number of new cameras, sensors, and research methods allows for a broader application of thermal data in remote sensing vegetation studies. The aim of this research was to check whether thermal infrared data in the 3.6-4.9 μm spectral range, obtained during the day and at night, can be used to assess the health condition of selected species of deciduous trees in an urban environment. For this purpose, research was carried out in the city center of Warsaw (Poland) in 2020. During the airborne data acquisition, thermal data, laser scanning data, and orthophoto images were collected. Synchronously with the airborne data, ground reference data were obtained for 617 studied trees of five species (Acer platanoides, Acer pseudoplatanus, Aesculus hippocastanum, Tilia cordata, and Tilia × euchlora) in different states of health. The results were as follows: (i) healthy trees are cooler than trees in poor condition and dying trees in both the daytime and nighttime data; (ii) the mean difference in canopy temperature between healthy and dying trees was 1.06°C in the nighttime data and 3.28°C in the daytime data; (iii) condition classes differed significantly in both daytime and nighttime thermal data, but only in the daytime data did all condition classes differ statistically significantly from each other. In conclusion, aerial thermal data, especially data obtained during the day, which differentiate condition classes better than data obtained at night, can be considered an alternative to hyperspectral data for assessing the health condition of trees in an urban environment. A method based on the fusion of thermal infrared and laser scanning data could be a quick and efficient solution for identifying trees in poor health that should be visually checked in the field.

Keywords: middle wave infrared, thermal imagery, tree discoloration, urban trees

Procedia PDF Downloads 116
25282 Block Based Imperial Competitive Algorithm with Greedy Search for Traveling Salesman Problem

Authors: Meng-Hui Chen, Chiao-Wei Yu, Pei-Chann Chang

Abstract:

The imperialist competitive algorithm (ICA) is a multi-agent algorithm. Each agent is like a kingdom with its countries; the strongest country in each kingdom is called the imperialist, and the others are colonies. Colonies compete with the imperialist of their own kingdom by evolving; a colony moves through the search space to find better solutions with higher fitness and may become the new imperialist. The main idea of this paper is to use this peculiarity of ICA to explore the search space and solve combinatorial problems. In addition, we study the use of greedy search to increase the local search ability. To verify the proposed algorithm, experiments were run on traveling salesman problem (TSP) instances from the traveling salesman problem library (TSPLIB). The results show that the proposed algorithm achieves higher performance than the other known methods.
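
As a small illustration of the greedy component, the sketch below builds a nearest-neighbour tour and reports its length; the random cities are placeholders rather than a TSPLIB instance, and the ICA kingdom/colony dynamics are omitted.

```python
import numpy as np

rng = np.random.default_rng(4)
cities = rng.random((30, 2))
dist = np.linalg.norm(cities[:, None] - cities[None, :], axis=2)

def greedy_tour(start=0):
    """Nearest-neighbour construction: always visit the closest unvisited city."""
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda c: dist[last, c])  # greedy step
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

tour = greedy_tour()
length = sum(dist[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))
print(f"greedy tour length: {length:.3f}")
```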

Keywords: traveling salesman problem, artificial chromosomes, greedy search, imperial competitive algorithm

Procedia PDF Downloads 459