Search results for: data exchange
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26255

25325 The Effect of Measurement Distribution on System Identification and Detection of Behavior of Nonlinearities of Data

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

In this paper, we consider and apply parametric modeling to experimental data from dynamical systems. We investigate different distributions of the output measurements obtained from these systems. By processing the variance of the experimental data, we locate the region of nonlinearity, and identification of the output section is then carried out for different situations and data distributions. Finally, we explain how the spread of the measurements, characterised by the variance, affects identification, and we discuss the limitations of this approach.

Keywords: Gaussian process, nonlinearity distribution, particle filter, system identification

Procedia PDF Downloads 516
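
The variance-processing step described in the abstract of paper 25325 above can be illustrated with a minimal sketch: a sliding-window variance of the measured output is compared against a threshold to flag the region where the response departs from the roughly constant-variance linear regime. The window size, threshold factor, and test signal below are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def flag_nonlinear_region(y, window=50, factor=3.0):
    """Flag samples whose local output variance exceeds `factor` times the median
    windowed variance -- an illustrative screening rule, not the authors' criterion."""
    y = np.asarray(y, dtype=float)
    local_var = np.array([
        np.var(y[max(0, i - window // 2): i + window // 2 + 1])
        for i in range(len(y))
    ])
    threshold = factor * np.median(local_var)
    return local_var > threshold  # boolean mask of the suspected nonlinear region

# Example: a measured output that becomes nonlinear (and more variable) later on
t = np.linspace(0, 10, 1000)
y = np.sin(t) + np.where(t > 5, 0.5 * np.sin(5 * t) ** 2, 0.0)
mask = flag_nonlinear_region(y)
print(f"{mask.sum()} of {len(y)} samples flagged as the nonlinear region")
```
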
25324 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R

Authors: Jaya Mathew

Abstract:

Many organizations face the challenge of analyzing and building machine learning models on their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology for parallelization and remote computing, or R Services on premise or in the cloud, users can apply R at scale while leaving their data in place.

Keywords: predictive maintenance, machine learning, big data, cloud based, on premise solution, R

Procedia PDF Downloads 379
25323 Trusting the Big Data Analytics Process from the Perspective of Different Stakeholders

Authors: Sven Gehrke, Johannes Ruhland

Abstract:

Data is the oil of our time; without it, progress would come to a halt [1]. On the other hand, mistrust of data mining is increasing [2]. The paper at hand shows different aspects of the concept of trust and describes the information asymmetry between the typical stakeholders of a data mining project using the CRISP-DM phase model. Based on the identified influencing factors in relation to trust, problematic aspects of the current approach are verified through interviews with the stakeholders. The results of the interviews confirm the theoretically identified weak points of the phase model with regard to trust and show potential research areas.

Keywords: trust, data mining, CRISP DM, stakeholder management

Procedia PDF Downloads 94
25322 Wireless Transmission of Big Data Using Novel Secure Algorithm

Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha

Abstract:

This paper presents a novel algorithm for the secure, reliable, and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. A two-hop wireless network consists of source, relay, and destination nodes. Big data has to be transmitted from source to relay and from relay to destination with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The proposed algorithm, which ensures secure and energy-balanced transmission of big data, includes selecting the data-transmitting region, segmenting the selected region, determining a probability ratio for each node (capture, non-capture, and eavesdropper node) in every segment, and evaluating that probability using a binary evaluation. If the transmission is judged secure, the two-hop transmission of the big data proceeds; otherwise, the attackers are countered by the cooperative jamming scheme and the data is then transmitted over the two hops.

Keywords: big data, two-hop transmission, physical layer wireless security, cooperative jamming, energy balance

Procedia PDF Downloads 491
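
As a schematic illustration of the decision flow outlined in the abstract of paper 25322 above, the sketch below segments a transmission region, applies a binary security evaluation per segment, and falls back to cooperative jamming when a segment is judged insecure. The node roles, threshold, and evaluation rule are invented placeholders and do not reproduce the authors' probability model.

```python
import random

def segment_is_secure(node_roles, threshold=0.5):
    """Illustrative binary evaluation: a segment counts as secure when the fraction
    of eavesdropper nodes inside it stays below a threshold (placeholder rule)."""
    eavesdroppers = sum(1 for role in node_roles if role == "eavesdropper")
    return (eavesdroppers / max(len(node_roles), 1)) < threshold

def transmit_two_hop(segments):
    for i, node_roles in enumerate(segments):
        if segment_is_secure(node_roles):
            print(f"segment {i}: secure  -> plain two-hop relay of the data block")
        else:
            print(f"segment {i}: at risk -> cooperative jamming, then two-hop relay")

# Hypothetical transmitting region split into four segments of ten nodes each
random.seed(7)
roles = ["capture", "non-capture", "eavesdropper"]
segments = [[random.choice(roles) for _ in range(10)] for _ in range(4)]
transmit_two_hop(segments)
```
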
25321 Modeling of Physico-Chemical Characteristics of Concrete for Filling Trenches in Radioactive Waste Management

Authors: Ilija Plecas, Dalibor Arbutina

Abstract:

The leaching rate of 60Co from spent mixed-bead (anion and cation) exchange resins immobilised in a cement-bentonite matrix has been studied. The transport phenomena involved in the leaching of a radioactive material from a cement-bentonite matrix are investigated using three methods based on theoretical equations: the diffusion equation for a plane source, an equation for diffusion coupled to a first-order reaction, and an empirical method employing a polynomial equation. The results presented in this paper are from a 25-year mortar and concrete testing project that will influence the design choices for radioactive waste packaging for a future Serbian radioactive waste disposal center.

Keywords: cement, concrete, immobilization, leaching, permeability, radioactivity, waste

Procedia PDF Downloads 323
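
For reference, one standard form of the plane-source (semi-infinite medium) diffusion model commonly used in such leaching studies is shown below; the abstract does not give the authors' exact formulation, so this is the conventional expression rather than a quotation from the paper.

```latex
% Cumulative fraction of 60Co leached from the matrix after time t
% (semi-infinite plane-source diffusion model)
\[
  \frac{\sum_{n} a_n}{A_0} \;=\; 2\,\frac{S}{V}\,\sqrt{\frac{D_e\,t}{\pi}}
\]
% a_n : activity released during leaching interval n
% A_0 : initial activity immobilised in the specimen
% S/V : surface-to-volume ratio of the specimen
% D_e : effective diffusion coefficient of the nuclide in the matrix
```
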
25320 Corporate Social Responsibility, Earnings, and Tax Avoidance: Evidence from Indonesia

Authors: Cahyaningsih Cahyaningsih, Fu'ad Rakhman

Abstract:

This study empirically examines the association between corporate social responsibility (CSR) and tax avoidance, and investigates the effect of earnings on the relation between CSR and tax avoidance. The effective tax rate (ETR) and the cash effective tax rate (CETR) were used to measure tax avoidance, while the corporate social responsibility fund (CSRF) and corporate social responsibility disclosure (CSRD) were used as proxies for CSR. The tests were conducted on public firms listed on the Indonesia Stock Exchange during the period 2011-2014. Based on slack resource theory, this study finds that the relation between CSR and tax avoidance is moderated by earnings.

Keywords: corporate social responsibility disclosure, corporate social responsibility fund, earnings, tax avoidance

Procedia PDF Downloads 280
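
The two tax-avoidance proxies named in the abstract of paper 25320 above are conventionally defined as follows (the authors' exact operationalisation may differ); lower values of either ratio indicate greater tax avoidance.

```latex
\[
  \mathrm{ETR} \;=\; \frac{\text{total income tax expense}}{\text{pre-tax book income}},
  \qquad
  \mathrm{CETR} \;=\; \frac{\text{cash taxes paid}}{\text{pre-tax book income}}
\]
```
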
25319 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today’s age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices, which make use of different protocols, including TCP, UDP, and HTTP/s, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users but is mostly in an unreadable format that needs to be processed to provide information and business intelligence. This data is not always current; it is mostly historical, and it is not subject to the consistency and redundancy measures that most other data usually is. Most important to users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers use various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it, and push it back to the database server in one single step. Since the processing of the data usually takes some time, it keeps the database busy and locked for the whole period that the processing takes place, which decreases the overall performance of the database server and therefore of the system. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on the performance of the CPU, storage, and processing time.

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 244
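
A minimal sketch of the three-step pull-process-push technique described in the abstract of paper 25319 above is given below, using SQLite purely for illustration; the table and column names, and the decode step, are assumptions rather than the authors' implementation. The point of the split is that the database is only touched in steps 1 and 3, so it is not held busy while the in-memory processing (step 2) runs.

```python
import sqlite3

def pull_process_push(db_path, decode):
    """Three-step variant: (1) pull the raw rows and release the database quickly,
    (2) process entirely in memory (the 'array list' step), (3) push the results
    back in a single batched write. Table and column names are assumptions."""
    # Step 1: pull
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute("SELECT id, raw_payload FROM telemetry_raw").fetchall()

    # Step 2: process in memory -- the database is not locked while this runs
    processed = [(decode(payload), row_id) for row_id, payload in rows]

    # Step 3: push back as one batched transaction
    with sqlite3.connect(db_path) as conn:
        conn.executemany("UPDATE telemetry_raw SET decoded = ? WHERE id = ?", processed)
```

In the single-step variant, each row is read, decoded, and written back inside one loop on a single connection, which is what keeps the database busy and locked for the whole processing time.
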
25318 Exploring the Use of Universal Design for Learning to Support Deaf Learners in Lesotho Secondary Schools: English Teachers' Voice

Authors: Ntloyalefu Justinah, Fumane Khanare

Abstract:

English learning has been found to be one of the prevalent areas of difficulty for Deaf learners. Studies indicate that this challenge is an upsetting concern globally and is attributed to various reasons, such as the way English is taught at schools and teachers' lack of skills and knowledge, which impact negatively on learners' academic performance. Despite any difficulty in English learning, this language is nowadays considered the key tool to an educational and occupational career, especially in Lesotho. This paper therefore intends to contribute to the existing literature by providing the views of Lesotho English teachers on how effectively universal design for learning (UDL) can be implemented to enhance the academic performance of Deaf learners in the context of the English language classroom. The purpose of this study was to explore the use of UDL to support Deaf learners in Lesotho secondary schools. The study is informed by an interpretative paradigm and situated within a qualitative research approach. Ten participating English teachers from two inclusive schools were purposefully selected and interviewed by telephone to generate data for this study. The data were thematically analysed. The findings indicated that even though UDL is identified as highly proficient and promotes flexibility in teaching methods, teachers reflect limited knowledge of the UDL approach. The findings further showed that UDL ensures education for all learners, including marginalised groups such as learners with disabilities, through different teaching strategies. This signifies the effective use of UDL for better performance in the English language by Deaf learners (DLs). This aligns with literature showing that mobilising English teachers as assets helps DLs to be engaged and have control in their communities by defining and solving problems using their own resources and connections to other networks for asset exchange. The study therefore concludes that although teachers assume they are knowledgeable about the definition of UDL, they have limited practice of the approach and need to be equipped with techniques and skills for supporting the performance of DLs using the UDL approach in their English teaching. The researchers recommend that the Ministry of Education and Training, teacher-training universities, and teacher-training colleges raise awareness of UDL principles and include them in their curricula so that teachers can be properly trained in how to apply the approach effectively.

Keywords: deaf learners, Lesotho, support learning, universal design for learning

Procedia PDF Downloads 114
25317 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by employing the block-maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). Considering two sets of data (raw data and simulated data) and two models of the GEV distribution (stationary and non-stationary), a return level analysis is carried out. It was found that in the stationary model the return values are constant over time for the raw data, while for the simulated data the return values show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw data and the simulated data show an increasing trend with an upper bound. This clearly shows that, although temperatures in the tropics show signs of increasing in the future, there is a maximum temperature beyond which there is no exceedance. The results of this paper are highly relevant to agricultural and environmental research.

Keywords: forecasting, generalized extreme value (GEV), meteorology, return level

Procedia PDF Downloads 481
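
The stationary block-maxima analysis described in the abstract of paper 25317 above can be sketched as follows; the synthetic annual maxima stand in for the CDC temperature records, which are not reproduced here.

```python
import numpy as np
from scipy.stats import genextreme

# Illustrative annual maximum temperatures (degrees C); not the paper's CDC data.
rng = np.random.default_rng(0)
annual_maxima = 33 + rng.gumbel(loc=0.0, scale=1.2, size=40)

# Fit a stationary GEV to the block maxima by maximum likelihood
shape, loc, scale = genextreme.fit(annual_maxima)

# T-year return level = quantile exceeded on average once every T blocks
for T in (10, 50, 100):
    level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {level:.2f} °C")
```
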
25316 Impact of Stack Caches: Locality Awareness and Cost Effectiveness

Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang

Abstract:

Treating data according to its location in memory has received much attention in recent years, because stack and non-stack data have different properties that matter for cache utilization, and the two may interfere with each other's locality in the data cache. One of the important properties of stack data is its high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into a stack cache and a non-stack cache in order to keep stack data and non-stack data in separate caches. We observe that the overall hit rate of the non-unified cache design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache needed to achieve a high hit ratio, especially when over 99% of accesses are directed to the stack cache. The results show that, on average, a stack cache hit rate of more than 99% is achieved using 2 KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when a small, fixed-size stack cache is added at level 1 to a unified cache architecture. The results show that the overall hit rate of the unified cache design with a 1 KB stack cache added improves by approximately 3.9% on average for the Rijndael benchmark. The stack caches are simulated using the SimpleScalar toolset.

Keywords: hit rate, locality of program, stack cache, stack data

Procedia PDF Downloads 304
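
A toy version of the split-cache experiment described in the abstract of paper 25316 above is sketched below: a small direct-mapped (1-way) stack cache is simulated alongside a larger non-stack data cache on a synthetic address trace. The trace, the non-stack cache size, and the access mix are illustrative assumptions; the paper itself uses SimpleScalar traces.

```python
import random

class DirectMappedCache:
    """Minimal direct-mapped (1-way) cache model that tracks hits and misses only."""
    def __init__(self, size_bytes, line_bytes=32):
        self.lines = size_bytes // line_bytes
        self.line_bytes = line_bytes
        self.tags = [None] * self.lines
        self.hits = self.accesses = 0

    def access(self, addr):
        self.accesses += 1
        block = addr // self.line_bytes
        index, tag = block % self.lines, block // self.lines
        if self.tags[index] == tag:
            self.hits += 1
        else:
            self.tags[index] = tag  # miss: fill the line

    @property
    def hit_rate(self):
        return self.hits / self.accesses if self.accesses else 0.0

# Split design: a 2 KB dedicated stack cache next to a larger non-stack data cache.
random.seed(1)
stack_cache = DirectMappedCache(2 * 1024)
data_cache = DirectMappedCache(16 * 1024)
for _ in range(100_000):
    if random.random() < 0.6:      # stack accesses: a small, heavily reused region
        stack_cache.access(0x7FFF0000 + random.randrange(1024))
    else:                          # non-stack accesses: spread over a wide address range
        data_cache.access(random.randrange(1 << 20))
print(f"stack cache hit rate:     {stack_cache.hit_rate:.3f}")
print(f"non-stack cache hit rate: {data_cache.hit_rate:.3f}")
```
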
25315 Autonomic Threat Avoidance and Self-Healing in Database Management System

Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik

Abstract:

Databases are key components of software systems. Due to the exponential growth of data, ensuring that data is accurate and available is a major concern. The data in databases is vulnerable to internal and external threats, especially when it contains sensitive information, as in medical or military applications. Whenever data is changed with malicious intent, the results of data analysis may lead to disastrous decisions. Autonomic self-healing in computer systems is modeled on the autonomic system of the human body. In order to guarantee the accuracy and availability of data, we propose a technique which, on a priority basis, tries to prevent any malicious transaction from executing and, in case a malicious transaction does affect the system, heals the system in an isolated mode in such a way that the availability of the system is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. In the end, we test our model and present the findings.

Keywords: autonomic computing, self-healing, threat avoidance, security

Procedia PDF Downloads 505
25314 Information Extraction Based on Search Engine Results

Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk

Abstract:

Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw result counts returned by search engines into useful information, which represents a new method of data gathering compared with traditional methods. Submitting queries for a large number of keywords manually takes a long time and much effort; hence, we developed a user interface program that searches automatically, taking multiple keywords at the same time, and left this program to collect the wanted data automatically. The collected raw data is processed using mathematical and statistical methods to eliminate unwanted data and convert it into usable data.

Keywords: search engines, information extraction, agent system

Procedia PDF Downloads 430
25313 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today’s era, data security is an important concern and one of the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder will not be able to retrieve it, whereas steganography hides the data in some cover file so that the presence of communication itself is concealed. This paper presents the implementation of the Rivest-Shamir-Adleman (RSA) algorithm with image and audio steganography and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that the combined techniques performed better than the individual techniques. The risk of unauthorized access is alleviated to a certain extent by using these techniques, which could be used in banks, intelligence agencies such as RAW, and other settings where highly confidential data is transferred. Finally, a comparison of the two techniques is given in tabular form.

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 291
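
The steganographic half of the pipeline described in the abstract of paper 25313 above can be sketched as follows: the payload (which in the paper would be RSA- or DES-encrypted ciphertext) is hidden in the least-significant bits of a cover buffer. The flat byte buffer stands in for real image or audio samples, and the 4-byte length header is an assumption of this sketch; the paper's own implementation is in MATLAB.

```python
def embed_lsb(pixels, payload):
    """Hide `payload` bytes in the least-significant bits of a flat pixel buffer.
    A 4-byte big-endian length header is stored first so extraction knows when to stop."""
    data = len(payload).to_bytes(4, "big") + payload
    bits = [(byte >> (7 - i)) & 1 for byte in data for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for payload")
    stego = bytearray(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit
    return bytes(stego)

def extract_lsb(pixels):
    bits = [p & 1 for p in pixels]
    def read(n_bytes, offset):
        return bytes(
            sum(bits[offset + 8 * b + i] << (7 - i) for i in range(8))
            for b in range(n_bytes)
        )
    length = int.from_bytes(read(4, 0), "big")
    return read(length, 32)

# The payload would normally be RSA- or DES-encrypted ciphertext; plain bytes are
# used here only to keep the sketch self-contained.
cover = bytes(range(256)) * 20          # stand-in for a greyscale pixel buffer
secret = b"ciphertext-bytes-go-here"
assert extract_lsb(embed_lsb(cover, secret)) == secret
```
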
25312 Data Monetisation by E-commerce Companies: A Need for a Regulatory Framework in India

Authors: Anushtha Saxena

Abstract:

This paper examines the process of data monetisation by e-commerce companies operating in India. Data monetisation is the collecting, storing, and analysing of consumers’ data in order to use the data that is generated for profits, revenue, etc. Data monetisation enables e-commerce companies to obtain better business opportunities, innovative products and services, and a competitive edge over others, and to generate millions in revenue. This paper analyses the issues and challenges that arise from the process of data monetisation. Some of the issues highlighted in the paper pertain to the right to privacy and the protection of the data of e-commerce consumers. At the same time, data monetisation cannot be prohibited, but it can be regulated and monitored by stringent laws and regulations. The right to privacy is a fundamental right guaranteed to the citizens of India through Article 21 of The Constitution of India, and the Supreme Court of India recognised it as such in the landmark judgment of Justice K.S. Puttaswamy (Retd) and Another v. Union of India. This paper highlights the legal issue of how e-commerce businesses violate individuals’ right to privacy by using the data they collect and store for economic gain and monetisation, and the related issue of data protection. The researcher has mainly focused on e-commerce companies such as online shopping websites to analyse the legal issue of data monetisation. In the Internet of Things and the digital age, people have shifted to online shopping because it is convenient, easy, flexible, comfortable, time-saving, etc. At the same time, e-commerce companies store the data of their consumers and use it by selling it to third parties or by generating more data from the data stored with them. This violates individuals’ right to privacy, because consumers know nothing about what happens to their data when they provide it online, and many times data is collected without the consent of individuals. Data, whether structured or unstructured, is used by analytics for monetisation. Indian legislation such as The Information Technology Act, 2000 does not effectively protect e-consumers with regard to their data and how it is used by e-commerce businesses to monetise it and generate revenue. The paper also examines the draft Data Protection Bill, 2021, pending in the Parliament of India, and how this Bill could have a major impact on data monetisation. This paper also studies the European Union General Data Protection Regulation and how this legislation could be helpful in the Indian scenario concerning e-commerce businesses with respect to data monetisation.

Keywords: data monetization, e-commerce companies, regulatory framework, GDPR

Procedia PDF Downloads 120
25311 Research on Coordination Strategies for Coordinating Supply Chain Based on Auction Mechanisms

Authors: Changtong Wang, Lingyun Wei

Abstract:

The combination of auctions and supply chains is of great significance in improving the supply chain management system and enhancing the efficiency of economic and social operations. To address the gap in research on supply chain strategies under the auction mechanism, a model is developed for the 1-N auction in a complete-information environment, and it is concluded that the two-part contract auction model for retailers in this setting can achieve supply chain coordination. The model is validated by substituting it into the scenario of a fresh-cut-flower auction, using numerical examples to further demonstrate the validity of the conclusions.

Keywords: auction mechanism, supply chain coordination strategy, fresh cut flowers industry, supply chain management

Procedia PDF Downloads 124
25310 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and an accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data; e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 200
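
A small simulation in the spirit of the experiments described in the abstract of paper 25310 above: a linear-kernel SVM is trained on labels corrupted at a chosen error rate and then scored against the clean labels, which makes it easy to see when the model's accuracy exceeds that of its training labels. The synthetic data and error rates below are illustrative, not the study's clinical data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for the clinical corpus: train on corrupted labels, score on clean ones.
X, y = make_classification(n_samples=4000, n_features=20, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
for error_rate in (0.1, 0.2, 0.4):
    noisy = y_tr.copy()
    flip = rng.random(len(noisy)) < error_rate
    noisy[flip] = 1 - noisy[flip]                     # corrupt the binary training labels
    model = SVC(kernel="linear").fit(X_tr, noisy)     # weakly-supervised training
    acc = model.score(X_te, y_te)                     # evaluated against the clean labels
    print(f"label error rate {error_rate:.0%}: training labels are {1 - error_rate:.0%} "
          f"accurate, model scores {acc:.1%} on clean test labels")
```
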
25309 Investigating Student Behavior in Adopting Online Formative Assessment Feedback

Authors: Peter Clutterbuck, Terry Rowlands, Owen Seamons

Abstract:

In this paper we describe one critical research program within a complex, ongoing multi-year project (2010 to 2014 inclusive) whose overall goal is to improve the learning outcomes of first-year undergraduate commerce/business students within an Information Systems (IS) subject with a very large enrolment. The research program described in this paper is the analysis of student attitudes and decision making in relation to the availability of formative assessment feedback via Web-based real-time conferencing and document exchange software (Adobe Connect). The formative assessment feedback between teaching staff and students concerns an authentic, problem-based, team-completed assignment. Student attitudes and decision making are investigated via first a qualitative and then a quantitative application of the Theory of Planned Behavior (TPB) with two statistically significant and separate trial samples of the enrolled students. The initial qualitative TPB investigation revealed that perceived self-efficacy, improved time management, and lecturer-student relationship building were the major factors shaping an overall favorable student attitude to online feedback, whilst some students expressed valid concerns about perceived control limitations identified within the online feedback protocols. The subsequent quantitative TPB investigation then confirmed that attitude towards usage, subjective norms surrounding usage, and perceived behavioral control of usage were all significant in shaping student intention to use the online feedback protocol, with these three variables explaining 63 percent of the variance in the behavioral intention to use it. The identification in this research of perceived behavioral control as a significant determinant of student usage of a specific technology component within a virtual learning environment (VLE) suggests that VLEs should be viewed not as a single, atomic entity but as a spectrum of technology offerings ranging from the mature and simple (e.g., email, Web downloads) to the cutting-edge and challenging (e.g., Web conferencing and real-time document exchange); that is, not all VLEs should be considered the same. The results of this research suggest that tertiary students have the technological sophistication to assess a VLE in this more selective manner.

Keywords: formative assessment feedback, virtual learning environment, theory of planned behavior, perceived behavioral control

Procedia PDF Downloads 400
25308 Transforming Healthcare Data Privacy: Integrating Blockchain with Zero-Knowledge Proofs and Cryptographic Security

Authors: Kenneth Harper

Abstract:

Blockchain technology presents solutions for managing healthcare data, addressing critical challenges in privacy, integrity, and access. This paper explores how privacy-preserving technologies, such as zero-knowledge proofs (ZKPs) and homomorphic encryption (HE), enhance decentralized healthcare platforms by enabling secure computations and patient data protection. We examine the mathematical foundations of these methods, their practical applications, and how they meet the evolving demands of healthcare data security. Using real-world examples, this research highlights industry-leading implementations and offers a roadmap for future applications in secure, decentralized healthcare ecosystems.

Keywords: blockchain, cryptography, data privacy, decentralized data management, differential privacy, healthcare, healthcare data security, homomorphic encryption, privacy-preserving technologies, secure computations, zero-knowledge proofs

Procedia PDF Downloads 19
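
As a concrete (toy) instance of the zero-knowledge building block mentioned in the abstract of paper 25308 above, the sketch below runs one round of a Schnorr-style interactive proof of knowledge of a discrete logarithm. The tiny group parameters are for illustration only and are far too small for real use; production systems use elliptic curves or groups of 2048 bits or more, and the healthcare framing in the comments is an assumption of this sketch.

```python
import secrets

# Minimal interactive Schnorr proof of knowledge of a discrete log: the prover
# convinces the verifier that it knows x with y = g^x mod p without revealing x.
p, q, g = 467, 233, 4          # p = 2q + 1, g generates the subgroup of prime order q

x = secrets.randbelow(q - 1) + 1   # prover's secret (e.g., a patient-record key)
y = pow(g, x, p)                   # public value that could be registered on-chain

# One proof round
r = secrets.randbelow(q)           # prover's commitment nonce
t = pow(g, r, p)                   # commitment sent to the verifier
c = secrets.randbelow(q)           # verifier's random challenge
s = (r + c * x) % q                # prover's response

# Verifier accepts iff g^s == t * y^c (mod p); it learns nothing about x itself
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted; the secret x was never revealed")
```
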
25307 Perspectives of Pre-service Teachers on Vocational Pedagogy in Technical and Vocational Education and Training Teaching

Authors: Siphokazi Vimbelo

Abstract:

TVET colleges were established to equip students with the necessary skills for careers in a variety of fields such as business, tourism, engineering, and hospitality, and TVET teachers are responsible for preparing students and for ensuring that they acquire these skills. This places demands on the pedagogies currently utilized in the TVET classroom. The occupational programmes being introduced in TVET colleges in South Africa will necessitate vocational pedagogy, which focuses on how individuals learn effectively in skill-oriented knowledge areas. Furthermore, limited research exists on the obstacles encountered by pre-service TVET educators as they develop lessons based on vocational pedagogy. Hence, this research will specifically examine the difficulties encountered by pre-service teachers in creating lesson plans rooted in vocational teaching methods. The pre-service teachers are the students in the first year of the Advanced Diploma in Technical and Vocational Teaching (ADTVT). After brainstorming vocational pedagogy, the pre-service teachers will develop lessons rooted in vocational pedagogy and will then participate in interviews to reflect on their lesson preparation process and discuss the challenges they encountered during the preparation. Thematic analysis will be used to analyse the data. Since this is a new programme, it would be valuable to discover the obstacles and exchange thoughts with academics from other Higher Education Institutions (HEIs) that offer the same course. Furthermore, the results will assist Cape Peninsula University of Technology (CPUT) academics in partnering with other academics to create various strategies for tackling challenges and determining priorities in implementing vocational education for a new student population.

Keywords: preservice teachers, TVET, TVET teaching, vocational pedagogy

Procedia PDF Downloads 74
25306 Operating Speed Models on Tangent Sections of Two-Lane Rural Roads

Authors: Dražen Cvitanić, Biljana Maljković

Abstract:

This paper presents models for predicting operating speeds on tangent sections of two-lane rural roads, developed on continuous speed data. The data correspond to 20 drivers of different ages and driving experience, driving their own cars along an 18 km long section of a state road. The data were first used to determine the maximum operating speeds on tangents and to compare them with the speeds in the middle of tangents, i.e., the speed data used in most operating speed studies. Analysis of the continuous speed data indicated that spot speed data are not reliable indicators of the relevant speeds. Operating speed models for tangent sections were then developed. There was no significant difference between models developed using speed data from the middle of tangent sections and models developed using maximum operating speeds on tangent sections. All developed models have a higher coefficient of determination than models developed on spot speed data. Thus, it can be concluded that the method of measurement has a more significant impact on the quality of an operating speed model than the location of measurement.

Keywords: operating speed, continuous speed data, tangent sections, spot speed, consistency

Procedia PDF Downloads 452
25305 Experimental Investigation of Performance Anode Side of PEM Fuel Cell with Spin Method Coated with YSZ+SDC

Authors: Gürol Önal, Kevser Dinçer, Salih Yayla

Abstract:

In this study, the performance of a proton exchange membrane (PEM) fuel cell was experimentally investigated. The anode side of the PEM fuel cell was coated with YSZ+SDC using the spin method. A solution containing 0.1 g yttria-stabilized zirconia (YSZ), 0.1 g samarium-doped ceria (SDC), and 10 mL methanol was prepared. This solution was drawn into a micro-pipette, and the anode side of the PEM fuel cell was then coated with YSZ+SDC by the spin method. In the experimental study, the current, voltage, and power performance before and after coating were recorded and compared. It was found that the efficiency of the PEM fuel cell increases after coating with YSZ+SDC.

Keywords: fuel cell, Polymer Electrolyte Membrane (PEM), membrane, spin method

Procedia PDF Downloads 562
25304 Electronic, Magnetic and Optic Properties in Halide Perovskites CsPbX3 (X= F, Cl, I)

Authors: B. Bouadjemi, S. Bentata, T. Lantri, Souidi Amel, W.Bensaali, A. Zitouni, Z. Aziz

Abstract:

We performed first-principles calculations using the full-potential linearized augmented plane wave (FP-LAPW) method to calculate the structural, optoelectronic, and magnetic properties of the cubic halide perovskites CsPbX3 (X = F, I). We employed the GGA approach, and exchange is modeled using the modified Becke-Johnson (mBJ) potential to predict accurate band gaps for these materials. The optical properties (namely, the real and imaginary parts of the dielectric function, the optical conductivity, and the absorption coefficient) make these halide perovskites promising materials for solar cell applications.

Keywords: halide perovskites, mBJ, solar cells, FP-LAPW, optoelectronic properties, absorption coefficient

Procedia PDF Downloads 323
25303 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data

Authors: S. Nickolas, Shobha K.

Abstract:

The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications and are an unavoidable problem in big data management and analysis. Numerous techniques, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary algorithms or other optimized techniques, and hot deck imputation, have been introduced by researchers for handling missing data. Among these, imputation techniques play a positive role in filling missing values when it is necessary to use all records in the data rather than discard records with missing values. In this paper, we propose a novel artificial-neural-network-based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly. It carries out model-based clustering by using competitive learning and a self-stabilizing mechanism in a dynamic environment, without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.

Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing

Procedia PDF Downloads 275
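
A simplified sketch of the cluster-then-impute idea described in the abstract of paper 25303 above is shown below. Because ART2 is not available in common Python libraries, KMeans stands in for the clustering step and the data are purely numeric; the paper's own method uses ART2 on mixed-attribute data, so this illustrates the general approach rather than ART2 itself.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_impute(X, n_clusters=3, random_state=0):
    """Cluster-then-impute sketch: rows are clustered on a mean-filled copy, and each
    missing entry is replaced by the mean of its own cluster for that column.
    (KMeans stands in for ART2 purely because ART2 is not in common libraries.)"""
    X = np.asarray(X, dtype=float)
    col_means = np.nanmean(X, axis=0)
    filled = np.where(np.isnan(X), col_means, X)          # provisional fill for clustering
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=random_state).fit_predict(filled)
    imputed = X.copy()
    for k in range(n_clusters):
        members = labels == k
        cluster_means = np.nanmean(np.where(members[:, None], X, np.nan), axis=0)
        cluster_means = np.where(np.isnan(cluster_means), col_means, cluster_means)
        rows, cols = np.where(np.isnan(X) & members[:, None])
        imputed[rows, cols] = cluster_means[cols]
    return imputed

# Demo on synthetic clustered data with 10% of entries missing completely at random
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4)) + rng.integers(0, 3, size=(200, 1)) * 5.0
X[rng.random(X.shape) < 0.1] = np.nan
print(np.isnan(cluster_impute(X)).sum(), "missing values remain after imputation")
```
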
25302 The Effect That the Data Assimilation of Qinghai-Tibet Plateau Has on a Precipitation Forecast

Authors: Ruixia Liu

Abstract:

The Qinghai-Tibet Plateau has an important influence on the precipitation of its lower reaches. Remote sensing data has its own advantages, and a numerical prediction model that assimilates RS data should perform better than one that does not. We obtained the assimilation data of MHS, terrestrial, and sounding observations from GSI, introduced the result into WRF, and obtained forecasts of relative humidity (RH) and precipitation. By comparing the results at 1 h, 6 h, 12 h, and 24 h, we found that assimilating the MHS, terrestrial, and sounding data made the forecasts of precipitation amount, area, and center more accurate. Analysis of the differences in the initial field showed that data assimilation over the Qinghai-Tibet Plateau influences the forecast for its lower reaches by affecting the initial temperature and RH.

Keywords: Qinghai-Tibet Plateau, precipitation, data assimilation, GSI

Procedia PDF Downloads 234
25301 Fostering Ties and Trusts through Social Interaction within Community Gardening

Authors: Shahida Mohd Sharif, Norsidah Ujang

Abstract:

Recent research has shown that many of the urban population in Kuala Lumpur, especially from the lower-income group, suffer from socio-psychological problems. They are reported to experience anxiety, depression, and stress, made worse by the recent COVID-19 pandemic. Much of the population was forced to observe the Movement Control Order (MCO), part of the pandemic mitigation measures, pushing them to live in isolation as the new normal. The study finds a need to strategise a better approach to helping these people cope with their socio-psychological condition, especially the population from the lower-income group. In Kuala Lumpur, as part of the Local Agenda 21 programme, the Kuala Lumpur City Hall has introduced the Green Initiative: Urban Farming, one of whose approaches is the community garden. The local authority promotes the engagement as being capable of improving the social environment of the participants. Therefore, the study explores the experience of residents of low-cost flats participating in the community gardening initiative from a social attachment perspective. The study will utilise semi-structured interviews to collect the participants’ experience of community gardening and of how the social interaction exchanged between members forms and develops their ties and trust. For context, the low-cost flats are part of the government social housing programme (Program Perumahan Rakyat dan Perumahan Awam), while the community gardening initiative (Projek Kebun Kejiranan Bandar LA21 KL) is part of the local authority’s initiative to address the participants’ social, environmental, and economic issues. The study will conduct thematic analysis on the collected data and use the ATLAS.ti software for data organisation and management purposes. The findings could help other researchers and stakeholders understand the experience of social interaction within community gardens and its relation to ties and trust, could shed some light on how the participants could improve their social environment, and could provide the local authority with evidence-based documentation.

Keywords: community gardening participation, lower-income population, social attachment, social interaction

Procedia PDF Downloads 138
25300 Magnetic Properties of Layered Rare-Earth Oxy-Carbonates Ln2O2CO3 (Ln = Nd, Sm, and Dy)

Authors: U. Arjun, K. Brinda, M. Padmanabhan, R. Nath

Abstract:

Polycrystalline samples of the rare-earth oxy-carbonates Ln2O2CO3 (Ln = Nd, Sm, and Dy) were synthesized, and their structural and magnetic properties investigated. All of them crystallize in a hexagonal structure with space group P6_3/mmc. They form a double-layered structure with a frustrated triangular arrangement of rare-earth magnetic ions. An antiferromagnetic transition is observed at T_N ≈ 1.25 K, 0.61 K, and 1.21 K for Nd2O2CO3, Sm2O2CO3, and Dy2O2CO3, respectively. From the analysis of the magnetic susceptibility, the Curie-Weiss temperature θ_CW is obtained as ≈ 21.7 K, 18 K, and 10.6 K for Nd2O2CO3, Sm2O2CO3, and Dy2O2CO3, respectively. The magnetic frustration parameter f (= |θ_CW|/T_N) is calculated to be ≈ 17.4, 31, and 8.8 for Nd2O2CO3, Sm2O2CO3, and Dy2O2CO3, respectively, which indicates that Sm2O2CO3 is strongly frustrated compared with its Nd and Dy analogues.

Keywords: chemical synthesis, exchange and superexchange, heat capacity, magnetically ordered materials

Procedia PDF Downloads 357
25299 Positive Affect, Negative Affect, Organizational and Motivational Factor on the Acceptance of Big Data Technologies

Authors: Sook Ching Yee, Angela Siew Hoong Lee

Abstract:

Big data technologies have become a trend for exploiting business opportunities and providing valuable business insights through the analysis of big data. However, many organizations, especially small and medium enterprises (SMEs), have yet to adopt big data technologies. This study uses the technology acceptance model (TAM) and examines several constructs from the TAM together with additional constructs, namely positive affect, negative affect, organizational factor, and motivational factor. The conceptual model proposed in the study will be tested on the relationship and influence of positive affect, negative affect, organizational factor, and motivational factor on the intention to use big data technologies. Empirical research is used in this study by conducting a survey to collect data.

Keywords: big data technologies, motivational factor, negative affect, organizational factor, positive affect, technology acceptance model (TAM)

Procedia PDF Downloads 362
25298 An Investigation into Fraud Detection in Financial Reporting Using Sugeno Fuzzy Classification

Authors: Mohammad Sarchami, Mohsen Zeinalkhani

Abstract:

The financial reporting system has always faced problems in winning the public's trust. The increase in the number of frauds and misrepresentations, often combined with the bankruptcy of large companies, has raised concerns about the quality of financial statements. Consequently, investors, legislators, managers, and auditors have focused on the detection or prevention of significant fraud in financial statements. This article aims to investigate Sugeno fuzzy classification for fraud detection in the financial reporting of firms listed on the Tehran Stock Exchange. The hypothesis is that Sugeno fuzzy classification can detect fraud in financial reporting using financial ratios. The hypothesis was tested using MATLAB software. The average accuracy of the Sugeno fuzzy classification was 81.80%, so the hypothesis was confirmed.

Keywords: fraud, financial reporting, Sugeno fuzzy classification, firm

Procedia PDF Downloads 249
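
A zero-order Sugeno (Takagi-Sugeno-Kang) inference sketch is shown below to make the classification mechanism named in the abstract of paper 25298 above concrete: Gaussian memberships over two financial ratios feed a small rule base whose constant consequents are combined by weighted average. The ratios, membership parameters, rules, and 0.5 cut-off are invented for the sketch; the paper derives its system from Tehran Stock Exchange data in MATLAB.

```python
import math

def gauss(x, mean, sigma):
    """Gaussian membership function."""
    return math.exp(-((x - mean) ** 2) / (2 * sigma ** 2))

def sugeno_fraud_score(leverage, roa):
    """Zero-order Sugeno (TSK) inference over two illustrative financial ratios."""
    high_lev = gauss(leverage, 0.9, 0.2)    # membership: leverage is "high"
    low_lev  = gauss(leverage, 0.3, 0.2)
    low_roa  = gauss(roa, -0.05, 0.05)      # membership: return on assets is "low"
    high_roa = gauss(roa, 0.15, 0.05)

    # Rule base: firing strength via product t-norm, constant consequents in [0, 1]
    rules = [
        (high_lev * low_roa, 1.0),   # high leverage AND low ROA  -> fraud-like
        (low_lev * high_roa, 0.0),   # low leverage AND high ROA  -> clean
        (high_lev * high_roa, 0.4),
        (low_lev * low_roa, 0.6),
    ]
    numerator = sum(w * z for w, z in rules)
    denominator = sum(w for w, _ in rules) or 1e-12
    return numerator / denominator   # weighted-average defuzzification

score = sugeno_fraud_score(leverage=0.85, roa=-0.02)
print(f"fraud score {score:.2f} ->", "flag as possible fraud" if score > 0.5 else "no flag")
```
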
25297 A Multi-Criteria Decision Making Approach for Disassembly-To-Order Systems under Uncertainty

Authors: Ammar Y. Alqahtani

Abstract:

In order to minimize the negative impact on the environment, it is essential to properly manage the waste generated from the premature disposal of end-of-life (EOL) products. Consequently, governments and international organizations have introduced new policies and regulations to minimize the amount of waste being sent to landfills. Moreover, consumers' environmental awareness has forced original equipment manufacturers to become more environmentally conscious. Therefore, manufacturers have considered different ways of dealing with the waste generated from EOL products, viz., remanufacturing, reusing, recycling, or disposing of them. The rate of depletion of virgin natural resources, and manufacturers' dependency on those resources, can be reduced when EOL products are remanufactured, reused, or recycled, and this also cuts the amount of harmful waste sent to landfills. Disposal of EOL products, however, contributes to the problem and is therefore used only as a last option. The number of EOL products needed must be estimated in order to fulfill the demand for components, and a disassembly process must then be performed to extract individual components and subassemblies. Smart products, built with embedded sensors and network connectivity to enable the collection and exchange of data, utilize sensors that are implanted into products during production. These sensors allow remanufacturers to predict an optimal warranty policy and the time period that should be offered to customers who purchase remanufactured components and products. Sensor-provided data can help to evaluate the overall condition of a product, as well as the remaining lives of product components, prior to performing a disassembly process. In this paper, a multi-period disassembly-to-order (DTO) model is developed that takes the different system uncertainties into consideration. The DTO model is solved using nonlinear programming (NLP) over multiple periods. A DTO system is considered in which a variety of EOL products are purchased for disassembly. The model's main objective is to determine the best combination of EOL products to be purchased from every supplier in each period that maximizes the total profit of the system while satisfying the demand. This paper also addresses the impact of sensor-embedded products on the cost of warranties. Lastly, a case study involving various simulation conditions is presented and analyzed to illustrate the applicability of the model.

Keywords: closed-loop supply chains, environmentally conscious manufacturing, product recovery, reverse logistics

Procedia PDF Downloads 137
25296 Big Data Analysis with Rhipe

Authors: Byung Ho Jung, Ji Eun Shin, Dong Hoon Lim

Abstract:

Rhipe, which integrates the R and Hadoop environments, makes it possible to process and analyze massive amounts of data using a distributed processing environment. In this paper, we implemented multiple regression analysis using Rhipe with various sizes of actual data. Experimental results comparing the performance of Rhipe with the stats and biglm packages available on bigmemory showed that Rhipe was faster than the other packages, owing to parallel processing in which the number of map tasks increases as the size of the data increases. We also compared the computing speeds of the pseudo-distributed and fully-distributed modes for configuring a Hadoop cluster. The results showed that the fully-distributed mode was faster than the pseudo-distributed mode, and that the computing speed of the fully-distributed mode increased as the number of data nodes increased.

Keywords: big data, Hadoop, Parallel regression analysis, R, Rhipe

Procedia PDF Downloads 498