Search results for: data exchange
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26254

25504 Design, Construction and Evaluation of a Mechanical Vapor Compression Distillation System for Wastewater Treatment in a Poultry Company

Authors: Juan S. Vera, Miguel A. Gomez, Omar Gelvez

Abstract:

Water is Earth's most valuable resource, and the lack of it is currently a critical problem in today's society. Non-treated wastewaters contribute to this situation, especially those coming from industrial activities, as they reduce the quality of the water bodies, annihilating all kinds of life and bringing disease to people in contact with them. An effective solution for this problem is distillation, which removes most contaminants. However, this approach must also be energetically efficient in order to appeal to the industry. In this respect, most water distillation treatments fail, with the exception of the Mechanical Vapor Compression (MVC) distillation system, which achieves high efficiency thanks to the energy input of a compressor and the latent heat exchange. This paper presents the design, construction, and evaluation of a Mechanical Vapor Compression (MVC) distillation system for the main Colombian poultry company Avidesa Macpollo SA. The system will be located in the principal slaughterhouse in the state of Santander, and it will work along with the Gas Energy Mixing (GEM) system to treat the wastewaters from the plant. The main goal of the MVC distiller, rarely used in this type of application, is to reduce the chlorides, Chemical Oxygen Demand (COD) and Biological Oxygen Demand (BOD) levels according to the state regulations, since the GEM cannot decrease them enough. The MVC distillation system works with three components: the evaporator/condenser heat exchanger where the distillation takes place, a low-pressure compressor which supplies the energy that creates the temperature differential between the evaporator and condenser cavities, and a preheater to recover the remaining energy in the distillate. The model equations describing how the compressor power consumption, heat exchange area and distilled water production are related are based on a thermodynamic balance and a heat transfer analysis, with correlations taken from the literature. Finally, the design calculations and the measurements of the installation are compared, showing agreement with the predictions for distillate production and power consumption as the evaporator/condenser temperature difference is varied.
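
A back-of-the-envelope sketch of the thermodynamic balance mentioned in the abstract is given below. It is not the authors' model: the flow rate, heat transfer coefficient, temperature difference and efficiency are illustrative assumptions only.

```python
# Rough MVC sizing sketch: given a target distillate flow, estimate the heat
# duty, the required heat exchange area, and an approximate compressor power.
# All numerical values are assumptions for illustration, not design values.
m_dot = 0.05          # distillate mass flow [kg/s] (assumed)
h_fg = 2.26e6         # latent heat of vaporization of water [J/kg]
U = 2500.0            # overall heat transfer coefficient [W/m^2.K] (assumed)
dT = 3.0              # evaporator/condenser temperature difference [K] (assumed)
eta = 0.70            # compressor efficiency (assumed)
cp_vapor = 1900.0     # approximate specific heat of steam [J/kg.K]

Q = m_dot * h_fg                          # heat duty recycled through the exchanger [W]
A = Q / (U * dT)                          # required heat exchange area [m^2]
W_comp = m_dot * cp_vapor * dT / eta      # rough compressor power to lift the vapor by dT [W]

print(f"heat duty {Q/1e3:.1f} kW, area {A:.1f} m^2, compressor power {W_comp/1e3:.2f} kW")
```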

Keywords: mechanical vapor compression, distillation, wastewater, design, construction, evaluation

Procedia PDF Downloads 159
25503 Performance of Shariah-Based Investment: Evidence from Pakistani Listed Firms

Authors: Mohsin Sadaqat, Hilal Anwar Butt

Abstract:

Following the stock selection guidelines provided by the Sharia Board (SB), we segregate the firms listed at the Pakistan Stock Exchange (PSX) into Sharia Compliant (SC) and Non-Sharia Compliant (NSC) stocks. Subsequently, we form portfolios within each group based on market capitalization and volatility. The purpose is to analyze and compare the performance of these two groups, as the SC stocks have fewer diversification opportunities due to SB restrictions. Using data ranging from January 2004 until June 2016, our results indicate that in most of the cases the risk-adjusted returns (alphas) for the return differential between SC and NSC firms are positive. In addition, the SC firms, in comparison to their counterparts in the PSX, provide excess returns that are hedged against the market, size, and value-based systematic risk factors. Overall, these results reconcile with the prevailing notion that SC stocks, which have lower financial leverage and higher investment in real assets, are less exposed to market-based risks. Further, the SC firms that are more capitalized and less volatile perform better than less capitalized and more volatile SC and NSC firms. To sum up our results, we do not find any substantial evidence of an opportunity loss due to limited diversification opportunities in the case of SC firms. To optimally utilize scarce resources, investors should consider SC firms as candidates in portfolio construction.
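
The following is a minimal, hypothetical sketch of the kind of factor regression that produces the risk-adjusted returns (alphas) described above; the CSV file and column names are placeholders, not the authors' data.

```python
# Estimate the alpha of the SC-minus-NSC return differential against market,
# size, and value factors. File name and column names are assumptions.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("monthly_returns.csv", parse_dates=["date"], index_col="date")
diff = df["sc_portfolio"] - df["nsc_portfolio"]   # SC minus NSC monthly returns
factors = df[["mkt_rf", "smb", "hml"]]            # market, size and value factors (assumed names)

X = sm.add_constant(factors)                      # the constant term captures alpha
model = sm.OLS(diff, X).fit(cov_type="HAC", cov_kwds={"maxlags": 3})

print(model.params["const"])                      # estimated monthly alpha
print(model.pvalues["const"])                     # its statistical significance
```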

Keywords: diversification, performance, sharia compliant stocks, risk adjusted returns

Procedia PDF Downloads 199
25502 Data Mining in Medicine Domain Using Decision Trees and Support Vector Machine

Authors: Djamila Benhaddouche, Abdelkader Benyettou

Abstract:

In this paper, we used data mining to extract biomedical knowledge. In general, complex biomedical data collected in population studies are treated by statistical methods; although these are robust, they are not sufficient in themselves to harness the potential wealth of the data. For this purpose, two supervised learning algorithms were used: decision trees and the Support Vector Machine (SVM). These supervised classification methods are used to make the diagnosis of thyroid disease. In this context, we propose to promote the study and use of symbolic data mining techniques.
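
A minimal scikit-learn sketch of the two classifiers named in the abstract is shown below; the feature matrix and labels are random stand-ins, since the thyroid dataset itself is not reproduced here.

```python
# Decision tree and SVM classification on a generic biomedical feature matrix.
# X and y are placeholders for the real thyroid data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))              # stand-in for biomedical attributes
y = (X[:, 0] + X[:, 1] > 0).astype(int)     # stand-in for the diagnosis label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

tree = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)).fit(X_tr, y_tr)

print("decision tree accuracy:", accuracy_score(y_te, tree.predict(X_te)))
print("SVM accuracy:", accuracy_score(y_te, svm.predict(X_te)))
```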

Keywords: biomedical data, learning, classifier, algorithms decision tree, knowledge extraction

Procedia PDF Downloads 560
25501 Analysis of Different Classification Techniques Using WEKA for Diabetic Disease

Authors: Usama Ahmed

Abstract:

Data mining is the process of analyzing data in order to extract useful information for prediction. It is a field of research that addresses various types of problems. In data mining, classification is an important technique for classifying different kinds of data. Diabetes is one of the most common diseases. This paper applies different classification techniques to a diabetes dataset using the Waikato Environment for Knowledge Analysis (WEKA) and determines which algorithm is most suitable. The best classification algorithm for the diabetic data is Naïve Bayes, which achieves an accuracy of 76.31% and takes 0.06 seconds to build the model.
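
Below is a Python analogue of the experiment described above (the study itself uses WEKA's Java implementation); the CSV path and the 'outcome' column are assumptions for a Pima-style diabetes dataset.

```python
# Naïve Bayes classification of a diabetes dataset with timing of model build.
import time
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

df = pd.read_csv("diabetes.csv")                       # hypothetical Pima-style file
X, y = df.drop(columns="outcome"), df["outcome"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

start = time.perf_counter()
nb = GaussianNB().fit(X_tr, y_tr)                      # Naïve Bayes, as in the abstract
build_time = time.perf_counter() - start

print(f"accuracy: {accuracy_score(y_te, nb.predict(X_te)):.4f}")
print(f"model build time: {build_time:.3f} s")
```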

Keywords: data mining, classification, diabetes, WEKA

Procedia PDF Downloads 147
25500 Electronic and Optical Properties of YNi4Si-Type DyNi4Si Compound: A Full Potential Study

Authors: Dinesh Kumar Maurya, Sapan Mohan Saini

Abstract:

A theoretical formalism to calculate the structural, electronic and optical properties of orthorhombic crystals from first-principles calculations is described. It is applied for the first time to the new YNi4Si-type DyNi4Si compound. Calculations are performed using the full-potential linearized augmented plane wave (FP-LAPW) method in the framework of density functional theory (DFT). The Coulomb-corrected local-spin density approximation (LSDA+U) in the self-interaction correction (SIC) scheme has been used for the exchange-correlation potential. Our optimized lattice parameters show good agreement with the previously reported experimental study. Analysis of the calculated band structure of the DyNi4Si compound demonstrates its metallic character. We found that the Ni-3d states contribute mainly to the density of states from -5.0 eV up to the Fermi level, while above the Fermi level the Dy-f peak stands tall in comparison to the small contributions made by the Ni-d and R-d states, which is consistent with experiment for the DyNi4Si compound. Our calculated optical conductivity compares well with the experimental data, and the results are analyzed in the light of band-to-band transitions. We also report the frequency-dependent refractive index n(ω) and the extinction coefficient k(ω) of the compound.

Keywords: band structure, density of states, optical properties, LSDA+U approximation, YNi4Si- type DyNi4Si compound

Procedia PDF Downloads 350
25499 Additional Method for the Purification of Lanthanide-Labeled Peptide Compounds Pre-Purified by Weak Cation Exchange Cartridge

Authors: K. Eryilmaz, G. Mercanoglu

Abstract:

Aim: Purification of the final product, which is the last step in the synthesis of lanthanide-labeled peptide compounds, can be accomplished by different methods. Among these, the two most commonly used are C18 solid phase extraction (SPE) and elution from a weak cation exchanger cartridge. The SPE C18 solid phase extraction method yields a high-purity final product, while elution from the weak cation exchanger cartridge is pH dependent and ineffective in removing colloidal impurities. The aim of this work is to develop an additional purification method for the lanthanide-labeled peptide compound in cases where the desired radionuclidic and radiochemical purity of the final product cannot be achieved because of pH problems or colloidal impurities. Material and Methods: For colloidal impurity formation, 3 mL of water for injection (WFI) was added to 30 mCi of 177LuCl3 solution and allowed to stand for 1 day. 177Lu-DOTATATE was synthesized using an EZAG ML-EAZY module (10 mCi/mL). After synthesis, the final product was mixed with the colloidal impurity solution (total volume: 13 mL, total activity: 40 mCi). The resulting mixture was trapped on an SPE C18 cartridge. The cartridge was washed with 10 mL of saline to remove impurities to the waste vial. The product trapped in the cartridge was eluted with 2 mL of 50% ethanol and collected in the final product vial through a 0.22 µm filter. The final product was diluted with 10 mL of saline. Radiochemical purity before and after purification was analysed by HPLC (column: ACE C18-100A, 3 µm, 150 x 3.0 mm; mobile phase: water-acetonitrile-trifluoroacetic acid (75:25:1); flow rate: 0.6 mL/min). Results: The UV and radioactivity detector results of the HPLC analysis showed that colloidal impurities were completely removed from the 177Lu-DOTATATE/colloidal impurity mixture by the purification method. Conclusion: The improved purification method can be used as an additional step to remove impurities that may result from lanthanide-peptide syntheses in which the weak cation exchange purification technique is used as the last step. The purification of the final product and GMP compliance (the final aseptic filtration and the sterile disposable system components) are two major advantages.

Keywords: lanthanide, peptide, labeling, purification, radionuclide, radiopharmaceutical, synthesis

Procedia PDF Downloads 164
25498 Focus Group Study Exploring Researchers Perspective on Open Science Policy

Authors: E. T. Svahn

Abstract:

Knowledge about the factors that influence the exchange between research and society is of the utmost importance for developing collaboration between different actors, especially in future science policy development and the creation of support structures for researchers, including how researchers view the surrounding open science policy environment and what conditions and attitudes they bring to interacting with it. This paper examines Finnish researchers' attitudes towards open science policies in 2020. Open science is an integrated part of researchers' daily lives and supports not only the effectiveness of research outputs but also the quality of research. In an ideal situation, open science policy is seen as a supporting structure that enables the exchange between research and society, but in other situations it can end up being red tape, generating obstacles and hindering the possibility of doing science efficiently. The results of this study were obtained through focus group interviews. This qualitative research method was selected because it aims at understanding the phenomenon under study. In addition, focus group interviews produce diverse and rich material that would not be available with other research methods. Focus group interviews have well-established applications in social science, especially in understanding the perspectives and experiences of research subjects. In this study, focus groups were used to study the mindset and actions of researchers. Each group comprised 4-10 people, and the aim was to bring out different perspectives on the subject. The interviewer enabled the presentation of different perceptions and opinions, and the focus group interviews were recorded and transcribed. The material was analysed using the grounded theory method. The results are presented as thematic areas, a theoretical model, and direct quotations. Attitudes towards open science policy can vary greatly depending on the research area. This study shows that the demands of open science policy vary somewhat between medicine, technology, and the natural sciences on the one hand and the social sciences, educational sciences, and the humanities on the other. The variation in attitudes between research areas can thus be largely explained by the fact that research outputs and ethical codes vary significantly between subjects. This study aims to increase understanding of the nuances of the extent to which open science policies should be tailored to different disciplines and research areas.

Keywords: focus group interview, grounded theory, open science policy, science policy

Procedia PDF Downloads 156
25497 Comprehensive Study of Data Science

Authors: Asifa Amara, Prachi Singh, Kanishka, Debargho Pathak, Akshat Kumar, Jayakumar Eravelly

Abstract:

Today's generation is totally dependent on technology that uses data as its fuel. The present study is about innovations and developments in data science and gives an idea of how to use the available data efficiently. This study will help to understand the core concepts of data science. The concept of artificial intelligence was introduced by Alan Turing, the main principle being to create an artificial system that can run independently of human-given programs and can function by analyzing data to understand the requirements of the users. Data science comprises business understanding, analyzing data, ethical concerns, understanding programming languages, various fields and sources of data, skills, etc. The usage of data science has evolved over the years. In this review article, we have covered a part of data science, i.e., machine learning. Machine learning uses data science for its work. Machines learn through their experience, which helps them to do any work more efficiently. This article includes a comparative illustration of human understanding versus machine understanding, as well as the advantages, applications, and real-time examples of machine learning. Data science is an important game changer in the life of human beings. Since the advent of data science, we have seen its benefits, how it leads to a better understanding of people, and how it caters to individual needs. It has improved business strategies, the services provided by businesses, forecasting, the ability to attain sustainable development, etc. This study also focuses on a better understanding of data science, which will help us to create a better world.

Keywords: data science, machine learning, data analytics, artificial intelligence

Procedia PDF Downloads 84
25496 Identification and Characterization of Small Peptides Encoded by Small Open Reading Frames using Mass Spectrometry and Bioinformatics

Authors: Su Mon Saw, Joe Rothnagel

Abstract:

Short open reading frames (sORFs) located in the 5'UTR of mRNAs are known as uORFs. Characterization of uORF-encoded peptides (uPEPs), i.e., a subset of short open reading frame encoded peptides (sPEPs), and of their translational regulation leads to an understanding of the causes of genetic disease and of proteome complexity, and to the development of treatments. The existence of uORF products within the cellular proteome can be detected by LC-MS/MS. Demonstrating that uORFs are translated into uPEPs, and identifying those uPEPs, will allow their characterization in terms of structure, function, subcellular localization, evolutionary maintenance (conservation in human and other species) and abundance in cells. It is hypothesized that a subset of sORFs are translatable and that their encoded sPEPs are functional and endogenously expressed, contributing to the complexity of the eukaryotic cellular proteome. This project aimed to investigate whether sORFs encode functional peptides. Liquid chromatography-mass spectrometry (LC-MS) and bioinformatics were thus employed. Because sPEPs are probably of low abundance and small in size, efficient peptide enrichment strategies for enriching small proteins and depleting the sub-proteome of large and abundant proteins are crucial for identifying sPEPs. Low molecular weight proteins were extracted using SDS-PAGE from Human Embryonic Kidney (HEK293) cells and using Strong Cation Exchange Chromatography (SCX) from the HEK293 secretome. Extracted proteins were digested by trypsin into peptides, which were detected by LC-MS/MS. The MS/MS data obtained were searched against Swiss-Prot using MASCOT version 2.4 to filter out known proteins, and all unmatched spectra were re-searched against the human RefSeq database. ProteinPilot v5.0.1 was used to identify sPEPs by searching against the human RefSeq, Vanderperre and Human Alternative Open Reading Frame (HaltORF) databases. Potential sPEPs were analyzed by bioinformatics. Since SDS-PAGE electrophoresis could not separate proteins <20 kDa, it could not identify sPEPs. All MASCOT-identified peptide fragments were parts of main open reading frames (mORFs) according to ORF Finder and blastp searches. No sPEP was detected, and the existence of sPEPs could not be confirmed in this study. Thirteen sORFs previously shown by mass spectrometry to be translated in HEK293 cells were characterized by bioinformatics. The sPEPs identified in previous studies were <100 amino acids and <15 kDa. The bioinformatics results showed that sORFs are translated into sPEPs and contribute to proteome complexity. The uPEP translated from the uORF of SLC35A4 is strongly conserved in human and mouse, while the uPEP translated from the uORF of MKKS is strongly conserved in human and Rhesus monkey. Cross-species conserved uORFs associated with protein translation strongly suggest evolutionary maintenance of the coding sequence and indicate probable functional expression of the peptides encoded within these uORFs. Translation of sORFs was confirmed by mass spectrometry, and sPEPs were characterized with bioinformatics.
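
As an illustration of one bioinformatics step implied by the abstract, the sketch below scans a 5'UTR sequence for short upstream ORFs; the example sequence is made up, and real analyses would read transcript sequences from RefSeq.

```python
# Scan a 5'UTR for ATG...stop open reading frames shorter than a length cutoff.
STOPS = {"TAA", "TAG", "TGA"}

def find_uorfs(utr5, max_peptide_len=100):
    """Return (start, end, peptide_length) for each short uORF in the 5'UTR."""
    hits = []
    for i in range(len(utr5) - 2):
        if utr5[i:i + 3] != "ATG":
            continue
        for j in range(i + 3, len(utr5) - 2, 3):      # walk codon by codon
            if utr5[j:j + 3] in STOPS:
                aa_len = (j - i) // 3                  # peptide length incl. Met
                if aa_len <= max_peptide_len:
                    hits.append((i, j + 3, aa_len))
                break                                  # first in-frame stop ends the ORF
    return hits

example_utr = "GGCAGATGGCCTTTAAGTGACGGATGCCCTAA"                # made-up sequence
print(find_uorfs(example_utr))
```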

Keywords: bioinformatics, HEK293 cells, liquid chromatography-mass spectrometry, ProteinPilot, Strong Cation Exchange Chromatography, SDS-PAGE, sPEPs

Procedia PDF Downloads 188
25495 Non-Isothermal Stationary Laminar Oil Flow Numerical Simulation

Authors: Daniyar Bossinov

Abstract:

This paper considers a non-isothermal stationary waxy crude oil flow in a two-dimensional axisymmetric pipe with the transition of a Newtonian fluid to a non-Newtonian fluid. The viscosity and yield stress of waxy crude oil are highly dependent on temperature changes. During the hot pumping of waxy crude oil through a buried pipeline, a non-isothermal flow occurs due to heat transfer to the surrounding soil. This leads to a decrease in flow temperature, an increase in viscosity, the appearance of yield stress, the crystallization of wax, and the deposition of solid particles on the pipeline's inner wall. The deposition of solid oil particles reduces the pipeline flow area and leads to the appearance of a stagnant zone that acts as thermal insulation in the near-wall area. The properties of the waxy crude oil change: at low temperatures, the Newtonian fluid transitions to a non-Newtonian fluid. One-dimensional modeling of a non-isothermal waxy crude oil flow in a two-dimensional axisymmetric pipeline, with the traditional averaging of temperature and velocity over the pipeline cross-section, cannot explain these physical phenomena. Therefore, in this work, a two-dimensional model of the flow and heat transfer of waxy oil is constructed. The calculated data show the transition of the Newtonian fluid to a non-Newtonian fluid due to the heat exchange of the waxy oil with the environment.
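
A minimal sketch of the temperature-dependent rheology described above is given below: an Arrhenius-type viscosity and a yield stress that appears below an assumed wax appearance temperature (WAT). All coefficients are illustrative assumptions, not the paper's fitted values.

```python
# Temperature-dependent viscosity and yield stress for a waxy oil (illustrative).
import math

MU_REF, T_REF, B = 0.05, 320.0, 2000.0   # Pa.s, K, K (assumed fit parameters)
T_WAT = 305.0                            # wax appearance temperature [K] (assumed)

def viscosity(T):
    """Arrhenius-type temperature dependence of viscosity."""
    return MU_REF * math.exp(B * (1.0 / T - 1.0 / T_REF))

def yield_stress(T):
    """Zero above the WAT (Newtonian); grows as the oil cools below it."""
    return 5.0 * (T_WAT - T) if T < T_WAT else 0.0    # Pa, assumed slope

for T in (330.0, 310.0, 295.0):
    print(f"T={T} K  mu={viscosity(T):.3f} Pa.s  tau_y={yield_stress(T):.1f} Pa")
```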

Keywords: non-isothermal laminar flow, waxy crude oil, stagnant zone, yield stress

Procedia PDF Downloads 29
25494 Elderly Health Care Process by Community Participation: A Sub-District in the Lower Northern Region of Thailand

Authors: Amaraporn Puraya, Roongtiva Boonpracom, Somsak Thojampa, Sirikanok Klankhajhon, Kittisak Kumpeera

Abstract:

The objective of this qualitative research was to study the elderly health care process by community participation. Data were collected by qualitative research methods, including secondary data study, observation, in-depth interviews, and focus group discussions, and were analyzed by content analysis, reflection and review of information. The research results pointed out that the elderly health care process by community participation consisted of 2 parts, namely the community participation development process in elderly health care and the outcomes of that development process. The community participation development process consisted of 4 steps as follows: 1) building the leadership team, an important social capital of the community, which started from searching for both formal and informal leaders by giving the opportunity for public participation and creating clear agreements defining roles, duties and responsibilities; 2) investigating the problems and the needs of the community; 3) designing the elderly health care activities under the concept of self-care potential development of the elderly through participation in community forums and meetings to exchange knowledge with common goals, plans and operation; and 4) developing a sustainable health care agreement at the local level, starting from opening communication channels to create awareness and participation in various activities at both individual and group levels as well as pushing activities/projects into the community development plan consistent with the local administration policy. The outcomes of the participation development process were as follows. 1) There was integration of the elderly in carrying out elderly health care activities/projects in the community, managed by the elderly themselves. 2) The service system was changed from a passive to a proactive one, focusing on health promotion rather than treating diseases or illnesses. 3) The registered nurses / public health officers can provide care for the elderly with chronic illnesses through the implementation of elderly health care activities/projects so that the elderly can access the services more easily. 4) The local government organization became the main mechanism in driving the elderly health care process by community participation.

Keywords: elderly health care process, community participation, elderly, Thailand

Procedia PDF Downloads 214
25493 Causal Relationship between Macro-Economic Indicators and Fund Unit Price Behaviour: Evidence from Malaysian Equity Unit Trust Fund Industry

Authors: Anwar Hasan Abdullah Othman, Ahamed Kameel, Hasanuddeen Abdul Aziz

Abstract:

In this study, an attempt has been made to investigate the relationship, specifically the causal relationship, between the unit prices of Islamic equity unit trust funds, measured by fund NAV, and selected macroeconomic variables of the Malaysian economy by using the VECM causality test and the Granger causality test. Monthly data from January 2006 to December 2012 have been used for all the variables. The findings of the study show that the industrial production index, political elections and the financial crisis are the only variables having a unidirectional causal relationship with the fund unit price. However, global oil prices have a bidirectional causal relationship with fund NAV. Thus, it is concluded that the equity unit trust fund industry in Malaysia is an inefficient market with respect to the industrial production index, global oil prices, political elections and the financial crisis. However, the market is approaching informational efficiency at least with respect to four macroeconomic variables: the treasury bill rate, money supply, foreign exchange rate and corruption index.
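
A hedged sketch of the Granger causality step is shown below, using statsmodels; the file name and column names are hypothetical placeholders for the monthly 2006-2012 series.

```python
# Granger causality test between one macroeconomic series and the fund NAV.
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

df = pd.read_csv("fund_macro_monthly.csv", parse_dates=["date"], index_col="date")
pair = df[["fund_nav", "industrial_production"]].pct_change().dropna()

# Tests whether the 2nd column (industrial production) Granger-causes the 1st (fund NAV)
res = grangercausalitytests(pair, maxlag=6)
print(res[1][0]["ssr_ftest"][1])   # p-value of the lag-1 F-test
```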

Keywords: fund unit price, unit trust industry, Malaysia, macroeconomic variables, causality

Procedia PDF Downloads 470
25492 Application of Artificial Neural Network Technique for Diagnosing Asthma

Authors: Azadeh Bashiri

Abstract:

Introduction: Lack of proper diagnosis and inadequate treatment of asthma lead to physical and financial complications. This study aimed to use data mining techniques and to create a neural-network-based intelligent system for the diagnosis of asthma. Methods: The study population consists of patients who had visited one of the lung clinics in Tehran. Data were analyzed using the SPSS statistical tool, and Pearson's chi-square coefficient was the basis of decision making for ranking the data. The neural network considered is trained using the backpropagation learning technique. Results: According to the analysis performed with SPSS to select the top factors, 13 effective factors were selected; the data were combined in various forms to build different models for training and testing the networks, and in all the different configurations the network was able to predict 100% of the cases correctly. Conclusion: Using data mining methods before designing the structure of the system, in order to reduce the data dimension and choose the data optimally, leads to a more accurate system. Therefore, considering data mining approaches is necessary given the nature of medical data.
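
The sketch below mirrors the pipeline described above (chi-square-based ranking of factors followed by a backpropagation-trained neural network) with scikit-learn; the clinical data are replaced by random stand-ins.

```python
# Chi-square feature selection followed by an MLP trained with backpropagation.
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
X = rng.integers(0, 5, size=(300, 40)).astype(float)   # stand-in coded clinical factors
y = (X[:, 0] + X[:, 3] > 4).astype(int)                # stand-in asthma / no-asthma label

selector = SelectKBest(chi2, k=13)                     # keep 13 factors, as in the abstract
X_sel = selector.fit_transform(X, y)

X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.3, random_state=1)
mlp = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, mlp.predict(X_te)))
```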

Keywords: asthma, data mining, Artificial Neural Network, intelligent system

Procedia PDF Downloads 274
25491 Interpreting Privacy Harms from a Non-Economic Perspective

Authors: Christopher Muhawe, Masooda Bashir

Abstract:

With the growth of Internet and Communication Technology (ICT), the virtual world has become the new normal. At the same time, there is an unprecedented collection of massive amounts of data by both private and public entities. Unfortunately, this increase in data collection has gone in tandem with an increase in data misuse and data breaches. Regrettably, the majority of data breach and data misuse claims have been unsuccessful in the United States courts because of the failure to prove direct injury to physical or economic interests. The requirement to express data privacy harms from an economic or physical stance negates the fact that not all data harms are physical or economic in nature. The challenge is compounded by the fact that data breach harms and risks do not attach immediately. This research uses a descriptive and normative approach to show that not all data harms can be expressed in economic or physical terms. Expressing privacy harms purely from an economic or physical perspective negates the fact that data insecurity may result in harms which run counter to the functions of privacy in our lives: the promotion of liberty, selfhood, autonomy and human social relations, and the furtherance of the existence of a free society. There is no economic value that can be placed on these functions of privacy. The proposed approach addresses data harms from a psychological and social perspective.

Keywords: data breach and misuse, economic harms, privacy harms, psychological harms

Procedia PDF Downloads 197
25490 Safe and Scalable Framework for Participation of Nodes in Smart Grid Networks in a P2P Exchange of Short-Term Products

Authors: Maciej Jedrzejczyk, Karolina Marzantowicz

Abstract:

The traditional utility value chain has been transformed over the last few years into unbundled markets. The increased distributed generation of energy is one of the considerable challenges faced by Smart Grid networks. New sources of energy introduce a volatile demand response which has a considerable impact on traditional middlemen in the E&U market. The purpose of this research is to search for ways to allow near-real-time electricity markets to transact surplus energy based on accurate, time-synchronous measurements. The proposed framework evaluates the use of secure peer-to-peer (P2P) communication and distributed transaction ledgers to provide a flat hierarchy and to allow real-time insights into present and forecasted grid operations, as well as the state and health of the network. The objective is to achieve dynamic grid operations with more efficient resource usage, higher security of supply and a longer grid infrastructure life cycle. The methods used in this study are based on a comparative analysis of different distributed ledger technologies in terms of scalability, transaction performance, pluggability with external data sources, data transparency, privacy, end-to-end security and adaptability to various market topologies. An intended output of this research is the design of a framework for a safer, more efficient and scalable Smart Grid network which bridges the gap between the traditional components of the energy network and individual energy producers. The results of this study are ready for detailed measurement testing, a likely follow-up in separate studies. New Smart Grid platforms achieving measurable efficiencies will allow for the development of new types of grid KPIs, multi-smart-grid branches, markets, and businesses.
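
As a toy illustration (not the authors' framework), the sketch below shows the kind of append-only, hash-linked transaction record a P2P energy exchange could keep; the field names and hashing scheme are assumptions for illustration only.

```python
# Minimal hash-linked ledger of P2P energy trades.
import hashlib, json, time

ledger = []

def append_trade(seller, buyer, kwh, price, ts=None):
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    record = {
        "seller": seller, "buyer": buyer, "kwh": kwh, "price": price,
        "timestamp": ts or time.time(), "prev_hash": prev_hash,
    }
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(record)

append_trade("prosumer_17", "node_03", kwh=2.5, price=0.11)
append_trade("prosumer_08", "node_03", kwh=1.0, price=0.10)

# Any peer can check that each record points to the hash of the previous one.
print(all(ledger[i]["prev_hash"] == ledger[i - 1]["hash"] for i in range(1, len(ledger))))
```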

Keywords: autonomous agents, distributed computing, distributed ledger technologies, large scale systems, micro grids, peer-to-peer networks, self-organization, self-stabilization, smart grids

Procedia PDF Downloads 302
25489 Machine Learning Analysis of Student Success in Introductory Calculus Based Physics I Course

Authors: Chandra Prayaga, Aaron Wade, Lakshmi Prayaga, Gopi Shankar Mallu

Abstract:

This paper presents the use of machine learning algorithms to predict the success of students in an introductory physics course. Data having 140 rows pertaining to the performance of two batches of students was used. The lack of sufficient data to train robust machine learning models was compensated for by generating synthetic data similar to the real data. CTGAN and CTGAN with Gaussian Copula (Gaussian) were used to generate synthetic data, with the real data as input. To check the similarity between the real data and each synthetic dataset, pair plots were made. The synthetic data was used to train machine learning models using the PyCaret package. For the CTGAN data, the Ada Boost Classifier (ADA) was found to be the ML model with the best fit, whereas the CTGAN with Gaussian Copula yielded Logistic Regression (LR) as the best model. Both models were then tested for accuracy with the real data. ROC-AUC analysis was performed for all the ten classes of the target variable (Grades A, A-, B+, B, B-, C+, C, C-, D, F). The ADA model with CTGAN data showed a mean AUC score of 0.4377, but the LR model with the Gaussian data showed a mean AUC score of 0.6149. ROC-AUC plots were obtained for each Grade value separately. The LR model with Gaussian data showed consistently better AUC scores compared to the ADA model with CTGAN data, except in two cases of the Grade value, C- and A-.
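
A minimal sketch of the per-class (one-vs-rest) ROC-AUC evaluation described above is given below using scikit-learn; the classifier and the generated data are placeholders, whereas the study trained models on CTGAN and Gaussian-Copula synthetic data via PyCaret.

```python
# Per-grade one-vs-rest ROC-AUC for a ten-class target variable.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import label_binarize

grades = ["A", "A-", "B+", "B", "B-", "C+", "C", "C-", "D", "F"]
X, y = make_classification(n_samples=1000, n_features=12, n_informative=8,
                           n_classes=10, n_clusters_per_class=1, random_state=0)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=2000).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)

y_bin = label_binarize(y_te, classes=range(10))
for k, grade in enumerate(grades):
    print(grade, roc_auc_score(y_bin[:, k], proba[:, k]))   # one-vs-rest AUC per grade
```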

Keywords: machine learning, student success, physics course, grades, synthetic data, CTGAN, gaussian copula CTGAN

Procedia PDF Downloads 44
25488 Data Access, AI Intensity, and Scale Advantages

Authors: Chuping Lo

Abstract:

This paper presents a simple model demonstrating that ceteris paribus countries with lower barriers to accessing global data tend to earn higher incomes than other countries. Therefore, large countries that inherently have greater data resources tend to have higher incomes than smaller countries, such that the former may be more hesitant than the latter to liberalize cross-border data flows to maintain this advantage. Furthermore, countries with higher artificial intelligence (AI) intensity in production technologies tend to benefit more from economies of scale in data aggregation, leading to higher income and more trade as they are better able to utilize global data.

Keywords: digital intensity, digital divide, international trade, economies of scale

Procedia PDF Downloads 68
25487 Secured Transmission and Reserving Space in Images Before Encryption to Embed Data

Authors: G. R. Navaneesh, E. Nagarajan, C. H. Rajam Raju

Abstract:

Nowadays, multimedia data are used to store secure information. All previous methods allocate space in the image for data embedding after encryption. In this paper, we propose a novel method that reserves room in the image, surrounded by a boundary, before encryption with a traditional RDH algorithm, which makes it easy for the data hider to reversibly embed data in the encrypted images. The proposed method can achieve real-time performance; that is, data extraction and image recovery are free of any error. A secure transmission process is also discussed in this paper, which improves efficiency tenfold compared to the other processes discussed.
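
The toy sketch below shows plain least-significant-bit (LSB) embedding, the basic operation underlying the reserving-room-before-encryption idea; the full RDH scheme (boundary handling, encryption, error-free recovery) is not reproduced here.

```python
# Embed and extract a short bit string in the LSB plane of a grayscale image.
import numpy as np

def embed_lsb(image, bits):
    flat = image.flatten().copy()
    if len(bits) > flat.size:
        raise ValueError("payload too large for cover image")
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | np.array(bits, dtype=np.uint8)
    return flat.reshape(image.shape)

def extract_lsb(image, n_bits):
    return (image.flatten()[:n_bits] & 1).tolist()

cover = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)   # stand-in cover image
payload = [1, 0, 1, 1, 0, 0, 1, 0]

stego = embed_lsb(cover, payload)
print(extract_lsb(stego, len(payload)) == payload)               # True: lossless extraction
```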

Keywords: secure communication, reserving room before encryption, least significant bits, image encryption, reversible data hiding

Procedia PDF Downloads 412
25486 Identity Verification Using k-NN Classifiers and Autistic Genetic Data

Authors: Fuad M. Alkoot

Abstract:

DNA data have been used in forensics for decades. However, current research looks at using DNA as a biometric identity verification modality. The goal is to improve the speed of identification. We aim at using gene data that were initially used for autism detection to find whether, and how accurately, these data can be used for identification applications. Mainly, our goal is to find out whether our data preprocessing technique yields data useful as a biometric identification tool. We experiment with using the nearest neighbor classifier to identify subjects. Results show that the optimal classification rate is achieved when the test set is corrupted by normally distributed noise with zero mean and a standard deviation of 1. The classification rate remains close to optimal at higher noise standard deviations, up to 3. This shows that the data can be used for identity verification with high accuracy using a simple classifier such as the k-nearest neighbor (k-NN).
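
A minimal sketch of the verification experiment is shown below: a nearest-neighbor classifier identifying subjects while the test set is corrupted by zero-mean Gaussian noise of increasing standard deviation; the feature matrix is a random stand-in for the preprocessed genetic data.

```python
# k-NN identification under additive Gaussian noise on the test set.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n_subjects, n_per, n_feat = 20, 20, 30
centers = rng.normal(scale=5.0, size=(n_subjects, n_feat))   # stand-in subject profiles
X = np.vstack([c + rng.normal(size=(n_per, n_feat)) for c in centers])
y = np.repeat(np.arange(n_subjects), n_per)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
knn = KNeighborsClassifier(n_neighbors=1).fit(X_tr, y_tr)

for sigma in (0.0, 1.0, 3.0):                                # noise levels from the abstract
    noisy = X_te + rng.normal(scale=sigma, size=X_te.shape)
    print(sigma, accuracy_score(y_te, knn.predict(noisy)))
```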

Keywords: biometrics, genetic data, identity verification, k nearest neighbor

Procedia PDF Downloads 258
25485 Preparing Education Enter the ASEAN Community: The Case Study of Suan Sunandha Rajabhat University

Authors: Sakapas Saengchai, Vilasinee Jintalikhitdee, Mathinee Khongsatid, Nattapol Pourprasert

Abstract:

This paper studied the preparation of education for entering the ASEAN Community by the year 2015. The Ministry of Education has a policy on the ASEAN Charter, including the dissemination of information to create a good attitude about ASEAN, the development of students' skills as appropriate, the development of educational standards to prepare for the liberalization of education in the region, and youth development as a vital resource in advancing the ASEAN community. In preparing for the liberalization of education, the Commission on Higher Education (CHE) has drawn up Thailand's strategy for becoming part of ASEAN and supporting free trade in higher education services: increasing graduate capability to reach international standards, strengthening higher education institutions, and enhancing the roles of educational institutions in the ASEAN community. This is a main factor in setting up the 15-year long-term education framework, volume no. 2, as well as in promoting Thailand as a center for education for the neighboring countries. It also includes developing the data centers of higher education institutions in the region, while the short-term plan is to supplement the curriculum with ASEAN community content. Moreover, it provides the teaching of English and other languages used in the region and creates partnerships with the ASEAN countries to exchange academic staff and students, research, training, the development of joint programs, and system tools in higher education.

Keywords: ASEAN community, education, institution, dissemination of information

Procedia PDF Downloads 472
25484 A Review on Intelligent Systems for Geoscience

Authors: R. Palson Kennedy, P. Kiran Sai

Abstract:

This article introduces machine learning (ML) researchers to the hurdles that geoscience problems present, as well as to the opportunities for improvement in both ML and the geosciences. To meet that need, the article presents a review from the data life cycle perspective. Numerous facets of the geosciences present unique difficulties for the study of intelligent systems. Geoscience data are notoriously difficult to analyze since they are frequently unpredictable, intermittent, sparse, multi-resolution, and multi-scale. The first half addresses the essential concepts and theoretical underpinnings of data science, while the second section covers key themes and shared experiences from current publications focused on each stage of the data life cycle. Finally, themes such as open science, smart data, and team science are considered.

Keywords: data science, intelligent systems, machine learning, big data, data life cycle, recent developments, geoscience

Procedia PDF Downloads 136
25483 Using Statistical Significance and Prediction to Test Long/Short Term Public Services and Patients' Cohorts: A Case Study in Scotland

Authors: Raptis Sotirios

Abstract:

Health and social care (HSc) services planning and scheduling are facing unprecedented challenges due to pandemic pressure and also suffer from unplanned spending that has been negatively impacted by the global financial crisis. Data-driven approaches can help to improve policies and to plan and design service provision schedules, using algorithms to assist healthcare managers in facing unexpected demands with fewer resources. The paper discusses services packing using statistical significance tests and machine learning (ML) to evaluate demand similarity and coupling. This is achieved by predicting the range of the demand (class) using ML methods such as CART, random forests (RF), and logistic regression (LGR). The significance tests, the Chi-squared test and the Student test, are used on data over a 39-year span for which HSc services data exist for services delivered in Scotland. The demands are probabilistically associated through statistical hypotheses that assume, as a null hypothesis, that the target service's demands are statistically dependent on other demands. This linkage can be confirmed or not by the data. Complementarily, ML methods are used to linearly predict the target demands from the statistically found associations and to extend the linear dependence of the target's demand to independent demands, thus forming groups of services. The statistical tests confirm the ML couplings, making the prediction also statistically meaningful and proving that a target service can be matched reliably to other services, and ML shows that these relationships can also be linear ones. Zero padding was used for missing years' records and illustrated such relationships better, both for limited years and over the entire span, offering long-term data visualizations, while limited-year groups explained how well patient numbers can be related over short periods or can change over time, as opposed to behaviors across more years. The prediction performance of the associations is measured using the Receiver Operating Characteristic (ROC) AUC and ACC metrics as well as the statistical tests, Chi-squared and Student. Co-plots and comparison tables for RF, CART, and LGR, as well as p-values and Information Exchange (IE), are provided, showing the specific behavior of the ML methods and of the statistical tests, and the behavior at different learning ratios. The impact of k-NN and of cross-correlation and C-Means initial groupings is also studied over limited years and over the entire span. It was found that CART was generally behind RF and LGR, but in some interesting cases LGR reached an AUC of 0, falling below CART, while the ACC was as high as 0.912, showing that ML methods can be confused by padding or by data irregularities or outliers. On average, 3 linear predictors were sufficient; LGR was found to compete well with RF, and CART followed with the same performance at higher learning ratios. Services were packed only when the significance level (p-value) of their association coefficient was more than 0.05. Social factor relationships were observed between home care services and the treatment of old people, birth weights, alcoholism, drug abuse, and emergency admissions. The work found that different HSc services can be packed well into plans of limited years, across various service sectors and learning configurations, as confirmed using statistical hypotheses.
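
A compact sketch of the two steps described above (a chi-squared association test between discretised demand series, followed by a logistic-regression prediction scored with ROC-AUC) is given below; the yearly series are synthetic stand-ins for the Scottish HSc data.

```python
# Chi-squared association between two demand series, then LGR prediction with ROC-AUC.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
years = 39
candidate = rng.normal(size=years)
target = 0.8 * candidate + rng.normal(scale=0.5, size=years)   # correlated demands

t_cls = pd.qcut(target, 2, labels=[0, 1]).astype(int)          # demand range classes
c_cls = pd.qcut(candidate, 2, labels=[0, 1]).astype(int)

chi2_stat, p, dof, expected = chi2_contingency(pd.crosstab(t_cls, c_cls))
print("chi-squared p-value:", p)                               # association test

lgr = LogisticRegression().fit(candidate.reshape(-1, 1), t_cls)
scores = lgr.predict_proba(candidate.reshape(-1, 1))[:, 1]
print("ROC-AUC:", roc_auc_score(t_cls, scores))                # prediction quality
```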

Keywords: class, cohorts, data frames, grouping, prediction, probability, services

Procedia PDF Downloads 236
25482 Participation of Juvenile with Driven of Tobacco Control in Education Institute: Case Study of Suan Sunandha Rajabhat University

Authors: Sakapas Saengchai

Abstract:

This paper studied the participation of juveniles in driving tobacco control in an educational institution, using Suan Sunandha Rajabhat University as a case study. It is qualitative research whose objective is to study the participation of juveniles in driving tobacco control at the university, as guidance for developing juvenile participation in tobacco control in educational institutions so that the university also becomes a cigarette-free university. Qualitative methods were used for data collection: participant observation, in-depth interviews, group conversations with student representatives of each faculty and college, and exchanges of student opinions. The results show that participation in tobacco control has 3 parts: 1) participation in tobacco control campaigns, 2) academic training and cigarette-free university activities, and 3) acting as juvenile role models in tobacco control. As guidelines for juvenile involvement in driving tobacco control, universities should promote tobacco control activities, continue the smoking reduction campaign, provide a clearly signed, designated area for smokers within each faculty/college, and develop a network of non-smoking model students. This plays a key role in coordinating university students in driving towards a cigarette-free university, as well as in strengthening the community inside and outside the area for a good society and the quality of the country.

Keywords: participation, juvenile, tobacco control, institute

Procedia PDF Downloads 274
25481 Engaged Employee: Re-Examine the Effects of Psychological Conditions on Employee Outcomes

Authors: Muncharee Phaobthip

Abstract:

In this research, the researcher re-examines the mediating effect of employee engagement between its antecedents and consequences in order to investigate the relations among leadership practices, employment branding and employee engagement based on social exchange theory. As such, the researcher has four objectives: first, to study the effects of leadership practices on employment branding, employee engagement and work intention; second, to examine the effects of employer brand perception on employee engagement and work intention; third, to examine the effects of employee engagement on work intention; and fourth, to inquire into the responses regarding work intention. The researcher constituted a sample population of 535 employees of a Thai hotel chain located in four regions of the Kingdom of Thailand (Thailand). The researcher utilized a mixed-methods approach divided into quantitative and qualitative research investigatory phases, respectively. In the quantitative phase, the researcher collected the germane data from the 535 members of the sample population through the use of a questionnaire as a research instrument. In the qualitative phase, relevant data were obtained through in-depth interviews with three subgroups of members of the sample population. These three subgroups consisted of twelve hotelier experts, six employees at the administrator level, and operational-level employees. Focus group discussions were held with discussants from these three subgroups. The findings are as follows. Leadership practices showed positive effects on employment branding, employee engagement, and work intention. Employment branding displayed positive effects on employee engagement and work intention. Employee engagement had positive effects on work intention. However, in the analysis of the equation, the researcher confirmed the important role of employee engagement as a mediating factor between its antecedent and consequence factors. This provides benefits in that it augments the body of knowledge devoted to the fostering of employee engagement with respect to psychological conditions. In conclusion, the researcher found that the value co-creation between leaders, employers and employees had positive effects on employee outcomes, leading to business outcomes in accordance with the rule of reciprocity.

Keywords: antecedents, employee engagement, psychological conditions, work intention

Procedia PDF Downloads 113
25480 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh

Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila

Abstract:

Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. Therefore, it is important for an organization to ensure that the data they use is of high quality. This is where the concept of data mesh comes in. Data mesh is an organizational and architectural decentralized approach to data management that can help organizations improve the quality of data. The concept of data mesh was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing the reliance on centralized data teams and allowing domain experts to take charge of their data. This paper intends to discuss how a set of elements, including data mesh, are tools capable of increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata, thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity in responsibilities. With data mesh, domain experts are responsible for managing their own data, which can help provide clarity in roles and responsibilities and improve data quality. Additionally, data mesh can also contribute to a new form of organization that is more agile and adaptable. By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can help improve overall performance by allowing better insights into business as an effect of better reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring, and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This will help in identifying and addressing data quality problems in quick time, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization. This can lead to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, like data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts mentioned above are illustrated by AEKIDEN experience feedback. AEKIDEN is an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing their experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.

Keywords: data culture, data-driven organization, data mesh, data quality for business success

Procedia PDF Downloads 137
25479 Prediction of Boundary Shear Stress with Flood Plains Enlargements

Authors: Spandan Sahu, Amiya Kumar Pati, Kishanjit Kumar Khatua

Abstract:

The river is our main source of water and is a form of open channel flow, and flow in an open channel presents many complex phenomena that need to be tackled, such as critical flow conditions, boundary shear stress, and depth-averaged velocity. The development of society depends more or less solely upon the flow of rivers. Rivers are major sources of many sediments and specific ingredients which are essential for human beings. During floods, part of the flow is carried by the simple main channel and the rest is carried by the flood plains. For such compound asymmetric channels, the flow structure becomes complicated due to the momentum exchange between the main channel and the adjoining flood plains. The distribution of boundary shear in the subsections gives insight into the momentum transfer across the interface between the main channel and the flood plains. Experimentally, obtaining accurate data is very difficult because of the complexity of the problem. Hence, CES software has been used to tackle the complex processes and determine the shear stresses at different sections of an open channel having asymmetric flood plains on both sides of the main channel, and the results are compared with those for symmetric flood plains for various geometrical shapes and flow conditions. Error analysis is also performed to assess the degree of accuracy of the implemented model.

Keywords: depth average velocity, non prismatic compound channel, relative flow depth, velocity distribution

Procedia PDF Downloads 177
25478 Big Data Analysis with RHadoop

Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim

Abstract:

It is almost impossible to store or analyze big data, which is increasing exponentially, with traditional technologies. Hadoop is a new technology that makes this possible. The R programming language is by far the most popular statistical tool for big data analysis based on distributed processing with Hadoop technology. With RHadoop, which integrates R with the Hadoop environment, we implemented parallel multiple regression analysis on actual data of different sizes. Experimental results showed that our RHadoop system became much faster as the number of data nodes increased. We also compared the performance of our RHadoop with the lm function and the biglm package available for big-memory data. The results showed that our RHadoop was faster than the other packages owing to parallel processing, with the number of map tasks increasing as the size of the data increases.
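
Below is a conceptual Python analogue of the map-reduce regression idea (the study itself uses R with RHadoop): each data chunk contributes partial X'X and X'y sums, which are then combined and solved once; multiprocessing stands in for Hadoop data nodes here.

```python
# Map-reduce style multiple regression: per-chunk normal-equation pieces are
# computed in parallel ("map") and combined into one solve ("reduce").
import numpy as np
from multiprocessing import Pool

def partial_sums(chunk):
    X, y = chunk
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])      # add intercept column
    return Xb.T @ Xb, Xb.T @ y                          # per-chunk X'X and X'y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100_000, 5))
    y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(size=100_000)

    chunks = list(zip(np.array_split(X, 8), np.array_split(y, 8)))
    with Pool(4) as pool:
        parts = pool.map(partial_sums, chunks)          # "map" across worker processes

    XtX = sum(p[0] for p in parts)                      # "reduce": combine partial sums
    Xty = sum(p[1] for p in parts)
    beta = np.linalg.solve(XtX, Xty)
    print(beta)                                         # intercept followed by coefficients
```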

Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop

Procedia PDF Downloads 437
25477 Production of Human BMP-7 with Recombinant E. coli and B. subtilis

Authors: Jong Il Rhee

Abstract:

The polypeptide representing the mature part of human BMP-7 was cloned and efficiently expressed in Escherichia coli and Bacillus subtilis, which showed a clear band for hBMP-7, a homodimeric protein with an apparent molecular weight of 15.4 kDa. Recombinant E. coli produced 111 pg of hBMP-7/mg of protein upon IPTG induction. Recombinant B. subtilis also produced 350 pg of hBMP-7/mL of culture medium. The hBMP-7 was purified in 2 steps using an FPLC system with an ion exchange column and a gel filtration column. The hBMP-7 produced in this work also stimulated alkaline phosphatase (ALP) activity in a dose-dependent manner, i.e., 2.5- and 8.9-fold at 100 and 300 ng hBMP-7/mL, respectively, and showed intact biological activity.

Keywords: B. subtilis, E. coli, fermentation, hBMP-7

Procedia PDF Downloads 443
25476 Efficient Utilization of Biomass for Bioenergy in Environmental Control

Authors: Subir Kundu, Sukhendra Singh, Sumedha Ojha, Kanika Kundu

Abstract:

The continuous decline of petroleum and natural gas reserves and the nonlinear rise of oil prices have brought about a realisation of the need for a change in our perpetual dependence on fossil fuels. The day-to-day increase in consumption of crude and petroleum products has had a considerable impact on our foreign exchange reserves. Hence, an alternative resource for the conversion of energy (both liquid and gas) is essential as a substitute for conventional fuels. Biomass is the alternative solution for the present scenario. Biomass can be converted into both liquid and gaseous fuels as well as other feedstocks for the industries.

Keywords: bioenergy, biomass conversion, biorefining, efficient utilisation of night soil

Procedia PDF Downloads 407
25475 A Mutually Exclusive Task Generation Method Based on Data Augmentation

Authors: Haojie Wang, Xun Li, Rui Yin

Abstract:

In order to solve memorization overfitting in the meta-learning MAML algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution in the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of the computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex tasks. The experiments show that the method of generating mutually exclusive tasks can effectively alleviate memorization overfitting in the meta-learning MAML algorithm.
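
A toy sketch of the mutually exclusive task idea is given below: the same inputs are paired with a permuted label assignment, so the generated task deliberately contradicts the original label distribution; the MAML training loop itself is not reproduced.

```python
# Generate a mutex task by remapping every class label to a different one.
import numpy as np

def make_mutex_task(X, y, n_classes, rng):
    permutation = rng.permutation(n_classes)          # candidate label remapping
    while np.any(permutation == np.arange(n_classes)):
        permutation = rng.permutation(n_classes)      # retry until no label keeps its meaning
    return X, permutation[y]

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 8))                          # one episode's support-set features
y = rng.integers(0, 4, size=32)                       # original labels (4-way task)

X_mutex, y_mutex = make_mutex_task(X, y, n_classes=4, rng=rng)
print(np.mean(y_mutex == y))                          # 0.0: every label was remapped
```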

Keywords: data augmentation, mutex task generation, meta-learning, text classification

Procedia PDF Downloads 94