Search results for: Distributed Algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5394

1734 Optimal Bayesian Chart for Controlling Expected Number of Defects in Production Processes

Authors: V. Makis, L. Jafari

Abstract:

In this paper, we develop an optimal Bayesian chart to control the expected number of defects per inspection unit in production processes with long production runs. We formulate this control problem in the optimal stopping framework. The objective is to determine the optimal stopping rule minimizing the long-run expected average cost per unit time considering partial information obtained from the process sampling at regular epochs. We prove the optimality of the control limit policy, i.e., the process is stopped and the search for assignable causes is initiated when the posterior probability that the process is out of control exceeds a control limit. An algorithm in the semi-Markov decision process framework is developed to calculate the optimal control limit and the corresponding average cost. Numerical examples are presented to illustrate the developed optimal control chart and to compare it with the traditional u-chart.
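The control-limit policy described above can be illustrated numerically. The following is a minimal sketch, not the paper's semi-Markov algorithm: defect counts per inspection unit are assumed Poisson with different means in and out of control, the posterior out-of-control probability is updated by Bayes' rule at each sampling epoch, and sampling stops once it exceeds a chosen control limit. All parameter values are hypothetical.

```python
import math

def posterior_update(p_prior, x, lam_in, lam_out, q):
    """One Bayesian update of the out-of-control probability.

    p_prior: prior probability the process is out of control
    x: observed defect count in the inspection unit (assumed Poisson)
    lam_in, lam_out: mean defects per unit when in / out of control
    q: per-epoch probability of a shift to the out-of-control state
    """
    # Predictive probability of being out of control before observing x
    p_pred = p_prior + (1 - p_prior) * q
    # Poisson likelihoods of the observed count under each state
    lik_out = math.exp(-lam_out) * lam_out ** x / math.factorial(x)
    lik_in = math.exp(-lam_in) * lam_in ** x / math.factorial(x)
    num = p_pred * lik_out
    return num / (num + (1 - p_pred) * lik_in)

def run_chart(counts, control_limit=0.9, lam_in=1.0, lam_out=4.0, q=0.05):
    """Return (epoch, posterior) at the first alarm, or (None, posterior)."""
    p = 0.0
    for t, x in enumerate(counts, start=1):
        p = posterior_update(p, x, lam_in, lam_out, q)
        if p > control_limit:
            return t, p
    return None, p
```

With these hypothetical rates, a run of low counts followed by high counts triggers the alarm only once the posterior crosses the limit, which is the control-limit behavior the abstract proves optimal.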

Keywords: Bayesian u-chart, economic design, optimal stopping, semi-Markov decision process, statistical process control

Procedia PDF Downloads 570
1733 Oil Reservoir Asphaltene Precipitation Estimation during CO2 Injection

Authors: I. Alhajri, G. Zahedi, R. Alazmi, A. Akbari

Abstract:

In this paper, an Artificial Neural Network (ANN) was developed to predict Asphaltene Precipitation (AP) during the injection of carbon dioxide into crude oil reservoirs. In this study, experimental data from six different oil fields were collected. Seventy percent of the data was used to develop the ANN model, and different ANN architectures were examined. A network with the Trainlm training algorithm was found to be the best network to estimate the AP. To check the validity of the proposed model, the model was used to predict the AP for the remaining thirty percent of the data, which was not used in training. The Mean Square Error (MSE) of the prediction was 0.0018, which confirms the excellent prediction capability of the proposed model. In the second part of this study, the ANN model predictions were compared with modified Hirschberg model predictions. The ANN was found to provide more accurate estimates than the modified Hirschberg model. Finally, the proposed model was employed to examine the effect of different operating parameters during gas injection on the AP. It was found that the AP is most sensitive to the reservoir temperature. Furthermore, increasing the carbon dioxide concentration in the liquid phase increases the AP.
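As an illustration of the general workflow, not the paper's data or its Trainlm (Levenberg-Marquardt) network, the sketch below trains a small feed-forward network on synthetic temperature/pressure/CO2 inputs with a 70/30 split, using scikit-learn's MLPRegressor. The target function and every numeric value here are invented for demonstration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
n = 300
T = rng.uniform(300, 400, n)     # temperature, K (hypothetical range)
P = rng.uniform(5, 30, n)        # pressure, MPa (hypothetical range)
co2 = rng.uniform(0.1, 0.6, n)   # CO2 mole fraction in the liquid phase
# Invented smooth AP target: rises with temperature, pressure, and CO2 fraction
ap = 0.02 * (T - 300) + 4.0 * co2 + 0.05 * P + rng.normal(0, 0.1, n)

X = np.column_stack([T, P, co2])
X_tr, X_te, y_tr, y_te = train_test_split(X, ap, train_size=0.7, random_state=0)
scaler = StandardScaler().fit(X_tr)

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
model.fit(scaler.transform(X_tr), y_tr)
pred = model.predict(scaler.transform(X_te))
mse = mean_squared_error(y_te, pred)
r2 = r2_score(y_te, pred)
```

The 70/30 split and MSE evaluation mirror the abstract's protocol; the architecture and optimizer are stand-ins, since scikit-learn does not provide Levenberg-Marquardt training.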

Keywords: artificial neural network, asphaltene, CO2 injection, Hirschberg model, oil reservoirs

Procedia PDF Downloads 363
1732 Research on Knowledge Graph Inference Technology Based on Proximal Policy Optimization

Authors: Yihao Kuang, Bowen Ding

Abstract:

With the increasing scale and complexity of knowledge graphs, modern knowledge graphs contain more and more types of entity, relationship, and attribute information. Therefore, in recent years, it has become a trend for knowledge graph inference to use reinforcement learning to deal with large-scale, incomplete, and noisy knowledge graphs and to improve the inference effect and interpretability. The Proximal Policy Optimization (PPO) algorithm uses a proximal policy update that allows for more extensive updates of policy parameters while constraining the update extent to maintain training stability. This characteristic enables PPO to converge to improved strategies more rapidly, often demonstrating enhanced performance early in the training process. Furthermore, PPO has the advantage of offline learning, effectively utilizing historical experience data for training and enhancing sample utilization. This means that even with limited resources, PPO can train efficiently for reinforcement learning tasks. Based on these characteristics, this paper aims to obtain a better and more efficient inference effect by introducing PPO into knowledge inference technology.
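The constrained policy update mentioned above rests on PPO's clipped surrogate objective. A minimal NumPy sketch of that objective (generic PPO, not the paper's knowledge-graph agent) is:

```python
import numpy as np

def ppo_clip_loss(ratio, advantage, eps=0.2):
    """PPO clipped surrogate loss for one batch (to be minimized).

    ratio: pi_new(a|s) / pi_old(a|s) for each sampled (state, action)
    advantage: advantage estimates for the same samples
    eps: clipping range; limits how far the new policy may move per update
    """
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1 - eps, 1 + eps) * advantage
    # PPO takes the pessimistic (elementwise minimum) objective, negated as a loss
    return -np.mean(np.minimum(unclipped, clipped))
```

Clipping the probability ratio to [1 - eps, 1 + eps] is precisely what "constraining the update extent" refers to: gradients vanish for samples whose ratio has already moved outside the trust band.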

Keywords: reinforcement learning, PPO, knowledge inference

Procedia PDF Downloads 235
1731 Development of Computational Approach for Calculation of Hydrogen Solubility in Hydrocarbons for Treatment of Petroleum

Authors: Abdulrahman Sumayli, Saad M. AlShahrani

Abstract:

For the hydrogenation process, knowing the solubility of hydrogen (H2) in hydrocarbons is critical to improving the efficiency of the process. We investigated the computation of H2 solubility in four heavy crude oil feedstocks using machine learning techniques. Temperature, pressure, and feedstock type were considered as the inputs to the models, while the hydrogen solubility was the sole response. Specifically, we employed three different models: Support Vector Regression (SVR), Gaussian Process Regression (GPR), and Bayesian Ridge Regression (BRR). To achieve the best performance, the hyper-parameters of these models were optimized using the Whale Optimization Algorithm (WOA). We evaluated the models using a dataset of solubility measurements in various feedstocks, and we compared their performance based on several metrics. Our results show that the SVR model tuned with WOA achieves the best performance overall, with an RMSE of 1.38 × 10⁻² and an R-squared of 0.991. These findings suggest that machine learning techniques can provide accurate predictions of hydrogen solubility in different feedstocks, which could be useful in the development of hydrogen-related technologies. In addition, the solubility of hydrogen in the four heavy oil fractions is estimated over temperature and pressure ranges of 150 °C–350 °C and 1.2 MPa–10.8 MPa, respectively.
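The whale optimization algorithm used for hyper-parameter tuning can be sketched in a few lines. The following is a simplified generic WOA minimizing a toy objective, not the paper's SVR tuning loop; the population size, iteration count, and bounds are arbitrary choices for illustration.

```python
import numpy as np

def woa_minimize(f, dim, bounds, n_whales=20, iters=200, seed=0):
    """Minimal whale optimization algorithm sketch for box-constrained minimization."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (n_whales, dim))
    fitness = np.apply_along_axis(f, 1, X)
    best = X[np.argmin(fitness)].copy()
    for t in range(iters):
        a = 2 - 2 * t / iters                   # linearly decreases from 2 to 0
        for i in range(n_whales):
            A = 2 * a * rng.random(dim) - a
            C = 2 * rng.random(dim)
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1):       # exploit: encircle the best whale
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                           # explore: move around a random whale
                    rand = X[rng.integers(n_whales)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                               # spiral update toward the best
                D = np.abs(best - X[i])
                l = rng.uniform(-1, 1)
                X[i] = D * np.exp(l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lo, hi)
        fitness = np.apply_along_axis(f, 1, X)
        if fitness.min() < f(best):
            best = X[np.argmin(fitness)].copy()
    return best, f(best)
```

In the paper's setting, `f` would be a cross-validated SVR error as a function of its hyper-parameters rather than the toy sphere function used to test the sketch.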

Keywords: temperature, pressure variations, machine learning, oil treatment

Procedia PDF Downloads 65
1730 The Preparation and Training of Expert Studio Reviewers

Authors: Diane M. Bender

Abstract:

In design education, professional education is delivered in a studio, where students learn and understand their discipline. This learning methodology culminates in a final review, where students present their work before instructors and invited reviewers, known as jurors. These jurors are recognized experts who add a wide diversity of opinions in their feedback to students. This feedback can be provided in multiple formats, mainly a verbal critique of the work. To better understand how these expert reviewers prepare for a studio review, a survey was distributed to reviewers at a multi-disciplinary design school within the United States. Five design disciplines are involved in this case study: architecture, graphic design, industrial design, interior design, and landscape architecture. Respondents (n=122) provided information about whether and how they received training in critiquing and participating in a final review. Common forms of training included mentorship, behavior modeled by other designers or past professors, workshops on critique given by the instructing faculty prior to the crit session, and experience as a practicing design professional. Respondents also gave feedback about the extent to which the instructor provided course materials prior to the review to help them prepare for student interaction. Finally, respondents indicated whether, and in what format, they had interacted with students prior to the final review. Typical responses included participation in studio desk crits, serving as a midterm jury member, meetings with students, and email or social media correspondence. While the focus of this study is the studio review, the findings are equally applicable to other disciplines. Suggestions will be provided on how to improve the preparation of guests in the learning process and how their interaction can positively influence student engagement.

Keywords: critique, design, education, evaluation, juror

Procedia PDF Downloads 77
1729 Solving Dimensionality Problem and Finding Statistical Constructs on Latent Regression Models: A Novel Methodology with Real Data Application

Authors: Sergio Paez Moncaleano, Alvaro Mauricio Montenegro

Abstract:

This paper presents a novel statistical methodology for measuring and finding constructs in latent regression analysis. This approach uses the qualities of factor analysis on binary data with interpretations from Item Response Theory (IRT). In addition, based on the fundamentals of submodel theory and on a convergence of many ideas from IRT, we propose an algorithm not just to solve the dimensionality problem (nowadays an open discussion) but also to open a new research field that promises fairer and more realistic qualifications for examinees and a revolution in IRT and educational research. In the end, the methodology is applied to a real data set, presenting impressive results in terms of coherence, speed, and precision. Acknowledgments: This research was financed by Colciencias through the project 'Multidimensional Item Response Theory Models for Practical Application in Large Test Designed to Measure Multiple Constructs'; both authors belong to the SICS Research Group at Universidad Nacional de Colombia.

Keywords: item response theory, dimensionality, submodel theory, factorial analysis

Procedia PDF Downloads 368
1728 The Application of Enzymes on Pharmaceutical Products and Process Development

Authors: Reginald Anyanwu

Abstract:

Enzymes are biological molecules that significantly regulate the rate of almost all of the chemical reactions that take place within cells, and they have been widely used for product innovation. They are vital for life and serve a wide range of important functions in the body, such as aiding in digestion and metabolism. The present study was aimed at finding out the extent to which these biological molecules have been utilized by the pharmaceutical, food and beverage, and biofuel industries in commercial and scale-up applications. Taking into account the escalating business opportunities in this vertical, biotech firms have also been penetrating the enzymes industry, especially that of food. The aim of the study, therefore, was to find out how biocatalysis can be successfully deployed and how enzyme application can improve industrial processes. To achieve the purpose of the study, the researcher focused on the analytical tools that are critical for the scale-up implementation of enzyme immobilization, to ascertain the extent of increased product yield at minimum logistical burden and maximum market profitability for the environment and user. The researcher collected data from four pharmaceutical companies located in Anambra and Imo states of Nigeria. Questionnaire items were distributed to these companies. The researcher equally made personal observations on the applicability of these biological molecules to innovative products, since there is now a shifting trend toward the consumption of healthy, quality food. In conclusion, it was discovered that enzymes have been widely used for product innovation, though with variations in their applications. It was also found that pivotal contenders in the enzymes market have lately been making heavy investments in the development of innovative product solutions. It was recommended that the application of enzymes to innovative products be widely practiced.

Keywords: enzymes, pharmaceuticals, process development, quality food consumption, scale-up applications

Procedia PDF Downloads 137
1727 Bayesian System and Copula for Event Detection and Summarization of Soccer Videos

Authors: Dhanuja S. Patil, Sanjay B. Waykar

Abstract:

Event detection is one of the most important components of video data systems across many application areas, and it has recently attracted considerable interest from both practitioners and academics. While video event detection has been the subject of broad study efforts, considerably less existing work has considered multi-modal data and efficiency-related issues. During soccer matches, many doubtful situations arise that cannot be easily judged by the referee committee. A framework that objectively checks image sequences would prevent incorrect interpretations caused by errors or by the high velocity of events. Bayesian networks provide a structure for dealing with this uncertainty using a simple graphical structure together with probability calculus. We propose an efficient framework for analysing and summarizing soccer videos utilizing object-based features. The proposed work utilizes the t-cherry junction tree, a very recent development in probabilistic graphical models, to create a compact representation and a good approximation of an otherwise intractable model. This approach has several advantages: firstly, the t-cherry tree gives the best approximation within the class of junction trees; secondly, the construction of a t-cherry junction tree can be largely parallelized; and lastly, inference can be performed using distributed computation. Experimental results demonstrate the effectiveness, adequacy, and robustness of the proposed work over a comprehensive data set comprising multiple soccer videos captured at different places.

Keywords: summarization, detection, Bayesian network, t-cherry tree

Procedia PDF Downloads 320
1726 Protective Effect of Rosemary Extract against Toxicity Induced by Egyptian Naja haje Venom

Authors: Walaa H. Salama, Azza M. Abdel-Aty, Afaf S. Fahmy

Abstract:

Background: The Egyptian cobra, Naja haje (Elapidae), is one of the most common snakes, widely distributed in Egypt, and its envenomation causes multi-organ failure leading to rapid death. Various medicinal plants have shown a protective effect against venom toxicity and may complement conventional antivenom therapy. Aim: The present study was designed to assess the antioxidant capacity of a methanolic extract of rosemary leaves and to evaluate the neutralizing ability of the extract against hepatotoxicity induced by Naja haje venom. Methods: The total phenolic and flavonoid contents and the antioxidant capacity of the methanolic rosemary extract were estimated by the DPPH and ABTS scavenging methods. In addition, the rosemary extract was assessed for anti-venom properties in standard in vitro and in vivo assays. Results: The rosemary extract had high total phenolic and flavonoid contents of 12 ± 2 g of gallic acid equivalent per 100 grams of dry weight (g GAE/100 g dw) and 5.5 ± 0.8 g of catechin equivalent per 100 grams of dry weight (g CE/100 g dw), respectively. In addition, the rosemary extract showed high antioxidant capacity. Furthermore, the rosemary extract inhibited in vitro the enzymatic activities of phospholipase A₂, L-amino acid oxidase, and hyaluronidase of the venom in a dose-dependent manner. Moreover, the indirect hemolytic activity and the hepatotoxicity induced by the venom were completely neutralized, as shown by histological studies. Conclusion: The phenolic compounds of rosemary extract, with potential antioxidant activity, may be considered a promising candidate for future therapeutics in snakebite therapy.

Keywords: antioxidant activity, neutralization, phospholipase A₂ enzyme, snake venom

Procedia PDF Downloads 178
1725 Homogeneity and Trend Analyses of Temperature Indices: The Case Study of Umbria Region (Italy) in the Mediterranean Area

Authors: R. Morbidelli, C. Saltalippi, A. Flammini, A. Garcia-Marin, J. L. Ayuso-Munoz

Abstract:

Climate change, mainly due to greenhouse gas emissions associated with human activities, has been modifying hydrologic processes, with a direct effect on air surface temperature, which has significantly increased in the last century at the global scale. In this context, the Mediterranean area is considered to be particularly sensitive to climate change impacts on temperature indices. An analysis aimed at studying the evolution of temperature indices and at checking the existence of significant trends in the Umbria region (Italy) is presented. Temperature data were obtained from seven meteorological stations uniformly distributed in the study area and characterized by very long series of temperature observations (at least 60 years) spanning the 1924-2015 period. A set of 39 temperature indices, represented by monthly and annual mean, average maximum, and average minimum temperatures, has been derived. The trend analysis was carried out by applying the non-parametric Mann-Kendall test, while the non-parametric Pettitt test and the parametric Standard Normal Homogeneity Test (SNHT) were used to check for the presence of breakpoints or in-homogeneities due to environmental changes, anthropic activity, or climate change effects. The Umbria region, in agreement with other recent studies exploring temperature behavior in Italy, shows a general increase in all temperature indices, with the only exception of the Gubbio site, which exhibits very slight negative trends or an absence of trend. The presence of breakpoints and in-homogeneity was widely explored through the selected tests, and the results were checked on the basis of the well-known metadata of the meteorological stations.
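The Mann-Kendall test applied above is straightforward to implement. A minimal version, ignoring the tie correction in the variance, is sketched below:

```python
import math
from itertools import combinations

def mann_kendall(series):
    """Mann-Kendall trend test (no-ties variance), returning (S, z).

    S counts concordant minus discordant pairs over time; a positive z
    suggests an increasing trend, and |z| > 1.96 is significant at the
    5% level under the normal approximation.
    """
    n = len(series)
    s = sum((x2 > x1) - (x2 < x1) for x1, x2 in combinations(series, 2))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)   # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

A strictly increasing annual temperature series yields a large positive S and a z well above 1.96, the pattern the study reports for most Umbrian stations.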

Keywords: air temperature, trend analysis, homogeneity tests, Mann-Kendall test, Mediterranean area

Procedia PDF Downloads 158
1724 Representativity Based Wasserstein Active Regression

Authors: Benjamin Bobbia, Matthias Picard

Abstract:

In recent years, active learning methodologies based on the representativity of the data have seemed more promising for limiting overfitting. We present a query methodology for regression using the Wasserstein distance, which measures the representativity of our labelled dataset compared to the global distribution. In this work, crucial use is made of GroupSort neural networks, which brings a double advantage: the Wasserstein distance can be exactly expressed in terms of such networks, and one can provide explicit bounds for their size and depth together with rates of convergence. Moreover, heterogeneity of the dataset is also considered by weighting the Wasserstein distance with the approximation error at the previous step of active learning. Such an approach leads to a reduction of overfitting and high prediction performance after a few query steps. After detailing the methodology and algorithm, an empirical study is presented in order to investigate the range of our hyperparameters. The performance of this method is compared, in terms of the number of queries needed, with other classical and recent query methods on several UCI datasets.
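To illustrate the idea of querying by representativity, the sketch below uses SciPy's one-dimensional Wasserstein distance to pick, from an unlabelled pool, the point whose labelling makes the labelled set most representative of the pool. This is a simplified one-dimensional stand-in for the paper's GroupSort-based estimator and ignores the error-weighting step.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def next_query(pool, labelled):
    """Return (index, distance) of the pool point whose addition to the
    labelled set minimizes the 1-D Wasserstein distance to the pool."""
    best_i, best_d = None, np.inf
    for i, x in enumerate(pool):
        d = wasserstein_distance(np.append(labelled, x), pool)
        if d < best_d:
            best_i, best_d = i, d
    return best_i, best_d
```

With a pool split between two clusters and a labelled set covering only one of them, the query correctly targets the uncovered cluster, which is the representativity behavior the abstract describes.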

Keywords: active learning, Lipschitz regularization, neural networks, optimal transport, regression

Procedia PDF Downloads 79
1723 Double Encrypted Data Communication Using Cryptography and Steganography

Authors: Adine Barett, Jermel Watson, Anteneh Girma, Kacem Thabet

Abstract:

In information security, secure communication of data across networks has always been a problem at the forefront. Transfer of information across networks is susceptible to being exploited by attackers engaging in malicious activity. In this paper, we leverage steganography and cryptography to create a layered security solution to protect the information being transmitted. The first layer of security leverages cryptographic techniques to scramble the information so that it cannot be deciphered even if the steganography-based layer is compromised. The second layer of security relies on steganography to disguise the encrypted information so that it cannot be seen. We consider three cipher methods in the cryptography layer, namely, the Playfair cipher, the Blowfish cipher, and the Hill cipher. Then, the encrypted message is passed to the least significant bit (LSB) steganography algorithm for concealment. Both approaches are combined efficiently to help secure information in transit over a network. This multi-layered protection is a solution that will benefit cloud platforms, social media platforms, and networks that regularly transfer private information, such as banks and insurance companies.
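The two-layer encrypt-then-hide pipeline can be sketched as follows. For brevity, a toy XOR stream cipher stands in for the Playfair/Blowfish/Hill layer, and the "image" is a plain byte buffer; a real implementation would operate on pixel color channels.

```python
def xor_encrypt(message: bytes, key: bytes) -> bytes:
    """Toy stream cipher (stand-in for the paper's cipher layer); also decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(message))

def lsb_embed(cover: bytearray, payload: bytes) -> bytearray:
    """Hide payload bits (MSB first) in the LSB of successive cover bytes."""
    bits = [(byte >> (7 - k)) & 1 for byte in payload for k in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for payload")
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit
    return stego

def lsb_extract(stego: bytearray, n_bytes: int) -> bytes:
    """Recover n_bytes of payload from the stego LSBs."""
    out = bytearray()
    for j in range(n_bytes):
        byte = 0
        for k in range(8):
            byte = (byte << 1) | (stego[j * 8 + k] & 1)
        out.append(byte)
    return bytes(out)
```

Because only the least significant bit of each cover byte changes, the stego buffer is visually indistinguishable from the cover, while the cipher layer keeps the payload unreadable even if the hiding is detected.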

Keywords: cryptography, steganography, layered security, Cipher, encryption

Procedia PDF Downloads 77
1722 Effects of Data Correlation in a Sparse-View Compressive Sensing Based Image Reconstruction

Authors: Sajid Abas, Jon Pyo Hong, Jung-Ryun Le, Seungryong Cho

Abstract:

Computed tomography and laminography are heavily investigated in a compressive sensing based image reconstruction framework to reduce the dose to patients as well as to radiosensitive devices such as multilayer microelectronic circuit boards. Nowadays, researchers are actively working on optimizing compressive sensing based iterative image reconstruction algorithms to obtain better quality images. However, the effects of the sampled data's properties on the reconstructed image's quality, particularly under insufficiently sampled data conditions, have not been explored in computed laminography. In this paper, we investigated the effects of two data properties, i.e., sampling density and data incoherence, on the reconstructed image obtained by conventional computed laminography and by a recently proposed method called the spherical sinusoidal scanning scheme. We have found that in a compressive sensing based image reconstruction framework, the image quality mainly depends upon the data incoherence when the data is uniformly sampled.
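Data incoherence, the property found to dominate image quality here, is often quantified via the mutual coherence of the sensing matrix. A small generic helper (a textbook definition, not the paper's laminography geometry) is:

```python
import numpy as np

def mutual_coherence(A):
    """Mutual coherence of a sensing matrix: the maximum absolute inner
    product between distinct normalized columns. Lower values mean more
    incoherent measurements, which favor compressive sensing recovery."""
    An = A / np.linalg.norm(A, axis=0)
    G = np.abs(An.T @ An)       # Gram matrix of normalized columns
    np.fill_diagonal(G, 0)      # ignore self-correlations
    return G.max()
```

An orthonormal matrix attains the minimum coherence of zero, while a matrix with duplicated columns attains the maximum of one; scanning schemes can be compared by this number.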

Keywords: computed tomography, computed laminography, compressive sensing, low-dose

Procedia PDF Downloads 460
1721 The Direct Drivers of Ethnocentric Consumer, Intention and Actual Purchasing Behavior in Malaysia

Authors: Nik Kamariah Nikmat, Noor Hasmini Abdghani

Abstract:

The Malaysian government has consistently revived its "Buy Malaysian Goods" campaign from time to time. The purpose of the campaign is to remind consumers to be ethnocentric and patriotic when purchasing products and services. This is necessary to ensure high demand for local products and services compared to foreign products. However, the decline of domestic investment in 2012 triggered concern for the Malaysian economy. Hence, this study attempts to determine the drivers of actual purchasing behavior, intention to purchase domestic products, and ethnocentrism. The study employs cross-sectional primary data, self-administered to households selected using stratified random sampling in four Malaysian regions. Nine candidate drivers of actual domestic purchasing behavior (cultural openness, conservatism, collectivism, patriotism, control belief, interest in foreign travel, attitude, ethnocentrism, and intention) were measured utilizing 60 items on a 7-point Likert scale. Of 1000 questionnaires distributed, 486 were returned, representing a 48.6 percent response rate. From the fitted structural model (SEM analysis), it was found that the drivers of actual purchase behavior are collectivism, cultural openness, and patriotism; the drivers of intention to purchase domestic products are attitude, control belief, collectivism, and conservatism; and the drivers of ethnocentrism are cultural openness, control belief, foreign travel, and patriotism. It also shows that Malaysian consumers score high in ethnocentrism and patriotism. The findings are discussed from the perspective of their implications for the Malaysian National Agenda.

Keywords: actual purchase, ethnocentrism, patriotism, culture openness, conservatism

Procedia PDF Downloads 312
1720 Component Based Testing Using Clustering and Support Vector Machine

Authors: Iqbaldeep Kaur, Amarjeet Kaur

Abstract:

Software reusability is an important part of software development. Component based software development, in the context of software testing, has gained a lot of practical importance in the field of software engineering, both from academic researchers and from the software development industry's perspective. Finding test cases for efficient reuse is one of the important problems addressed by researchers. Clustering reduces the search space and enables the reuse of test cases by grouping similar entities according to requirements, ensuring reduced time complexity as it shortens the search time for retrieving the test cases. In this research paper, we propose an approach for the reusability of test cases based on unsupervised learning, employing k-means clustering together with a Support Vector Machine. We have designed an algorithm for clustering requirement and test case documents according to their tf-idf vector space, and the output is a set of highly cohesive pattern groups.
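The tf-idf-plus-clustering step can be sketched with scikit-learn. The toy test-case descriptions below are invented for illustration; the paper's own documents and its SVM stage are not reproduced here.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical test-case descriptions covering two distinct requirements
test_cases = [
    "verify login with valid username and password",
    "verify login fails with wrong password",
    "check report export to pdf format",
    "check report export to csv format",
]

# Map each document into tf-idf vector space, then group similar cases
X = TfidfVectorizer().fit_transform(test_cases)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
```

Retrieval for reuse then searches only within the cluster matching a new requirement, which is what reduces the search space and retrieval time.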

Keywords: software testing, reusability, clustering, k-mean, SVM

Procedia PDF Downloads 427
1719 Constructing White-Box Implementations Based on Threshold Shares and Composite Fields

Authors: Tingting Lin, Manfred von Willich, Dafu Lou, Phil Eisen

Abstract:

A white-box implementation of a cryptographic algorithm is a software implementation intended to resist extraction of the secret key by an adversary. To date, most of the white-box techniques are used to protect block cipher implementations. However, a large proportion of the white-box implementations are proven to be vulnerable to affine equivalence attacks and other algebraic attacks, as well as differential computation analysis (DCA). In this paper, we identify a class of block ciphers for which we propose a method of constructing white-box implementations. Our method is based on threshold implementations and operations in composite fields. The resulting implementations consist of lookup tables and a few exclusive-OR operations. All intermediate values (inputs and outputs of the lookup tables) are masked. The threshold implementation makes the distribution of the masked values uniform and independent of the original inputs, and the operations in composite fields reduce the size of the lookup tables. The white-box implementations can provide resistance against algebraic attacks and DCA-like attacks.

Keywords: white-box, block cipher, composite field, threshold implementation

Procedia PDF Downloads 163
1718 Cement Bond Characteristics of Artificially Fabricated Sandstones

Authors: Ashirgul Kozhagulova, Ainash Shabdirova, Galym Tokazhanov, Minh Nguyen

Abstract:

Synthetic rocks are advantageous over natural rocks in terms of availability and the ability to consistently study the impact of a particular parameter. Artificial rocks can be fabricated using a variety of techniques, such as mixing sand with Portland cement or gypsum, firing a mixture of sand and fine borosilicate glass powder, or in-situ precipitation of a calcite solution. In this study, sodium silicate solution has been used as the cementing agent for the quartz sand. The molded soft cylindrical sandstone samples are placed in a gas-tight pressure vessel, where the hardening of the material takes place as the chemical reaction between carbon dioxide and the silicate solution progresses. The vessel allows uniform dispersion of carbon dioxide and control over the ambient gas pressure. The current paper shows how the bonding material is initially distributed in the intergranular space and on the surface of the sand particles, using electron microscopy and energy dispersive spectroscopy. During the study, the strength of the cement bond as a function of temperature is observed. The impact of cementing agent dosage on the micro and macro characteristics of the sandstone is investigated. The analysis of the cement bond at the micro level helps to trace particle bonding damage after a potential yielding. Shearing behavior and compressional response have been examined, resulting in estimates of the shearing resistance and cohesion force of the sandstone. These are considered to be the main input values for mathematical models predicting sand production from weak clastic oil reservoir formations.

Keywords: artificial sandstone, cement bond, microstructure, SEM, triaxial shearing

Procedia PDF Downloads 164
1717 Model of Transhipment and Routing Applied to the Cargo Sector in Small and Medium Enterprises of Bogotá, Colombia

Authors: Oscar Javier Herrera Ochoa, Ivan Dario Romero Fonseca

Abstract:

This paper presents the design of a model for planning the distribution logistics operation. The significance of this work lies in its applicability to the analysis of small and medium enterprises (SMEs) of dry freight in Bogotá. The implementation consists of two stages: in the first, optimal planning is achieved through a hybrid model developed with mixed integer programming, which considers the transshipment operation based on a combined load allocation model, as in a classic transshipment model; the second is the specific routing of that operation through the Clarke and Wright savings heuristic. As a result, an integral model is obtained to carry out the step-by-step planning of the distribution of dry freight for SMEs in Bogotá. In this manner, optimal assignments are established by utilizing transshipment centers, with the purpose of determining the specific routing based on the shortest distance traveled.
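The second stage's savings heuristic can be sketched directly. Below is a minimal Clarke-Wright implementation for a single depot with a vehicle capacity constraint; the coordinates and demands in the test are hypothetical, and the paper's transshipment stage is not modeled.

```python
import math

def clarke_wright(depot, customers, demand, capacity):
    """Minimal Clarke-Wright savings heuristic: start with one route per
    customer, then merge route ends in order of decreasing savings while
    vehicle capacity allows. Returns routes as lists of customer indices."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    n = len(customers)
    routes = {i: [i] for i in range(n)}       # route id -> customer sequence
    load = {i: demand[i] for i in range(n)}   # route id -> total demand
    route_of = list(range(n))                 # customer -> route id
    # Saving of serving i and j on one route instead of two depot round trips
    savings = sorted(
        ((dist(depot, customers[i]) + dist(depot, customers[j])
          - dist(customers[i], customers[j]), i, j)
         for i in range(n) for j in range(i + 1, n)),
        reverse=True)
    for s, i, j in savings:
        ri, rj = route_of[i], route_of[j]
        if ri == rj or load[ri] + load[rj] > capacity:
            continue
        # merge only when i and j sit at joinable ends of their routes
        if routes[ri][-1] == i and routes[rj][0] == j:
            merged = routes[ri] + routes[rj]
        elif routes[rj][-1] == j and routes[ri][0] == i:
            merged = routes[rj] + routes[ri]
        else:
            continue
        routes[ri], load[ri] = merged, load[ri] + load[rj]
        del routes[rj], load[rj]
        for c in merged:
            route_of[c] = ri
    return list(routes.values())
```

In the paper's setting, the "customers" would be the delivery points assigned to each transshipment center by the mixed-integer stage, routed independently per center.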

Keywords: transshipment model, mixed integer programming, saving algorithm, dry freight transportation

Procedia PDF Downloads 225
1716 Profit-Based Artificial Neural Network (ANN) Trained by Migrating Birds Optimization: A Case Study in Credit Card Fraud Detection

Authors: Ashkan Zakaryazad, Ekrem Duman

Abstract:

A typical classification technique ranks the instances in a data set according to the likelihood of belonging to one (positive) class. A credit card (CC) fraud detection model ranks the transactions in terms of the probability of being fraudulent. In fact, this approach is often criticized, because firms do not care about fraud probability but about the profitability or costliness of detecting a fraudulent transaction. The key contribution of this study is to focus on profit maximization in the model building step. The artificial neural network proposed in this study works based on profit maximization instead of minimizing the error of prediction. Moreover, some studies have shown that the back propagation algorithm, similar to other gradient-based algorithms, usually gets trapped in local optima, and swarm-based algorithms are more successful in this respect. In this study, we train our profit maximization ANN using Migrating Birds Optimization (MBO), which was recently introduced to the literature.

Keywords: neural network, profit-based neural network, sum of squared errors (SSE), MBO, gradient descent

Procedia PDF Downloads 470
1715 Degraded Document Analysis and Extraction of Original Text Document: An Approach without Optical Character Recognition

Authors: L. Hamsaveni, Navya Prakash, Suresha

Abstract:

Document image analysis recognizes text and graphics in documents acquired as images. An approach without Optical Character Recognition (OCR) for degraded document image analysis has been adopted in this paper. The technique involves document imaging methods such as image fusing and Speeded Up Robust Features (SURF) detection to identify and extract the degraded regions from a set of document images, in order to obtain an original document with complete information. If the captured degraded document image is skewed, it has to be straightened (deskewed) before further processing. A special image storage format known as YCbCr is used as a tool to convert the grayscale image to the RGB image format. The presented algorithm is tested on various types of degraded documents, such as printed documents, handwritten documents, old script documents, and handwritten image sketches in documents. The purpose of this research is to obtain an original document for a given set of degraded documents from the same source.

Keywords: grayscale image format, image fusing, RGB image format, SURF detection, YCbCr image format

Procedia PDF Downloads 372
1714 High Secure Data Hiding Using Cropping Image and Least Significant Bit Steganography

Authors: Khalid A. Al-Afandy, El-Sayyed El-Rabaie, Osama Salah, Ahmed El-Mhalaway

Abstract:

This paper presents a highly secure data-hiding technique that combines image cropping with Least Significant Bit (LSB) steganography. Crops at predefined secret coordinates are extracted from the cover image, and the secret text message is divided into as many sections as there are crops. Each section is then embedded into one image crop, in a secret sequence, using the LSB technique on the cover image's color channels. The stego image is obtained by reassembling the image from the stego crops. The technique is compared with other state-of-the-art techniques; evaluation is based on visual inspection for any degradation of the stego image, the difficulty of extracting the embedded data by an unauthorized viewer, the Peak Signal-to-Noise Ratio (PSNR) of the stego image, and the CPU time of the embedding algorithm. Experimental results confirm that the proposed technique is more secure than the traditional techniques.
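
The per-crop embedding step can be illustrated with a plain LSB routine over a flat list of channel values; the toy cover values are placeholders, and the secret-coordinate cropping and reassembly logic is omitted:

```python
def embed_lsb(pixels, message):
    """Embed message bits into the least significant bit of each channel value."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover too small for message")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # clear LSB, set message bit
    return stego

def extract_lsb(pixels, n_bytes):
    """Read n_bytes back out of the least significant bits."""
    out = []
    for i in range(n_bytes):
        byte = 0
        for bit in pixels[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | (bit & 1)
        out.append(byte)
    return bytes(out)

cover = [120, 121, 200, 201, 90, 91, 30, 31] * 2   # 16 channel values
stego = embed_lsb(cover, b"Hi")                    # 2 bytes = 16 bits
assert extract_lsb(stego, 2) == b"Hi"
```

Since each channel value changes by at most 1, the visual degradation is negligible, which is what the PSNR evaluation measures.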

Keywords: steganography, stego, LSB, crop

Procedia PDF Downloads 265
1713 A Preliminary Investigation on Factors that Influence Road Users Speeding Behaviors in Selected Roads of Peninsular Malaysia

Authors: Farah Fazlinda Binti Mohamad, Siti Hikmah Binti Musthar, Ahmad Saifizul Bin Abdullah, Jamilah Mohamad, Mohamed Rehan Karim

Abstract:

Road safety is a pressing issue: it affects everyone's life, since the roads are shared by all, and the most vulnerable victims are the road users who travel them every day. It is alarming that the World Health Organization ranked Malaysian road users worst among Asian countries, with 23 deaths per 100,000 population over a span of 12 years (World Health Organization, 2009), and that speeding contributed to 60% of all accidents in the country. This study therefore aims to elucidate speeding among road users on selected roads of Peninsular Malaysia and to provide insight into the factors affecting their speeding behavior. To answer these aims, 500 questionnaires were distributed to 500 respondents on selected roads of Peninsular Malaysia to obtain their opinions on the matter; the respondents came from different demographic backgrounds so as to give a fair account of the issue. The answers were analyzed using descriptive analysis. The results indicate that psychological factors are prominent in explaining road users' speeding behavior, and that male road users speed more than female road users, which increases their vulnerability to road injuries and deaths. These findings are useful for improving driving behavior. The relevant authorities should also revise existing countermeasures and design new ones for road users. It is nevertheless important to understand the speeding issue and the factors associated with it; the matter should be taken seriously and responsibly by every road user, as road safety is the responsibility of all.

Keywords: road safety, speeding, countermeasures, accidents

Procedia PDF Downloads 483
1712 Ghrelin, Obestatin and Ghrelin/Obestatin Ratio: A Postprandial Study in Healthy Subjects of Normal Weight

Authors: Panagiotis T. Kanellos, Vaios T. Karathanos, Andriana C. Kaliora

Abstract:

Introduction: The roles of ghrelin and obestatin in appetite regulation have been investigated, but data on how ghrelin and obestatin change after food ingestion are scarce. Objective: We aimed to assess the appetite-regulating hormones ghrelin and obestatin, and to calculate the ghrelin/obestatin ratio, in healthy normal-weight subjects after consumption of raisins. The survey is a comparative study of a glucose control against raisins, which contain fructose and glucose in similar concentrations as well as fibers. Methodology: Ten apparently healthy subjects who reported no history of glucose intolerance, diabetes, gastrointestinal disorders, or recent use of any antibiotics were enrolled in the study. The raisins used (Vitis vinifera) originate in Greece and are distributed worldwide as Corinthian raisins. In a randomized crossover design, after an overnight fast all subjects consumed either 50 g of glucose dissolved in 240 mL of water (control) or 74 g of raisins (sugar content 50 g), with a 5-day interval between the individual trials. Venous blood samples were collected at baseline and at 60, 120, and 180 min postprandially, and ghrelin and obestatin were measured by specific enzyme-linked immunosorbent assays. Results: The subjects had a mean age of 26.3 years, a BMI of 21.6 kg/m2, a waist circumference of 77.7 cm, normal serum lipidemic parameters, and normal HbA1c levels. Ghrelin levels were significantly lower after raisin consumption than after glucose at 120 and 180 min post-ingestion (p = 0.011 and p = 0.035, respectively), whereas obestatin did not differ significantly between the two interventions. The ghrelin/obestatin ratio was significantly lower (p = 0.020) at 120 min after raisin ingestion than after the control.
Conclusion: Two isocaloric foods containing equal amounts of sugars, but of different composition, have different effects on the appetite hormones ghrelin and obestatin in normal-weight healthy subjects.

Keywords: appetite, ghrelin, obestatin, raisins

Procedia PDF Downloads 394
1711 Identification of Risks Associated with Process Automation Systems

Authors: J. K. Visser, H. T. Malan

Abstract:

A need exists to identify the sources of risk associated with process automation systems in petrochemical companies and similar energy-related industries. These companies use many different process automation technologies in their value chains. A crucial part of a process automation system is the information technology component in the supervisory control layer. The ever-changing technology within the process automation layers, and the rate at which it advances, pose a risk to safe and predictable automation system performance. The age of the automation equipment also challenges the plant's operations and maintenance managers through obsolescence and the unavailability of spare parts. The main objective of this research was to determine the sources of risk associated with the equipment that forms part of the process automation systems; a secondary objective was to establish whether technology managers and technicians were aware of the risks and shared the same view of their importance. A conceptual model of risk sources for automation systems was formulated from models and frameworks in the literature. This model comprises six categories of risk, which form the basis for identifying specific risks, and was used to develop a questionnaire sent to 172 instrument technicians and technology managers in the company to obtain primary data; 75 complete and usable responses were received. The responses were analyzed statistically to determine the highest risk sources and to establish whether opinions differed between technology managers and technicians.
The most important risks revealed in this study are: 1) the lack of skilled technicians, 2) the integration capability of third-party system software, 3) the reliability of the process automation hardware, 4) the excessive cost of maintenance and migrations on process automation systems, and 5) the requirement for third-party communication interfacing compatibility as well as real-time communication networks.

Keywords: distributed control system, identification of risks, information technology, process automation system

Procedia PDF Downloads 134
1710 Syllogistic Reasoning with 108 Inference Rules While Case Quantities Change

Authors: Mikhail Zarechnev, Bora I. Kumova

Abstract:

A syllogism is a deductive inference scheme used to derive a conclusion from a set of premises. In a categorical syllogism there are only two premises, and every premise and conclusion is given in the form of a quantified relationship between two objects. The different orderings of the objects in the premises give a classification known as figures. We have shown that the ordered combinations of 3 generalized quantifiers with the four figures provide a total of 108 syllogistic moods, which can be considered distinct inference rules. The classical syllogistic system makes it possible to model human thought, and reasoning with syllogistic structures has always attracted the attention of cognitive scientists. Since automated reasoning is considered part of the learning subsystem of AI agents, the syllogistic system can be applied to this approach; another application concerns inference mechanisms in Semantic Web applications. In this paper we propose a mathematical model and an algorithm for syllogistic reasoning. We also discuss a model of iterative syllogistic reasoning for continuous flows of incoming data, based on case-based reasoning, and possible applications of the proposed system.
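
The mood count is easy to reproduce: with 3 quantifiers available for each of the two premises and the conclusion, and 4 figures, there are 3 × 3 × 3 × 4 = 108 ordered combinations. A brief sketch (the quantifier names below are placeholders, not necessarily the paper's exact set):

```python
from itertools import product

quantifiers = ("all", "most", "some")   # 3 generalized quantifiers (illustrative names)
figures = (1, 2, 3, 4)                  # the four orderings of the middle term

# A mood fixes a quantifier for each of the two premises and the
# conclusion, plus one of the four figures: 3 * 3 * 3 * 4 = 108.
moods = [(q1, q2, qc, fig)
         for (q1, q2, qc), fig in product(product(quantifiers, repeat=3), figures)]
len(moods)   # → 108
```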

Keywords: categorical syllogism, case-based reasoning, cognitive architecture, inference on the semantic web, syllogistic reasoning

Procedia PDF Downloads 409
1709 A Study Problem and Needs Compare the Held of the Garment Industries in Nonthaburi and Bangkok Area

Authors: Thepnarintra Praphanphat

Abstract:

The purposes of this study were to investigate the garment industry's conditions, problems, and needs for assistance. The population of the study comprised 504 managers or managing directors of finished-garment establishments holding permission of the Department of Industrial Works (category 28), Ministry of Industry, as of January 1, 2012. The sample size, determined with Taro Yamane's formula at the 95% confidence level (±5% deviation), was 224 managers. Questionnaires were used to collect the data, and percentage, frequency, arithmetic mean, standard deviation, t-test, ANOVA, and LSD were used to analyze them. It was found that most establishments were large, had operated as limited companies for more than 15 years, and mostly produced garments for working women. All investment was made by Thai people. The products were made to order and distributed both domestically and internationally, and the total sales of 2010, 2011, and 2012 were almost the same. With respect to the problems of operating the business, the study indicated that, as a whole, by aspect, and by item, they were at a high level. A comparison of the level of problems as classified by general condition showed that problems occurring in businesses of different sizes were, as a whole, not different. Taking individual aspects into consideration, however, the level of problems relating to production differed, with medium establishments having more production problems than small and large ones. The by-item analysis found differences in five problems, namely those concerning employees, machine maintenance, the number of designers, and price competition; such problems were at a higher level in medium establishments than in small and large ones. Regarding business age, the examination yielded no differences as a whole, by aspect, or by item.
The statistical significance level of this study was set at .05.
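
The reported sample size follows from Taro Yamane's formula, n = N / (1 + N·e²); for N = 504 and e = 0.05 this gives 223.01, which rounds up to the reported 224. A short sketch:

```python
import math

def yamane_sample_size(population, error=0.05):
    """Taro Yamane's sample-size formula: n = N / (1 + N * e**2), rounded up."""
    return math.ceil(population / (1 + population * error ** 2))

yamane_sample_size(504)   # → 224
```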

Keywords: garment industry, garment, fashion, competitive enhancement project

Procedia PDF Downloads 186
1708 Cognitive Science Based Scheduling in Grid Environment

Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya

Abstract:

Grid is an infrastructure that allows large distributed data sets deployed at multiple locations to be used toward a common goal. Scheduling data-intensive applications is challenging because the data sets involved are very large. Only two solutions exist for tackling this issue: either the computation that requires the huge data sets is transferred to the data site, or the required data sets are transferred to the computation site. In the former scenario the computation cannot in fact be transferred, since the servers are storage/data servers with little or no computational capability; hence the second scenario is considered for further exploration. During scheduling, transferring huge data sets from one site to another requires substantial network bandwidth. To mitigate this issue, this work focuses on incorporating cognitive science into scheduling. Cognitive science is the study of the human brain and its related activities, and current research largely focuses on incorporating it into various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and incorporated into data-intensive scheduling in grid environments. A cognitive engine (CE) is designed and deployed at the various grid sites. The intelligent agents in the CE help analyze each request and create the knowledge base. Depending on the link capacity, a decision is made whether to transfer the data sets or to partition them. The agents also predict the next request so as to serve the requesting site with data sets in advance, which reduces data availability time and data transfer time. The replica catalog and metadata catalog created by the agents assist in the decision-making process.
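
The link-capacity decision can be sketched as a simple rule: ship the data set whole when a single link meets the deadline, otherwise partition it across parallel links. The function name, deadline, and figures below are illustrative assumptions, not the paper's cognitive-engine logic:

```python
def schedule_dataset(size_gb, link_gbps, deadline_s, n_partitions=4):
    """Decide whether to ship a data set whole or split it across links.

    A toy stand-in for the cognitive engine's link-capacity rule: if the
    single-link transfer misses the deadline, partition the set across
    n_partitions parallel links. All figures are illustrative.
    """
    transfer_s = size_gb * 8 / link_gbps          # seconds on one link
    if transfer_s <= deadline_s:
        return ("transfer", transfer_s)
    return ("partition", transfer_s / n_partitions)

schedule_dataset(10, 1.0, 120)   # 80 s fits the deadline → ('transfer', 80.0)
schedule_dataset(100, 1.0, 120)  # 800 s misses it → ('partition', 200.0)
```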

Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence

Procedia PDF Downloads 391
1707 Low Density Parity Check Codes

Authors: Kassoul Ilyes

Abstract:

The field of error-correcting codes has been revolutionized by the introduction of iteratively decoded codes. Among these, LDPC codes are now a preferred solution thanks to their remarkable performance and low complexity. The binary version of LDPC codes showed even better performance, although its decoding introduced greater complexity. This thesis studies the performance of binary LDPC codes using simplified weighted decisions. Information is transported between a transmitter and a receiver by digital transmission systems, either by propagating over a radio channel or by using a transmission medium such as a transmission line; the purpose of the transmission system is to carry the information from the transmitter to the receiver as reliably as possible. These codes did not initially generate enough interest within the coding-theory community, and this neglect lasted until the introduction of turbo codes and the iterative principle. It was then proposed to adopt Pearl's Belief Propagation (BP) algorithm for decoding these codes, and Luby subsequently introduced irregular LDPC codes characterized by their parity-check matrix. Finally, we study simplifications of binary LDPC codes and propose a method that makes the exact calculation of the a posteriori probability (APP) simpler, which in turn simplifies the implementation of the system.
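
A parity-check matrix defines a code through its syndrome: a word is a codeword exactly when every parity check sums to zero mod 2. The sketch below uses the small, dense (7,4) Hamming matrix as a stand-in for the large, sparse matrices of real LDPC codes:

```python
def syndrome(H, codeword):
    """Parity checks of a codeword under parity-check matrix H (mod 2)."""
    return [sum(h * c for h, c in zip(row, codeword)) % 2 for row in H]

# Parity-check matrix of the (7,4) Hamming code; real LDPC matrices are
# much larger and sparse, but the syndrome test is the same.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

codeword = [0, 1, 1, 0, 0, 1, 1]
assert syndrome(H, codeword) == [0, 0, 0]    # all checks satisfied

corrupted = codeword[:]
corrupted[4] ^= 1                            # flip bit 5
s = syndrome(H, corrupted)                   # → [1, 0, 1]
error_pos = s[0] * 1 + s[1] * 2 + s[2] * 4   # → 5, the flipped position
```

An iterative BP decoder refines soft bit estimates until the syndrome is all-zero; the simplified weighted decisions studied in the thesis approximate the APP messages exchanged in that loop.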

Keywords: LDPC, parity check matrix, 5G, BER, SNR

Procedia PDF Downloads 152
1706 Optimized Dye-Sensitized Solar Cell Using Natural Dye and Counter Electrode from Robusta Coffee Beans Peel Waste

Authors: Tomi Setiawan, Wahyu Y. Subekti, Siti S. Nur'Adya, Khusnul Ilmiah

Abstract:

A Dye-Sensitized Solar Cell (DSSC) is a type of solar cell, where solar cells function to convert light energy into electrical energy. A DSSC has two important parts: the dye and the counter electrode. The anthocyanin compounds in coffee bean peels have potential as a natural dye, and the peels can also serve, as activated carbon, for the counter electrode in the DSSC system. The purpose of this research is to determine how to isolate anthocyanin, how to manufacture the counter electrode, and the efficiency of a counter electrode produced from coffee pulp waste in a DSSC prototype. In this research we used a 2 x 2 cm FTO glass coated with carbon paste, with thickness variations of 100 μL, 200 μL, and 300 μL, as the counter electrode, and another FTO glass coated with TiO₂ paste as the working electrode; the two FTO glasses were connected in a sandwich-like structure with a triiodide electrolyte solution added in the gap, thus forming a DSSC prototype. The results showed that coffee pulp waste contains anthocyanin at 12.23 mL/80 g and can produce activated carbon. The UV-Vis characterization of the anthocyanin shows a peak in the ultraviolet region at a wavelength of 219.50 nm with an absorbance of 1.469, and a maximum wavelength in the visible region at 720.00 nm with an absorbance of 0.013. The functional groups contained in the anthocyanin are the O-H group at wave number 3385.60 cm⁻¹, the C=O group at 1618.63 cm⁻¹, and the C-O-C group at 1065.40 cm⁻¹. Morphological characterization using SEM shows that the activated carbon surface area becomes larger and more evenly distributed. The voltages obtained were 395 mV for the 100 μL counter-electrode variation, 334 mV for 200 μL, and 254 mV for 300 μL.

Keywords: DSSC, anthocyanin, counter electrode, solar cell, coffee pulp

Procedia PDF Downloads 181
1705 Initial Dip: An Early Indicator of Neural Activity in Functional Near Infrared Spectroscopy Waveform

Authors: Mannan Malik Muhammad Naeem, Jeong Myung Yung

Abstract:

Functional near-infrared spectroscopy (fNIRS) holds a favorable position among non-invasive brain imaging techniques. The concentration changes of oxygenated and de-oxygenated hemoglobin during a particular cognitive activity are the basis of this neuroimaging modality: two wavelengths of near-infrared light can be used with the modified Beer-Lambert law to infer, indirectly, the status of neuronal activity inside the brain. The temporal resolution of fNIRS is very good for real-time brain-computer interface applications, and its portability, low cost, and acceptable temporal resolution place it in a strong position among neuroimaging modalities. In this study, an optimization model for the impulse response function is used to estimate/predict the initial dip from fNIRS data. In addition, the activity-strength parameter related to a motor-based cognitive task is analyzed. We found an initial dip that persists for around 200-300 milliseconds and localizes neural activity better.
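
The two-wavelength use of the modified Beer-Lambert law reduces to a 2×2 linear solve for the HbO/HbR concentration changes. A minimal sketch; the extinction coefficients, source-detector distance, and differential pathlength factor below are made-up placeholders, not tabulated values:

```python
def mbll_concentrations(dA1, dA2, eps, d=3.0, dpf=6.0):
    """Solve the modified Beer-Lambert law at two wavelengths.

    dA1, dA2: measured attenuation changes; eps: 2x2 extinction
    coefficients [[e_HbO(l1), e_HbR(l1)], [e_HbO(l2), e_HbR(l2)]];
    d: source-detector distance; dpf: differential pathlength factor.
    All coefficient values used here are illustrative placeholders.
    """
    L = d * dpf                      # effective optical path length
    a, b = eps[0][0] * L, eps[0][1] * L
    c, e = eps[1][0] * L, eps[1][1] * L
    det = a * e - b * c
    d_hbo = (e * dA1 - b * dA2) / det
    d_hbr = (a * dA2 - c * dA1) / det
    return d_hbo, d_hbr

# Round trip with made-up coefficients: simulate an initial-dip-like change
# (HbO down is modeled here as HbR down, HbO up -- values illustrative only).
eps = [[1.0, 3.0], [2.0, 1.0]]
dA1 = (1.0 * 0.01 + 3.0 * -0.005) * 18.0
dA2 = (2.0 * 0.01 + 1.0 * -0.005) * 18.0
hbo, hbr = mbll_concentrations(dA1, dA2, eps)   # recovers (0.01, -0.005)
```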

Keywords: fNIRS, brain-computer interface, optimization algorithm, adaptive signal processing

Procedia PDF Downloads 216