Search results for: Content-Based Recommendation
10090 A New Quantile Based Fuzzy Time Series Forecasting Model
Authors: Tahseen A. Jilani, Aqil S. Burney, C. Ardil
Abstract:
Time series models have been used to make predictions of academic enrollments, weather, road accident casualties, stock prices, etc. Based on the concepts of quantile regression models, we have developed a simple time-variant quantile-based fuzzy time series forecasting method. The proposed method bases the forecast on a prediction of the future trend of the data. In place of the actual quantiles of the data at each point, we convert this statistical concept into a fuzzy one by using fuzzy quantiles built from an ensemble of fuzzy membership functions. We provide a fuzzy metric that uses the trend forecast to calculate the future value. The proposed model is applied to TAIFEX forecasting. It is shown that the proposed method performs best among the compared models with respect to model complexity and forecasting accuracy.
Keywords: Quantile Regression, Fuzzy time series, fuzzy logical relationship groups, heuristic trend prediction.
10089 Surrogate based Evolutionary Algorithm for Design Optimization
Authors: Maumita Bhattacharya
Abstract:
Optimization is often a critical issue in system design. Evolutionary Algorithms (EAs) are population-based, stochastic search techniques widely used as efficient global optimizers. However, finding the optimal solution to complex, high-dimensional, multimodal problems often requires highly computationally expensive function evaluations and is hence practically prohibitive. The Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model presented in our earlier work [14] reduces computation time through controlled use of meta-models that partially replace actual function evaluations with approximate ones. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. Situations such as model formation involving variable input dimensions and noisy data cannot be covered by this assumption. In this paper we present an enhanced version of DAFHEA that incorporates a multiple-model based learning approach for the SVM approximator. DAFHEA-II (the enhanced version of the DAFHEA framework) also avoids the high computational expense of the additional clustering required by the original DAFHEA framework. The proposed framework has been tested on several benchmark functions, and the empirical results illustrate the advantages of the proposed technique.
Keywords: Evolutionary algorithm, Fitness function, Optimization, Meta-model, Stochastic method.
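To make the surrogate idea concrete, the following minimal Python sketch shows an evolutionary loop in which an SVM regressor (scikit-learn's SVR) periodically stands in for the expensive fitness function. The objective, update schedule, and parameters are illustrative assumptions, not the authors' DAFHEA-II implementation.

```python
# Minimal sketch of surrogate-assisted evolutionary search (not the authors'
# DAFHEA-II): an SVM regressor periodically replaces exact fitness evaluations.
import numpy as np
from sklearn.svm import SVR

def sphere(x):                      # expensive objective stand-in (hypothetical)
    return float(np.sum(x ** 2))

rng = np.random.default_rng(0)
dim, pop_size, generations = 5, 30, 40
pop = rng.uniform(-5, 5, (pop_size, dim))
archive_X, archive_y = [], []       # exactly evaluated samples for the meta-model

for gen in range(generations):
    if gen % 5 == 0 or len(archive_X) < 2 * dim:
        fitness = np.array([sphere(ind) for ind in pop])   # exact evaluation
        archive_X.extend(pop); archive_y.extend(fitness)
        surrogate = SVR(kernel="rbf", C=10.0).fit(np.array(archive_X), np.array(archive_y))
    else:
        fitness = surrogate.predict(pop)                   # cheap approximate evaluation
    # truncation selection + Gaussian mutation (simplest possible EA loop)
    parents = pop[np.argsort(fitness)[: pop_size // 2]]
    children = parents + rng.normal(0, 0.3, parents.shape)
    pop = np.vstack([parents, children])

print(f"best exactly-evaluated fitness: {min(archive_y):.4f}")
```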
10088 Multi-Scale Gabor Feature Based Eye Localization
Authors: Sanghoon Kim, Sun-Tae Chung, Souhwan Jung, Dusik Oh, Jaemin Kim, Seongwon Cho
Abstract:
Eye localization is necessary for face recognition and related application areas. Most eye localization algorithms reported so far still need to be improved in terms of precision and computational time for successful applications. In this paper, we propose an eye localization method based on multi-scale Gabor feature vectors, which is more robust with respect to initial points. Eye localization based on Gabor feature vectors first constructs an Eye Model Bunch for each eye (left or right), consisting of n Gabor jets and the average eye coordinates obtained from n model face images. It then localizes eyes in an incoming face image by exploiting the fact that the true eye coordinates are most likely to be very close to the position where a Gabor jet has the best similarity match with a Gabor jet in the Eye Model Bunch. Similar ideas have already been proposed, for example in EBGM (Elastic Bunch Graph Matching). However, the method used in EBGM is known not to be robust with respect to initial values and may need an extensive search range to achieve the required performance, and extensive search ranges cause a much greater computational burden. In this paper, we propose a multi-scale approach with only a slightly increased computational burden: eyes are first localized based on Gabor feature vectors in a coarse face image obtained by down-sampling the original face image, and then localized based on Gabor feature vectors in the original-resolution face image, using the eye coordinates found in the coarse-scale image as initial points. Several experiments and comparisons with other eye localization methods reported in other papers show the efficiency of our proposed method.
Keywords: Eye Localization, Gabor features, Multi-scale, Gabor wavelets.
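As an illustration of the Gabor-jet idea, the sketch below builds a small jet (magnitudes of complex Gabor responses at a pixel over a few scales and orientations) and compares jets with a normalized similarity. The filter bank, patch size, and random test image are assumptions, not the paper's Eye Model Bunch code.

```python
# Minimal Gabor-jet sketch (assumptions: 4 orientations x 2 scales, magnitude-only
# similarity); illustrative only, not the paper's Eye Model Bunch implementation.
import numpy as np

def gabor_kernel(size, sigma, theta, lam):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
    return envelope * np.exp(1j * 2 * np.pi * xr / lam)    # complex carrier

def gabor_jet(image, row, col, size=15):
    """Stack filter responses at one pixel into a feature vector (a 'jet')."""
    half = size // 2
    patch = image[row - half:row + half + 1, col - half:col + half + 1]
    responses = []
    for lam in (4.0, 8.0):                                  # two scales
        for theta in np.arange(0, np.pi, np.pi / 4):        # four orientations
            responses.append(np.abs(np.sum(patch * gabor_kernel(size, lam / 2, theta, lam))))
    return np.array(responses)

def jet_similarity(j1, j2):
    return float(np.dot(j1, j2) / (np.linalg.norm(j1) * np.linalg.norm(j2) + 1e-12))

rng = np.random.default_rng(1)
face = rng.random((64, 64))                                 # stand-in for a face image
model_jet = gabor_jet(face, 24, 20)                         # "eye model" jet
print(jet_similarity(model_jet, gabor_jet(face, 25, 21)))   # nearby point: high similarity
```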
10087 Rule-Based Fuzzy Logic Controller with Adaptable Reference
Authors: Sheroz Khan, I. Adam, A. H. M. Zahirul Alam, Mohd Rafiqul Islam, Othman O. Khalifa
Abstract:
This paper attempts to model and design a simple fuzzy logic controller with a Variable Reference. The Variable Reference (VR) is featured as an adaptability element obtained from two known variables: the desired system input and the actual system output. A simple fuzzy rule-based technique is simulated to show how the actual system input is gradually tuned to a value that closely matches the desired input. The designed controller is implemented and verified on a simple heater controlled by a PIC microcontroller running code developed in embedded C. The output response of the PIC-controlled heater is analyzed and compared to the performance of conventional fuzzy logic controllers. The novelty of this work lies in the fact that it gives better performance using fewer rules than conventional fuzzy logic controllers.
Keywords: Fuzzy logic controller, Variable reference, Adaptability, Rule-based.
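A minimal sketch of a single-input rule-based fuzzy step is shown below, assuming triangular memberships over the error between desired input and actual output and centroid defuzzification. The rule base, ranges, and heater gain are illustrative, not the embedded C controller described above.

```python
# Minimal single-input fuzzy controller sketch: error = reference - output drives
# a heater power adjustment. Membership shapes and rules are illustrative only.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_step(reference, output):
    error = reference - output
    # fuzzify the error into three sets: negative, zero, positive
    mu = {"neg": tri(error, -60, -30, 0), "zero": tri(error, -10, 0, 10), "pos": tri(error, 0, 30, 60)}
    # rule base: IF error is neg THEN decrease power, zero -> hold, pos -> increase
    u = np.linspace(-1, 1, 201)                       # candidate power adjustments
    agg = np.maximum.reduce([
        np.minimum(mu["neg"], tri(u, -1.0, -0.5, 0.0)),
        np.minimum(mu["zero"], tri(u, -0.2, 0.0, 0.2)),
        np.minimum(mu["pos"], tri(u, 0.0, 0.5, 1.0)),
    ])
    return float(np.sum(u * agg) / (np.sum(agg) + 1e-12))   # centroid defuzzification

temp, setpoint = 20.0, 60.0
for _ in range(30):
    temp += 5.0 * fuzzy_step(setpoint, temp)          # crude heater model (hypothetical gain)
print(f"temperature after 30 steps: {temp:.1f}")
```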
10086 Simulating Human Behavior in (Un)Built Environments: Using an Actor Profiling Method
Authors: Hadas Sopher, Davide Schaumann, Yehuda E. Kalay
Abstract:
This paper addresses the shortcomings of architectural computation tools in representing human behavior in built environments prior to the construction and occupancy of those environments. Evaluating whether a design fits the needs of its future users is currently done solely post-construction, or is based on the knowledge and intuition of the designer. This issue is of high importance when designing complex buildings such as hospitals, where the quality of treatment as well as patient and staff satisfaction are of major concern. Existing computational pre-occupancy human behavior evaluation methods are geared mainly to testing ergonomic issues, such as wheelchair accessibility, emergency egress, etc. As such, they rely on Agent Based Modeling (ABM) techniques, which emphasize the individual user. Yet we know that most human activities are social and involve a number of actors working together, which ABM methods cannot handle. Therefore, we present an event-based model that manages the interaction between multiple Actors, Spaces, and Activities to describe dynamically how people use spaces. This approach requires expanding the computational representation of Actors beyond their physical description to include psychological, social, cultural, and other parameters. The model presented in this paper includes cognitive abilities and rules that describe the response of actors to their physical and social surroundings, based on the actors' internal status. The model has been applied in a simulation of hospital wards and showed adaptability to a wide variety of situated behaviors and interactions.
Keywords: Agent based modeling, architectural design evaluation, event modeling, human behavior simulation, spatial cognition.
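The event-based representation can be sketched with simple data structures: an event coordinates several Actors in a Space for an Activity, gated by a rule over the actors' internal status. The classes, rule, and values below are hypothetical, meant only to illustrate the modeling idea rather than the authors' system.

```python
# Schematic sketch of an event-based Actor/Space/Activity model (names and rules
# are hypothetical, illustrating the representation rather than the authors' system).
from dataclasses import dataclass, field

@dataclass
class Space:
    name: str
    occupants: list = field(default_factory=list)

@dataclass
class Actor:
    name: str
    role: str
    status: dict                      # internal state, e.g. {"fatigue": 0.2}

@dataclass
class Event:
    activity: str
    actors: list                      # multiple actors may share one event
    space: Space
    def can_start(self):
        # simple cognitive rule: actors only join if not too fatigued
        return all(a.status.get("fatigue", 0.0) < 0.8 for a in self.actors)
    def run(self):
        if self.can_start():
            for a in self.actors:
                self.space.occupants.append(a.name)
                a.status["fatigue"] = a.status.get("fatigue", 0.0) + 0.1
            print(f"{self.activity} in {self.space.name}: {self.space.occupants}")

ward = Space("Ward A")
nurse = Actor("nurse_1", "nurse", {"fatigue": 0.3})
patient = Actor("patient_7", "patient", {"fatigue": 0.1})
Event("medication round", [nurse, patient], ward).run()
```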
10085 Design of Compliant Mechanism Based Microgripper with Three Finger Using Topology Optimization
Authors: R. Bharanidaran, B. T. Ramesh
Abstract:
High precision in motion is required to manipulate micro objects in precision industries for micro assembly, cell manipulation, etc. Precision manipulation is achieved through the appropriate mechanism design of micro devices such as microgrippers. A compliant-mechanism-based design is the better option for achieving highly precise and controlled motion. This research article highlights a method of designing a compliant, three-fingered microgripper suitable for holding asymmetric objects. Topology optimization, a systematic technique, is implemented in this work to arrive at a topologically optimized design of the mechanism needed to perform the required micro motion of the gripper. The optimization technique has the drawback of generating meaningless regions, such as node-to-node connectivity and a staircase effect at the boundaries. Hence, post-processing of the design is required to make it manufacturable. To reduce the effort of the post-processing stage and to preserve the edges of the image, a cubic spline interpolation technique is introduced in the MATLAB program. The structural performance of the topologically developed mechanism design is tested using finite element method (FEM) software. Further, the microgripper structure is examined for its fatigue life and vibration characteristics.
Keywords: Compliant mechanism, Cubic spline interpolation, FEM, Topology optimization.
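The cubic spline smoothing step can be illustrated briefly: a stair-stepped boundary extracted from a discretized design is replaced by a smooth interpolating curve. The sketch below uses SciPy's CubicSpline on made-up boundary points; the paper performs this step in MATLAB.

```python
# Sketch of smoothing a stair-stepped boundary with cubic spline interpolation
# (SciPy shown for illustration; the paper implements this step in MATLAB).
import numpy as np
from scipy.interpolate import CubicSpline

# hypothetical boundary points extracted from a voxelized (staircase) design edge
x_nodes = np.array([0, 1, 2, 3, 4, 5, 6], dtype=float)
y_nodes = np.array([0.0, 0.0, 1.0, 1.0, 2.0, 2.0, 3.0])   # staircase profile

spline = CubicSpline(x_nodes, y_nodes)        # C2-continuous curve through the nodes
x_fine = np.linspace(0, 6, 61)
y_smooth = spline(x_fine)                     # smooth edge for manufacturable geometry

print(np.round(y_smooth[:5], 3))
```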
10084 Preliminary Study of Fermented Pickle of Tabah Bamboo Shoot (Gigantochloa nigrociliata (Buese) Kurz)
Authors: Luh Putu T. Darmayanti, A. A. Duwipayana, I Nengah K. Putra, Nyoman S. Antara
Abstract:
Processing tabah bamboo shoot into fermented pickle is one way to increase the shelf life of this bamboo shoot. An advantage of this shoot is its low concentration of hydrocyanic acid (HCN), which makes it a potential functional food product. This study aimed to determine the characteristics of tabah bamboo shoot pickle, such as total lactic acid bacteria (LAB), pH, total acidity, and hydrocyanic acid (HCN) content, and also to identify the LAB types involved during fermentation and the organic acid profiles. The pickle was made by natural fermentation with a 6% salt concentration, and fermentation was conducted for 13 days. The results showed that during fermentation, the LAB count was highest on the 4th day at 72 × 10⁷ CFU/ml, and the lowest pH was 3.09. We also found a decrease in HCN from 37.8 ppm at the beginning to 20.52 ppm at the end of the fermentation process. The organic acids detected during fermentation were lactic acid, with a highest concentration of 0.0546 g/100 g, and a small amount of acetic acid. Using the PCR method, the 18 rod-shaped LAB isolates were identified as members of Lactobacillus spp., of which 17 strains were identified as L. plantarum.
Keywords: Fermentation, LAB, pickle, tabah bamboo shoot.
10083 Cementing Efficiency of Low Calcium Fly Ash in Fly Ash Concretes
Authors: T. D. Gunneswara Rao, Mudimby Andal
Abstract:
Research on the utilization of fly ash no longer refers to fly ash as a waste material of thermal power plants. The use of fly ash in concrete making makes the concrete economical as well as durable. Fly ash is added to concrete in three ways, namely as a partial replacement for cement, as a partial replacement for fine aggregates, and as an admixture. The addition of fly ash to concrete in any one of the forms mentioned above makes the concrete more workable and durable than conventional concrete. Studies on fly ash as a partial replacement for cement have gained momentum, as such replacement makes the concrete economical. In the present study, an attempt has been made to understand the effects of fly ash on the workability characteristics and strength aspects of fly ash concretes. In India, the majority of thermal power plants produce low calcium fly ash; hence, low calcium fly ash has been used in the present investigation. Fly ash was considered as a partial replacement for cement, with the percentage replacement varied from 0% to 40% at regular intervals of 10%. Moreover, the fine aggregate to coarse aggregate ratio was varied as 1:1, 1:2, and 1:3. The workability tests revealed that up to 30% replacement of cement by fly ash reduces the water demand of the concrete mixes, whereas beyond 30% replacement more water is required to maintain constant workability.
Keywords: Cementing Efficiency, Compressive Strength, Low Calcium Fly Ash, Workability.
10082 A Type-2 Fuzzy Adaptive Controller of a Class of Nonlinear System
Authors: A. El Ougli, I. Lagrat, I. Boumhidi
Abstract:
In this paper we propose a robust adaptive fuzzy controller for a class of nonlinear systems with unknown dynamics. The method is based on a type-2 fuzzy logic system to approximate the unknown nonlinear function. The design of the on-line adaptive scheme of the proposed controller is based on the Lyapunov technique. Simulation results are given to illustrate the effectiveness of the proposed approach.
Keywords: Fuzzy set type-2, Adaptive fuzzy control, Nonlinear system.
10081 Opportunistic Routing with Secure Coded Wireless Multicast Using MAS Approach
Authors: E. Golden Julie, S. Tamil Selvi, Y. Harold Robinson
Abstract:
Many Wireless Sensor Network (WSN) applications necessitate secure multicast services for broadcasting delay-sensitive data such as video files and live telecasts at fixed time slots. This work provides a novel method to deal with the end-to-end delay and the packet drop rate. Opportunistic Routing chooses a link based on the maximum packet delivery probability. Null Key Generation helps in authenticating packets to the receiver. A Markov Decision Process based Adaptive Scheduling algorithm determines the time slot for packet transmission. Both theoretical analysis and simulation results show that the proposed protocol ensures better performance in terms of packet delivery ratio, average end-to-end delay, and normalized routing overhead.
Keywords: Delay-sensitive data, Markovian Decision Process based Adaptive Scheduling, Opportunistic Routing, Digital Signature authentication.
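The opportunistic forwarding step can be sketched in a few lines: among candidate neighbours, pick the link with the highest estimated packet delivery ratio. The link statistics below are hypothetical.

```python
# Sketch of the opportunistic-routing forwarding choice: pick the neighbour link
# with the highest estimated packet delivery ratio (link data are hypothetical).
candidate_links = {
    "node_B": {"delivered": 92, "sent": 100},
    "node_C": {"delivered": 78, "sent": 90},
    "node_D": {"delivered": 55, "sent": 80},
}

def delivery_ratio(stats):
    return stats["delivered"] / stats["sent"]

next_hop = max(candidate_links, key=lambda n: delivery_ratio(candidate_links[n]))
print(next_hop, round(delivery_ratio(candidate_links[next_hop]), 3))   # node_B 0.92
```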
10080 A Neural Computing-Based Approach for the Early Detection of Hepatocellular Carcinoma
Authors: Marina Gorunescu, Florin Gorunescu, Kenneth Revett
Abstract:
Hepatocellular carcinoma (HCC), also called hepatoma, most commonly appears in patients with chronic viral hepatitis. In patients with a higher suspicion of HCC, such as a small or subtle rise in serum enzyme levels, the best method of diagnosis involves a CT scan of the abdomen, but only at high cost. The aim of this study was to increase the ability of the physician to detect HCC early, using a probabilistic neural network-based approach, in order to save time and hospital resources.
Keywords: Early HCC diagnosis, probabilistic neural network.
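A probabilistic neural network reduces, in its simplest form, to a Gaussian Parzen-window density estimate per class with a maximum-density decision. The sketch below illustrates that classifier family on synthetic two-feature data; it is not the clinical model or dataset used in the study.

```python
# Minimal probabilistic neural network (PNN) sketch: Gaussian Parzen-window
# density estimate per class, decision by the larger summed kernel response.
# Data are synthetic; this only illustrates the classifier family used.
import numpy as np

def pnn_predict(X_train, y_train, x, sigma=0.5):
    scores = {}
    for label in np.unique(y_train):
        diffs = X_train[y_train == label] - x
        kernels = np.exp(-np.sum(diffs ** 2, axis=1) / (2 * sigma ** 2))
        scores[label] = kernels.mean()          # class-conditional density estimate
    return max(scores, key=scores.get)

rng = np.random.default_rng(7)
healthy = rng.normal([1.0, 1.0], 0.3, (50, 2))   # hypothetical serum-marker features
hcc = rng.normal([2.0, 2.2], 0.3, (50, 2))
X = np.vstack([healthy, hcc])
y = np.array([0] * 50 + [1] * 50)

print(pnn_predict(X, y, np.array([1.9, 2.1])))   # expected: 1 (suspected HCC)
```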
10079 Fast Fourier Transform-Based Steganalysis of Covert Communications over Streaming Media
Authors: Jinghui Peng, Shanyu Tang, Jia Li
Abstract:
Steganalysis seeks to detect the presence of secret data embedded in cover objects, and there is an imminent demand to detect hidden messages in streaming media. This paper shows how a steganalysis algorithm based on the Fast Fourier Transform (FFT) can be used to detect the existence of secret data embedded in streaming media. The proposed algorithm uses machine parameter characteristics and a network sniffer to determine whether the Internet traffic contains streaming channels. The detected streaming data are then transferred from the time domain to the frequency domain through the FFT. The distributions of power spectra in the frequency domain of original VoIP streams and stego VoIP streams are compared in turn using the t-test, achieving a p-value of 7.5686E-176, which is below the threshold. The results indicate that the proposed FFT-based steganalysis algorithm is effective in detecting secret data embedded in VoIP streaming media.
Keywords: Steganalysis, security, fast Fourier transform, streaming media.
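The spectral comparison step can be sketched as follows: compute FFT power spectra of a cover stream and a suspect stream and compare their distributions with an independent two-sample t-test (SciPy). The signals below are synthetic stand-ins for VoIP streams.

```python
# Sketch of the spectral comparison step: FFT power spectra of a "cover" stream
# and a suspect stream compared with an independent two-sample t-test.
# The signals below are synthetic stand-ins, not real VoIP traffic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
cover = rng.normal(0, 1.0, 4096)                 # stand-in for original VoIP samples
stego = cover + rng.normal(0, 0.3, 4096)         # stand-in for a stream carrying a payload

def power_spectrum(signal):
    return np.abs(np.fft.rfft(signal)) ** 2

result = stats.ttest_ind(power_spectrum(cover), power_spectrum(stego))
print(f"t = {result.statistic:.3f}, p = {result.pvalue:.3e}")
# a p-value below the chosen threshold flags a shift between the two spectral distributions
```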
10078 Space Vector Pulse Width Modulation Technique Based Design and Simulation of a Three-Phase Voltage Source Converter Systems
Authors: Farhan Beg
Abstract:
A Space Vector based Pulse Width Modulation (SVPWM) control technique for the three-phase PWM converter is proposed in this paper. The proposed control scheme is based on a synchronous reference frame model. High performance and efficiency are obtained with regard to the DC bus voltage and the power factor of the PWM rectifier, leading to low losses. MATLAB/Simulink is used as the simulation platform, and a Simulink model is presented in the paper. The results show that the proposed model demonstrates better performance and properties than the traditional SPWM method and drastically improves the dynamic performance of the closed loop. For the Space Vector based Pulse Width Modulation, the sine signal is the reference waveform and the triangle waveform is the carrier waveform. When the value of the sine signal is larger than that of the triangle signal, the pulse output goes high, and when the triangle signal is higher than the sine signal, the pulse output goes low. The SPWM output is changed by varying the modulation index and the frequency used in the system to produce more pulse width. The greater the pulse width produced, the lower the harmonic content of the output voltage and the higher the resolution.
Keywords: Power Factor, SVPWM, PWM rectifier, SPWM.
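The carrier comparison described above can be sketched directly: a gating pulse is high whenever the sinusoidal reference exceeds the triangular carrier. The modulation index and frequencies below are illustrative values only.

```python
# Sketch of carrier-based PWM generation as described: a pulse is high when the
# sinusoidal reference exceeds the triangular carrier (illustrative parameters).
import numpy as np

f_ref, f_carrier, m_index = 50.0, 2000.0, 0.8        # Hz, Hz, modulation index
t = np.linspace(0, 0.02, 20000, endpoint=False)      # one fundamental period

reference = m_index * np.sin(2 * np.pi * f_ref * t)
# symmetric triangular carrier between -1 and +1
carrier = 2.0 / np.pi * np.arcsin(np.sin(2 * np.pi * f_carrier * t))

pulses = (reference > carrier).astype(int)           # gate signal for one converter leg
print(f"duty over the period: {pulses.mean():.3f}")  # ~0.5 for a symmetric reference
```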
10077 Effects of Microwave Heating on Biogas Production, Chemical Oxygen Demand and Volatile Solids Solubilization of Food Residues
Authors: Ackmez Mudhoo, Pravish Rye Moorateeah, Romeela Mohee
Abstract:
This paper presents the results of a preliminary investigation of microwave (MW) irradiation pretreatments on the anaerobic digestion of food residues using biochemical methane potential (BMP) assays. Low-solids systems with a total solids (TS) content ranging from 5.0-10.0% were analyzed. The inoculum to bulk mass of substrates to water ratio was 1:2:2 (mass basis). The experimental conditions for the pretreatments were as follows: a control (no MW irradiation), two runs with MW irradiation for 15 and 30 minutes at 320 W, and another two runs with MW irradiation at 528 W for 30 and 60 minutes. The cumulative biogas production was 6.3 L and 8.7 L for the 15 min/320 W and 30 min/320 W MW irradiation conditions, respectively, and 10.5 L and 11.4 L for 30 min/528 W and 60 min/528 W, respectively, compared to the control giving 5.8 L of biogas. Both an increase in irradiation exposure time and an increase in MW power increased the rate and yield of biogas. Single-factor ANOVA tests (p<0.05) indicated that the variations in VS, TS, COD and cumulative biogas generation were significantly different across the pretreatment conditions. Results from this study indicate that MW irradiation enhanced the biogas production and degradation of total solids, with a significant improvement in VS and COD solubilization.
Keywords: microwave irradiation, pretreatment, anaerobic digestion, food residues.
10076 An Auxiliary Technique for Coronary Heart Disease Prediction by Analyzing ECG Based on ResNet and Bi-LSTM
Authors: Yang Zhang, Jian He
Abstract:
Heart disease is one of the leading causes of death in the world, and coronary heart disease (CHD) is one of the major heart diseases. The electrocardiogram (ECG) is widely used in the detection of heart diseases, but the traditional manual method for CHD prediction by analyzing ECG requires considerable professional knowledge from doctors. This paper presents a sliding window and continuous wavelet transform (CWT) to transform ECG signals into images, and then ResNet and Bi-LSTM are introduced to build the ECG feature extraction network (namely ECGNet). Finally, an auxiliary system for CHD prediction was developed based on modified ResNet18 and Bi-LSTM, and the public ECG dataset of CHD from MIMIC-III was used to train and test the system. The experimental results show that the accuracy of the method is 83%, and the F1-score is 83%. Compared with the available methods for CHD prediction based on ECG, such as kNN, decision tree, VGGNet, etc., this method not only improves the prediction accuracy but also avoids the degradation phenomenon of deep learning networks.
Keywords: Bi-LSTM, CHD, coronary heart disease, ECG, electrocardiogram, ResNet, sliding window.
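The pre-processing stage can be sketched as follows: slide a fixed-length window over the ECG trace and convert each segment into a time-frequency scalogram with a continuous wavelet transform (PyWavelets). The signal below is synthetic, and the ResNet18 + Bi-LSTM classifier itself is not reproduced here.

```python
# Sketch of the pre-processing stage only: sliding windows over an ECG trace are
# turned into time-frequency images via CWT (PyWavelets). Synthetic signal; the
# ResNet18 + Bi-LSTM classifier itself is not reproduced here.
import numpy as np
import pywt

fs = 250                                             # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

window, step = 2 * fs, fs                            # 2 s windows, 1 s stride
scales = np.arange(1, 64)
images = []
for start in range(0, ecg.size - window + 1, step):
    segment = ecg[start:start + window]
    coeffs, _ = pywt.cwt(segment, scales, "morl")    # shape: (len(scales), window)
    images.append(np.abs(coeffs))                    # scalogram "image" for the CNN

print(len(images), images[0].shape)                  # e.g. 9 windows of 63 x 500
```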
10075 Photo Mosaic Smartphone Application in Client-Server Based Large-Scale Image Databases
Authors: Sang-Hun Lee, Bum-Soo Kim, Yang-Sae Moon, Jinho Kim
Abstract:
In this paper we present a photo mosaic smartphone application for client-server based large-scale image databases. Photo mosaic is not a new concept, but there are very few smartphone applications, especially ones handling a huge number of images in a client-server environment. To support large-scale image databases, we first propose an overall framework based on a client-server model. We then present the concept of image-PAA features to efficiently handle a huge number of images and discuss its lower bounding property. We also present a best-match algorithm that exploits the lower bounding property of image-PAA. Finally, we implement an efficient Android-based application and demonstrate its feasibility.
Keywords: Smartphone applications, photo mosaic, similarity search, data mining, large-scale image databases.
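The PAA idea and its lower-bounding property can be sketched on flattened tiles: compare cheap segment-mean features first, and compute the exact distance only when the bound cannot rule a candidate out. The data below are random, and the 1-D simplification is an assumption rather than the paper's image-PAA definition.

```python
# Sketch of PAA features with a lower-bounding pruned best-match search
# (1-D flattened tiles used as an illustrative simplification of "image-PAA").
import numpy as np

def paa(vector, segments):
    """Piecewise Aggregate Approximation: mean of each equal-length segment."""
    return vector.reshape(segments, -1).mean(axis=1)

def paa_lower_bound(p1, p2, seg_len):
    # sqrt(seg_len) * ||PAA(a) - PAA(b)|| <= ||a - b||  (classic PAA bound)
    return np.sqrt(seg_len) * np.linalg.norm(p1 - p2)

rng = np.random.default_rng(5)
database = rng.random((10000, 64))                 # flattened candidate tiles
query = rng.random(64)
segments, seg_len = 8, 64 // 8

q_paa = paa(query, segments)
db_paa = np.array([paa(row, segments) for row in database])

best_dist, best_idx, pruned = np.inf, -1, 0
for i, row in enumerate(database):
    if paa_lower_bound(q_paa, db_paa[i], seg_len) >= best_dist:
        pruned += 1                                # cheap bound already rules this tile out
        continue
    d = np.linalg.norm(query - row)                # exact distance only when needed
    if d < best_dist:
        best_dist, best_idx = d, i

print(f"best tile {best_idx}, distance {best_dist:.3f}, pruned {pruned} candidates")
```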
10074 A Distributed Weighted Cluster Based Routing Protocol for MANETs
Authors: Naveen Chauhan, L. K. Awasthi, Narottam Chand, Vivek Katiyar, Ankit Chug
Abstract:
Mobile ad-hoc networks (MANETs) are a form of wireless network that does not require a base station to provide network connectivity. Mobile ad-hoc networks have many characteristics that distinguish them from other wireless networks and make routing in such networks a challenging task. Cluster based routing is one of the routing schemes for MANETs, in which various clusters of mobile nodes are formed, with each cluster having its own clusterhead responsible for routing among clusters. In this paper we propose and implement a distributed weighted clustering algorithm for MANETs. The approach is based on a combined weight metric that takes into account several system parameters such as the node degree, transmission range, energy, and mobility of the nodes. We have evaluated the performance of the proposed scheme through simulation in various network situations. Simulation results show that the proposed scheme outperforms the original distributed weighted clustering algorithm (DWCA).
Keywords: MANETs, Clustering, Routing, Wireless Communication, Distributed Clustering.
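The clusterhead election can be sketched as a combined-weight computation over node metrics such as degree difference, transmission range, mobility, and energy; the weighting factors and node data below are hypothetical.

```python
# Sketch of combined-weight clusterhead election (weights w1..w4 and node data are
# hypothetical; a lower combined weight makes a node a better clusterhead).
nodes = {
    "n1": {"degree_diff": 2, "tx_range_sum": 40.0, "mobility": 1.5, "battery_drain": 10.0},
    "n2": {"degree_diff": 0, "tx_range_sum": 35.0, "mobility": 0.4, "battery_drain": 6.0},
    "n3": {"degree_diff": 3, "tx_range_sum": 55.0, "mobility": 2.8, "battery_drain": 14.0},
}
w1, w2, w3, w4 = 0.7, 0.2, 0.05, 0.05     # illustrative weighting factors (sum to 1)

def combined_weight(m):
    return (w1 * m["degree_diff"] + w2 * m["tx_range_sum"]
            + w3 * m["mobility"] + w4 * m["battery_drain"])

clusterhead = min(nodes, key=lambda n: combined_weight(nodes[n]))
print(clusterhead, round(combined_weight(nodes[clusterhead]), 2))   # n2 is elected
```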
10073 Metaheuristic Algorithms for Decoding Binary Linear Codes
Authors: Hassan Berbia, Faissal Elbouanani, Rahal Romadi, Mostafa Belkasmi
Abstract:
This paper introduces two decoders for binary linear codes based on metaheuristics. The first one uses a genetic algorithm, and the second is based on a combination of a genetic algorithm with a feed-forward neural network. The decoder based on the genetic algorithm (DAG), applied to BCH and convolutional codes, gives good performance compared to the Chase-2 and Viterbi algorithms respectively, and reaches the performance of OSD-3 for some Residue Quadratic (RQ) codes. This algorithm is less complex for linear block codes of large block length; furthermore, its performance can be improved by tuning the decoder's parameters, in particular the number of individuals per population and the number of generations. In the second algorithm, the search space, in contrast to DAG which was limited to the codeword space, covers the whole binary vector space. It avoids a great number of coding operations by using a neural network. This greatly reduces the complexity of the decoder while maintaining comparable performance.
Keywords: Block code, decoding, metaheuristic, genetic algorithm, neural network.
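A toy sketch of genetic-algorithm decoding is given below: individuals are information words, and fitness is the Hamming distance between the encoded candidate and the received word. The (7,4) Hamming code and GA settings are illustrative; the paper targets BCH and convolutional codes.

```python
# Toy GA decoder sketch for a (7,4) Hamming code: individuals are 4-bit
# information words, fitness = Hamming distance of their codeword to the received
# word. Illustrative only; the paper targets BCH and convolutional codes.
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],      # generator matrix of a (7,4) Hamming code
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def encode(info):
    return info @ G % 2

rng = np.random.default_rng(2)
sent_info = np.array([1, 0, 1, 1])
received = encode(sent_info)
received[2] ^= 1                           # one channel bit error

pop = rng.integers(0, 2, (20, 4))          # population of candidate information words
for _ in range(30):
    fitness = np.array([np.sum(encode(ind) != received) for ind in pop])
    parents = pop[np.argsort(fitness)[:10]]
    # uniform crossover + bit-flip mutation
    children = np.where(rng.random((10, 4)) < 0.5, parents, rng.permutation(parents))
    children ^= (rng.random((10, 4)) < 0.05).astype(int)
    pop = np.vstack([parents, children])

best = pop[np.argmin([np.sum(encode(ind) != received) for ind in pop])]
print(best, encode(best))                  # decoded information word and its codeword
```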
10072 Password Cracking on Graphics Processing Unit Based Systems
Authors: N. Gopalakrishna Kini, Ranjana Paleppady, Akshata K. Naik
Abstract:
Password authentication is one of the widely used methods to authenticate legitimate users of computers and to defend against attackers. There are many different ways to authenticate users of a system, and many password cracking methods have also been developed. This paper examines how password cracking can best be performed on a CPU-GPGPU based system. The main objective of this work is to show how quickly a password can be cracked, given some knowledge of computer security and password cracking, if sufficient security is not incorporated into the system.
Keywords: GPGPU, password cracking, secret key, user authentication.
10071 Generating Class-Based Test Cases for Interface Classes of Object-Oriented Black Box Frameworks
Authors: Jehad Al Dallal, Paul Sorenson
Abstract:
An application framework provides a reusable design and implementation for a family of software systems. Application developers extend the framework to build their particular applications using hooks. Hooks are the places identified to show how to use and customize the framework. Hooks define the Framework Interface Classes (FICs) and their possible specifications, which helps in building reusable test cases for the implementations of these classes. This paper introduces a novel technique called all paths-state to generate state-based test cases for testing the FICs at the class level. The technique is experimentally evaluated. The empirical evaluation shows that the all paths-state technique produces test cases with a higher degree of coverage of the specifications of the implemented FICs than test cases generated using the round-trip path and all-transition techniques.
Keywords: Hooks, object-oriented framework, framework interface classes (FICs), specification-based testing, test case generation.
10070 Properties of Biodiesel Produced by Enzymatic Transesterification of Lipids Extracted from Microalgae in Supercritical Carbon Dioxide Medium
Authors: Hanifa Taher, Sulaiman Al-Zuhair, Ali H. Al-Marzouqi, Yousef Haik, Mohammed Farid
Abstract:
Biodiesel, as an alternative renewable fuel, has been receiving increasing attention due to the limited supply of fossil fuels and the increasing need for energy. Microalgae are a promising source of lipids, which can be converted into biodiesel. Biodiesel production from microalgae lipids using a lipase-catalyzed reaction in supercritical CO2 medium has several advantages over conventional production processes. However, identifying the optimum microalgae lipid extraction and transesterification conditions is still a challenge. In this study, the quality of biodiesel produced from lipids extracted from Scenedesmus sp. and their enzymatic transesterification using supercritical carbon dioxide has been investigated. At the optimum conditions, the highest biodiesel production yield was found to be 82%. The fuel properties of the biodiesel produced at the optimum reaction conditions, without any separation step, were determined and compared to ASTM standards. The properties were found to comply with the limits and showed a low glycerol content.
Keywords: Biodiesel, fuel standards, lipase, microalgae, supercritical CO2.
10069 Recycling Organic Waste in Suan Sunandha Rajabhat University as Compost
Authors: Anat Thapinta
Abstract:
This research aimed to study the potential of recycling organic waste at Suan Sunandha Rajabhat University as compost. To this end, the composition of solid waste generated on the campus was investigated, while the physical and chemical properties of the organic waste were analyzed in order to evaluate the portion of waste suitable for recycling as compost. The study found that (1) the amount of organic waste averaged 299.8 kg/day, of which mixed food waste had the highest amount at 191.9 kg/day, followed by mixed leaf and yard waste and mixed fruit and vegetable waste at 66.3 and 41.6 kg/day, respectively; (2) regarding the physical and chemical properties of the organic waste, the moisture content was between 69.54% and 78.15%, the major plant nutrients N, P, and K were 0.14 to 0.17%, 0.46 to 0.52%, and 0.16 to 0.18%, respectively, and the carbon/nitrogen (C/N) ratio was about 15:1 to 17.5:1; and (3) recycling the organic waste as compost was designed as aerobic decomposition using mixed food waste : mixed leaf and yard waste : mixed fruit and vegetable waste at a ratio of 3:2:1 by weight, in accordance with their amounts and their physical and chemical properties.
Keywords: Compost, Organic waste, Physical and chemical properties, Recycling.
10068 Website Evaluation of Travel Agencies Class A in Saudi Arabia and Egypt Using Extended Version of Internet Commerce Adoption Model: A Comparative Study
Authors: Tarek Abdel Azim Ahmed, Eman Sarhan Shaker
Abstract:
This research aims to explore how well the extended Model of Internet Commerce Adoption (eMICA) can be used to determine the extent of internet commerce adoption in the travel agency sector in both Egypt and the Kingdom of Saudi Arabia (KSA). The web content analysis method was used to analyze the level of adoption of Egyptian and Saudi travel agencies according to the data available on their websites, and each site was categorized according to the phases and levels proposed. To achieve this, 120 websites were evaluated by the two authors over a three-month period, from August to October 2020, and then categorized according to the phases and levels of eMICA. The results show that there are deficiencies in the application of the eMICA model by both KSA and Egyptian travel agencies, notably in updating their websites and in the absence of quality certification, secure online payment, virtual tours, and videos using Flash animation. In general, the Egyptian companies slightly outperformed the KSA ones in applying the eMICA model.
Keywords: e-commerce, eMICA, Internet marketing, travel agencies, websites.
10067 A New Bound on the Average Information Ratio of Perfect Secret-Sharing Schemes for Access Structures Based On Bipartite Graphs of Larger Girth
Authors: Hui-Chuan Lu
Abstract:
In a perfect secret-sharing scheme, a dealer distributes a secret among a set of participants in such a way that only qualified subsets of participants can recover the secret, and the joint share of the participants in any unqualified subset is statistically independent of the secret. The access structure of the scheme refers to the collection of all qualified subsets. In graph-based access structures, each vertex of a graph G represents a participant and each edge of G represents a minimal qualified subset. The average information ratio of a perfect secret-sharing scheme realizing a given access structure is the ratio of the average length of the shares given to the participants to the length of the secret. The infimum of the average information ratio over all possible perfect secret-sharing schemes realizing an access structure is called the optimal average information ratio of that access structure. We study the optimal average information ratio of access structures based on bipartite graphs. Building on some previous results, we give a bound on the optimal average information ratio for all bipartite graphs of girth at least six. This bound is the best possible for some classes of bipartite graphs using our approach.
Keywords: Secret-sharing scheme, average information ratio, star covering, deduction, core cluster.
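The quantities in the abstract can be restated compactly; the notation below is generic, not necessarily the authors'.

```latex
% Average information ratio of a perfect secret-sharing scheme \Sigma realizing a
% graph-based access structure on a graph G = (V, E) (generic notation):
\[
  AR(\Sigma) \;=\; \frac{\tfrac{1}{|V|}\sum_{v \in V} H(S_v)}{H(K)},
\]
% where H(S_v) is the entropy (length) of the share given to participant v and
% H(K) that of the secret. The optimal average information ratio of G is
\[
  AR^{*}(G) \;=\; \inf_{\Sigma \text{ realizes } G} AR(\Sigma).
\]
```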
10066 Polishing Machine Based on High-Pressure Water Jet
Authors: Mohammad A. Khasawneh
Abstract:
The design of high-pressure water jet based polishing equipment and its fabrication conducted in this study are reported herein, together with some preliminary test results assessing its applicability for HMA surface polishing. This study also provides preliminary findings concerning the test variables experimentally investigated, such as the rotational speed, the water jet pressure, the abrasive agent used, and the impact angle. The preliminary findings, based on four trial tests (two on large slab specimens and two on small gyratory-compacted specimens), indicate that both friction and texture values tend to increase with the polishing duration for two combinations of pressure and rotational speed of the rotary deck. It seems that the more polishing action the specimen is subjected to, the more aggregate edges are created, such that the surface texture values increase with an accompanying increase in friction values. It may be of interest (but is outside the scope of this study) to investigate whether a similar trend exists for HMA prepared with an aggregate source of sand and gravel.
Keywords: High-pressure, water jet, Friction, Texture, Polishing, Statistical Analysis.
10065 CO2 Emission and Cost Optimization of Reinforced Concrete Frame Designed by Performance Based Design Approach
Authors: Jin Woo Hwang, Byung Kwan Oh, Yousok Kim, Hyo Seon Park
Abstract:
As the greenhouse effect has been recognized as a serious environmental problem of the world, interest in carbon dioxide (CO2) emissions, which comprise a major part of greenhouse gas (GHG) emissions, has increased recently. Since the construction industry accounts for a relatively large portion of total worldwide CO2 emissions, extensive studies on reducing CO2 emissions in the construction and operation of buildings have been carried out since the 2000s. Also, the performance based design (PBD) methodology based on nonlinear analysis has been developed vigorously since the 1994 Northridge Earthquake to assess and assure the seismic performance of buildings more exactly, because structural engineers recognized that the prescriptive code based design approach cannot address inelastic earthquake responses directly or assure building performance exactly. Although CO2 emissions and the PBD approach are recent rising issues in the construction industry and structural engineering, few or no studies have considered these two issues simultaneously. Thus, the objective of this study is to minimize the CO2 emissions and cost of a building designed by the PBD approach at the structural design stage, considering the structural materials. A 4-story, 4-span reinforced concrete building was optimally designed to minimize the CO2 emissions and cost of the building while satisfying a specific seismic performance objective (collapse prevention under the maximum considered earthquake) and prescriptive code regulations, using the non-dominated sorting genetic algorithm-II (NSGA-II). The optimized design result showed that minimized CO2 emissions and cost were acquired while satisfying the specified seismic performance. Therefore, the methodology proposed in this paper can be used to reduce both the CO2 emissions and the cost of buildings designed by the PBD approach.
Keywords: CO2 emissions, performance based design, optimization, sustainable design.
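The bi-objective core of such a design search can be sketched as a Pareto-dominance filter over (CO2, cost) scores of candidate designs; the values are hypothetical, and NSGA-II's genetic operators, crowding distance, and the seismic performance checks are omitted.

```python
# Sketch of the bi-objective core only: extract the non-dominated (Pareto) front
# from candidate designs scored by (CO2 emissions, cost). Values are hypothetical;
# NSGA-II's genetic operators and the seismic performance checks are omitted.
def dominates(a, b):
    """a dominates b if it is no worse in both objectives and differs from b."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def pareto_front(designs):
    return [d for d in designs
            if not any(dominates(other, d) for other in designs)]

# (CO2 in tonnes, cost in arbitrary units) for candidate frame designs
candidates = [(120.0, 1.00), (110.0, 1.15), (118.0, 1.20), (105.0, 1.30), (125.0, 0.95)]
print(sorted(pareto_front(candidates)))
# -> [(105.0, 1.3), (110.0, 1.15), (120.0, 1.0), (125.0, 0.95)]
```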
10064 A Case Study in Using the Can-Sized Satellite Platforms for Interdisciplinary Problem-Based Learning in Aeronautical and Electronic Engineering
Authors: Michael Johnson, Vincenzo Oliveri
Abstract:
This work considers an interdisciplinary Problem-Based Learning (PBL) project developed by lecturers from the Aeronautical and the Electronic and Computer Engineering departments at the University of Limerick. This “CANSAT” project utilises the CanSat can-sized satellite platform to allow students from aeronautical and electronic engineering to engage in a mixed-format (online/face-to-face), interdisciplinary PBL assignment using a real-world platform and application. The project introduces students to the design, development, and construction of the CanSat system over the course of a single semester, enabling students to apply their aeronautical and technical skills and capabilities to the realisation of a working CanSat system. In this case study, the CanSat kits are used to pivot the real-world, discipline-relevant PBL goal of designing, building, and testing the CanSat system with payloads from a traditional module-based setting to an online PBL setting. Feedback, impressions, benefits, and challenges identified through the semester are presented. Students found the project to be interesting and rewarding, with the interdisciplinary nature of the project appealing to them. Challenges and difficulties encountered are also addressed, and the solutions developed between the students and facilitators to overcome them are discussed.
Keywords: Problem-Based Learning, Online PBL, Electronic Engineering, Aeronautical Engineering, Interdisciplinary Project, CanSat.
10063 Evaluation on Mechanical Stabilities of Clay-Sand Mixtures Used as Engineered Barrier for Radioactive Waste Disposal
Authors: Ahmet E. Osmanlioglu
Abstract:
In this study, natural bentonite was used as the natural clay material, with samples taken from the Kalecik district in Ankara. In this research, bentonite is analyzed from the standpoint of assessing the basic properties of engineered barriers with respect to the buffer material. Bentonite and sand mixtures were prepared for the tests. Some clay minerals give relatively higher hydraulic conductivity and lower swelling pressure. Generally, the hydraulic conductivity of these types of clay is lower than 10⁻¹² m/s. The hydraulic properties of clay-sand mixtures were evaluated to design the engineered barrier specifications. Hydraulic conductivities of the bentonite-sand mixtures were found to be in the range of 1.2×10⁻¹⁰ to 9.3×10⁻¹⁰ m/s. The optimum bentonite/sand (B/S) mixture ratio was determined to be 35% in terms of hydraulic conductivity and mechanical stability. In the second stage of this study, all samples were compacted into cylindrical molds (diameter: 50 mm, length: 120 mm). The strength properties of the compacted mixtures were better than those of compacted bentonite. In addition, a larger content of quartz sand in the mixture gives a greater thermal conductivity.
Keywords: Bentonite, hydraulic conductivity, clay, nuclear waste disposal.
10062 Proffering a Brand New Methodology to Resource Discovery in Grid based on Economic Criteria Using Learning Automata
Authors: Ali Sarhadi, Mohammad Reza Meybodi, Ali Yousefi
Abstract:
Resource discovery is one of the chief services of a grid. This article proposes a new approach to resource discovery in a grid based on learning automata. The objective of the resource-discovery service is to select resources based on the user's application and on economic criteria, that is, to choose a provider that can accomplish the user's tasks in the most economical manner. The proposed service works in two phases. First, an application-based categorization is performed by means of a neural network: the user's application forms the input vector of the network, and the output vector indicates the suitability of each resource for the submitted task. Second, the most economical of the resources put forward in the previous stage that can fulfil the task is selected. The resource choice is carried out by the presented algorithm based on learning automata.
Keywords: Resource discovery, learning automata, neural network, economic policy
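The learning-automaton selection can be sketched with a linear reward-inaction (L_R-I) update that shifts probability mass toward resources that complete tasks economically; the environment, costs, and reward rule below are hypothetical.

```python
# Sketch of a linear reward-inaction (L_R-I) learning automaton selecting among
# grid resources; the cost-based reward environment below is hypothetical.
import random

resources = ["cheap_cluster", "mid_cluster", "expensive_cluster"]
success_prob = {"cheap_cluster": 0.85, "mid_cluster": 0.80, "expensive_cluster": 0.95}
cost = {"cheap_cluster": 1.0, "mid_cluster": 2.0, "expensive_cluster": 5.0}

probs = [1.0 / len(resources)] * len(resources)     # initial action probabilities
alpha = 0.05                                        # reward (learning) rate
random.seed(0)

for _ in range(2000):
    i = random.choices(range(len(resources)), weights=probs)[0]
    r = resources[i]
    completed = random.random() < success_prob[r]
    rewarded = completed and cost[r] <= 2.0          # economic criterion: cheap enough
    if rewarded:                                     # L_R-I: update only on reward
        probs = [p * (1 - alpha) for p in probs]     # p_j <- (1 - alpha) * p_j
        probs[i] += alpha                            # p_i <- p_i + alpha * (1 - p_i)

print({r: round(p, 3) for r, p in zip(resources, probs)})   # mass shifts to economical resources
```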
10061 Analysing the Elementary Science and Technology Coursebook and Student Workbook in Terms of Constructivism
Authors: Nil Duban
Abstract:
The curriculum of the primary school science course was redesigned on the basis of constructivism in the 2005-2006 academic year in Turkey. In this context, the name of this course was changed to “Science and Technology”, and the content, course books, and student workbooks for this course were redesigned in light of constructivism. The aim of this study is to determine whether the Science and Technology course book and student workbook for primary school 5th grade are appropriate to constructivism by evaluating them in terms of its fundamental principles. Among qualitative research methods, the documentation technique (i.e. document analysis) is applied; for sample selection, criterion sampling, a purposeful sampling technique, is used. When the Science and Technology course book and workbook for the 5th grade in primary education are examined, it is seen that both books complement each other in certain areas. Consequently, it can be claimed that, in spite of some inadequate and missing points, the course book and workbook of the primary school Science and Technology course for the 5th grade students are designed with the principles of constructivism in mind. To overcome the inadequacies in the books, it can be suggested that they be redesigned. In addition, so as not to ignore the technology dimension of the course, activities that encourage the students to prepare projects using the technology cycle should be included.
Keywords: Constructivism, coursebooks, science and technology education.