Search results for: array based primer extension
27646 A Relational Case-Based Reasoning Framework for Project Delivery System Selection
Authors: Yang Cui, Yong Qiang Chen
Abstract:
An appropriate project delivery system (PDS) is crucial to the success of a construction project. Case-based reasoning (CBR) is a useful support for PDS selection. However, the traditional CBR approach represents cases as attribute-value vectors without taking relations among attributes into consideration, and it cannot calculate similarity when the structures of cases are not strictly the same. Therefore, this paper addresses this problem by adopting the relational case-based reasoning (RCBR) approach for PDS selection, considering both structural similarity and feature similarity. To develop the feature terms of the construction projects, the criteria and factors governing the PDS selection process are first identified. Then, feature terms for the construction projects are developed. Finally, the mechanism of similarity calculation and a case study show how RCBR works for PDS selection. The adoption of RCBR in PDS selection expands the scope of application of the traditional CBR method and improves the accuracy of the PDS selection system.
Keywords: relational case-based reasoning, case-based reasoning, project delivery system, PDS selection
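A minimal Python sketch of how a combined RCBR similarity could be scored; the attribute names, relation triples, and the 50/50 weighting are illustrative assumptions, not the authors' actual measure:

```python
# Illustrative sketch (not the authors' implementation) of a combined
# feature + structural similarity for relational case-based reasoning.
def feature_similarity(case_a, case_b):
    # Fraction of shared attributes whose values agree.
    shared = set(case_a["attributes"]) & set(case_b["attributes"])
    if not shared:
        return 0.0
    hits = sum(case_a["attributes"][k] == case_b["attributes"][k] for k in shared)
    return hits / len(shared)

def structural_similarity(case_a, case_b):
    # Jaccard overlap of the relation triples present in each case.
    ra, rb = set(case_a["relations"]), set(case_b["relations"])
    return len(ra & rb) / len(ra | rb) if (ra | rb) else 0.0

def rcbr_similarity(case_a, case_b, w_struct=0.5):
    # Weighted combination; the 0.5 weight is an assumed tuning parameter.
    return (w_struct * structural_similarity(case_a, case_b)
            + (1 - w_struct) * feature_similarity(case_a, case_b))

new_project = {"attributes": {"scope_clarity": "low", "owner_experience": "high"},
               "relations": {("scope_clarity", "influences", "delivery_system")}}
past_case = {"attributes": {"scope_clarity": "high", "owner_experience": "high"},
             "relations": {("scope_clarity", "influences", "delivery_system")}}
print(rcbr_similarity(new_project, past_case))  # 0.75: structures match, one attribute differs
```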
Procedia PDF Downloads 432
27645 Dynamic Model for Forecasting Rainfall Induced Landslides
Authors: R. Premasiri, W. A. H. A. Abeygunasekara, S. M. Hewavidana, T. Jananthan, R. M. S. Madawala, K. Vaheeshan
Abstract:
Forecasting the potential for disastrous events such as landslides has become one of the major necessities in the current world. Most landslides in Sri Lanka are found to be triggered by intense rainfall events. The study area is the landslide near Gerandiella waterfall, which is located by the 41st kilometer post on the Nuwara Eliya-Gampala main road in Kotmale Division in Sri Lanka. The landslide endangers the entire Kotmale town beneath the slope. The Geographic Information System (GIS) platform is very useful when it comes to emulating real-world processes. Such models are used in a wide array of applications, ranging from simple evaluations to forecasting future events. This project investigates the possibility of developing a dynamic model to map the spatial distribution of slope stability. The model incorporates several theoretical models, including the infinite slope model, the Green-Ampt infiltration model and a perched groundwater flow model. A series of rainfall values can be fed to the model as the main input to simulate the dynamics of slope stability. A hydrological model developed using GIS is used to quantify the perched water table height, which is one of the most critical parameters affecting slope stability. The infinite slope stability model is used to quantify the degree of slope stability in terms of the factor of safety. The DEM was built with the use of digitized contour data. Stratigraphy was modeled in Surfer using borehole data and resistivity images. Data available from rainfall gauges and piezometers were used in calibrating the model. During the calibration, the parameters were adjusted until a good fit between the simulated groundwater levels and the piezometer readings was obtained. This model, fed with predicted rainfall values, can be used to forecast the slope dynamics of the area of interest. The stability of slopes prone to rainfall-induced landslides can therefore be investigated across temporal dimensions.
Keywords: factor of safety, geographic information system, hydrological model, slope stability
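For the slope-stability step, a small Python sketch of the standard infinite slope factor-of-safety relation with a perched water table; the soil parameters in the example are assumed values, not data from the study:

```python
import math

def infinite_slope_fs(c_eff, phi_deg, gamma, gamma_w, z, h_w, beta_deg):
    """Factor of safety for the infinite slope model with a perched water table.

    c_eff    : effective cohesion (kPa)
    phi_deg  : effective friction angle (degrees)
    gamma    : unit weight of soil (kN/m^3); gamma_w: unit weight of water
    z        : depth of the failure plane (m); h_w: water table height above it (m)
    beta_deg : slope angle (degrees)
    """
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = c_eff + (gamma * z - gamma_w * h_w) * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Example: in the dynamic model, h_w would come from the perched groundwater flow model.
print(round(infinite_slope_fs(c_eff=5.0, phi_deg=30.0, gamma=18.0,
                              gamma_w=9.81, z=2.0, h_w=1.0, beta_deg=35.0), 2))
```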
Procedia PDF Downloads 423
27644 Graphene-Based Nanocomposites for Glucose and Ethanol Enzymatic Biosensor Fabrication
Authors: Tesfaye Alamirew, Delele Worku, Solomon W. Fanta, Nigus Gabbiye
Abstract:
Recently, graphene-based nanocomposites have become an emerging research area for the fabrication of enzymatic biosensors due to their large surface area, conductivity and biocompatibility. This review summarizes recent research reports on graphene-based nanocomposites for the fabrication of glucose and ethanol enzymatic biosensors. The newly fabricated enzyme-free, microwave-treated, nitrogen-doped graphene (MN-d-GR) provided the highest sensitivity towards glucose, and the GCE/rGO/AuNPs/ADH composite provided by far the highest sensitivity towards ethanol compared to other reported graphene-based nanocomposites. The MWCNT/GO/GOx and GCE/ErGO/PTH/ADH nanocomposites also exhibited wide linear ranges for glucose and ethanol detection, respectively. Generally, graphene-based nanocomposite enzymatic biosensors show fast direct electron transfer rates, high sensitivity and wide linear detection ranges during glucose and ethanol sensing.
Keywords: glucose, ethanol, enzymatic biosensor, graphene, nanocomposite
Procedia PDF Downloads 126
27643 Seismic Refraction and Resistivity Survey of Ini Local Government Area, South-South Nigeria: Assessing Structural Setting and Groundwater Potential
Authors: Mfoniso Udofia Aka
Abstract:
A seismic refraction and resistivity survey was conducted in Ini Local Government Area, South-South Nigeria, to evaluate the structural setting and groundwater potential. The study involved 20 Vertical Electrical Soundings (VES) using an ABEM Terrameter with a Schlumberger array and a 400-meter electrode spread, analyzed with WinResist software. Concurrently, 20 seismic refraction surveys were performed with a Geometric ES 3000 12-Channel seismograph, employing a 60-meter slant interval. The survey identified three distinct geological layers: top, middle, and lower. Seismic velocities (Vp) ranged from 209 to 500 m/s in the top layer, 221 to 1210 m/s in the middle layer, and 510 to 1700 m/s in the lower layer. Secondary seismic velocities (Vs) ranged from 170 to 410 m/s in the topsoil, 205 to 880 m/s in the middle layer, and 480 to 1120 m/s in the lower layer. Poisson’s ratios varied from -0.029 to -7.709 for the top layer, -0.027 to -6.963 for the middle layer, and -0.144 to -6.324 for the lower layer. The depths of these layers were approximately 1.0 to 3.0 meters for the top layer, 4.0 to 12.0 meters for the middle layer, and 8.0 to 14.5 meters for the lower layer. The topsoil consists of a surficial layer overlaid by reddish/clayey laterite and fine to medium coarse-grained sandy material, identified as the auriferous zone. Resistivity values were 1300 to 3215 Ωm for the topsoil, 720 to 1600 Ωm for the laterite, and 100 to 1350 Ωm for the sandy zone. Aquifer thickness and depth varied, with shallow aquifers ranging from 4.5 to 15.2 meters, medium-depth aquifers from 15.5 to 70.0 meters, and deep aquifers from 4.0 to 70.0 meters. Locations 1, 15, and 13 exhibited favorable water potential with shallow formations, while locations 5, 11, 9, and 14 showed less potential due to the lack of fractured or weathered zones. The auriferous sandy zone indicated significant potential for industrial development. Future surveys should consider using a more robust energy source to enhance data acquisition and accuracy.
Keywords: hydrogeological, aquifer, seismic section, geo-electric section, stratigraphy
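A short Python sketch of the standard relation used to derive a dynamic Poisson's ratio from P- and S-wave velocities; the Vp/Vs pairings below are illustrative, not the survey's measured pairs:

```python
def poissons_ratio(vp, vs):
    """Dynamic Poisson's ratio from P-wave (vp) and S-wave (vs) velocities, m/s."""
    return (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))

# Velocity pairs chosen from the reported ranges purely for illustration;
# when vs approaches vp the ratio becomes negative.
for vp, vs in [(1700, 1120), (1210, 880), (500, 410)]:
    print(f"Vp={vp} m/s, Vs={vs} m/s -> nu = {poissons_ratio(vp, vs):.3f}")
```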
Procedia PDF Downloads 30
27642 A Geospatial Analysis of Residential Conservation-Attitude, Intention and Behavior
Authors: Prami Sengupta, Randall A. Cantrell, Tracy Johns
Abstract:
A typical US household consumes more energy than households in other countries and is directly responsible for a considerable proportion of the atmospheric concentration of greenhouse gases. This makes U.S. households a vital target group for energy conservation studies. Positive household behavior is central to residential energy conservation. However, for individuals to conserve energy, they must not only know how to conserve energy but also be willing to do so. That is, a positive attitude towards residential conservation and an intention to conserve energy are two of the most important psychological determinants of energy conservation behavior. Most social science studies to date have studied the relationships between attitude, intention, and behavior by building upon socio-psychological theories of behavior. However, these frameworks, including the widely used Theory of Planned Behavior and Social Cognitive Theory, lack a spatial component. That is, these studies fail to capture the impact of the geographical locations of homeowners’ residences on their residential energy consumption and conservation practices. Therefore, the purpose of this study is to explore geospatial relationships between homeowners’ residential energy conservation-attitudes, conservation-intentions, and consumption behavior. The study analyzes residential conservation-attitudes and conservation-intentions of homeowners across 63 counties in Florida and compares them with quantifiable measures of residential energy consumption. Empirical findings revealed that the spatial distributions of high and/or low mean-score values of homeowners’ conservation-attitudes and conservation-intentions are more spatially clustered than would be expected if the underlying spatial processes were random. On the contrary, the spatial distribution of high and/or low values of households’ carbon footprints was found to be more spatially dispersed than would be expected if the underlying spatial process were random. The study also examined the influence of potential spatial variables, such as urban or rural setting and the presence of educational institutions and/or extension programs, on the conservation-attitudes, intentions, and behaviors of homeowners.
Keywords: conservation-attitude, conservation-intention, geospatial analysis, residential energy consumption, spatial autocorrelation
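A compact Python sketch of global Moran's I, the usual statistic behind such clustered-versus-dispersed judgments; the toy weights matrix and area scores are invented for illustration, and the study's actual spatial analysis may differ:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for spatial autocorrelation.

    values  : 1-D array of area-level scores (e.g., mean conservation-attitude per county).
    weights : n x n spatial weights matrix (e.g., 1 for neighbouring areas, else 0).
    """
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()
    numerator = np.sum(w * np.outer(z, z))
    denominator = np.sum(z ** 2)
    return (x.size / w.sum()) * (numerator / denominator)

# Toy example: four areas along a line (rook contiguity); the data are made up.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
scores = np.array([4.1, 4.0, 2.2, 2.1])
print(round(morans_i(scores, w), 3))   # positive value -> spatially clustered pattern
```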
Procedia PDF Downloads 192
27641 Flow Reproduction Using Vortex Particle Methods for Wake Buffeting Analysis of Bluff Structures
Authors: Samir Chawdhury, Guido Morgenthal
Abstract:
The paper presents a novel extension of Vortex Particle Methods (VPM) in which the study aims to reproduce a template simulation of the complex flow field generated by impulsively started flow past an upstream bluff body at a certain Reynolds number Re. Vibration of a structural system under upstream wake flow is often its governing design criterion; therefore, attention is given in this study especially to the reproduction of the wake flow simulation. The basic methodology for the implementation of flow reproduction requires downstream velocity sampling from the template flow simulation. At particular distances from the upstream section, the instantaneous velocity components are sampled using a series of square sampling cells arranged vertically, where each cell contains four velocity sampling points at its corners. Since the grid-free Lagrangian VPM algorithm discretises vorticity on particle elements, the method requires transformation of the velocity components into vortex circulation, and finally the template flow field is reproduced by seeding these vortex circulations, or particles, into a free stream flow. It is noteworthy that the vortex particles have to be released into the free stream at exactly the same rate as the velocity sampling. Studies have been done, specifically, in terms of different sampling rates and velocity sampling positions to find their effects on flow reproduction quality. The quality assessments are mainly done, using a downstream flow monitoring profile, by comparing the characteristic wind flow profiles using several statistical turbulence measures. Additionally, the comparisons are performed using velocity time histories, snapshots of the flow fields, and the vibration of a downstream bluff section by performing wake buffeting analyses of the section under the original and reproduced wake flows. A convergence study is performed for the validation of the method. The study also describes possibilities for achieving flow reproductions with less computational effort.
Keywords: vortex particle method, wake flow, flow reproduction, wake buffeting analysis
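A small Python sketch of one way the sampled corner velocities of a square cell could be converted into a vortex circulation via a discrete line integral; the cell size and velocity values are made up, and the authors' exact transformation may differ:

```python
import numpy as np

def cell_circulation(corners, velocities):
    """Approximate circulation of one square sampling cell.

    corners    : 4 x 2 array of corner coordinates, ordered counter-clockwise.
    velocities : 4 x 2 array of sampled (u, v) velocities at those corners.
    The line integral of velocity around the cell is evaluated edge by edge
    with the trapezoidal rule; the result is the vortex strength to seed.
    """
    corners = np.asarray(corners, dtype=float)
    velocities = np.asarray(velocities, dtype=float)
    gamma = 0.0
    for i in range(4):
        j = (i + 1) % 4
        dl = corners[j] - corners[i]                   # edge vector
        u_avg = 0.5 * (velocities[i] + velocities[j])  # mean velocity along the edge
        gamma += float(np.dot(u_avg, dl))
    return gamma

# Hypothetical 0.1 m cell in a shear-dominated wake region (all values invented).
corners = [(0.0, 0.0), (0.1, 0.0), (0.1, 0.1), (0.0, 0.1)]
velocities = [(1.0, 0.0), (1.0, 0.05), (0.8, 0.05), (0.8, 0.0)]
print(cell_circulation(corners, velocities))
```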
Procedia PDF Downloads 311
27640 The Impact of Diversification Strategy on Leverage and Accrual-Based Earnings Management
Authors: Safa Lazzem, Faouzi Jilani
Abstract:
The aim of this research is to investigate the impact of diversification strategy on the nature of the relationship between leverage and accrual-based earnings management through panel-estimation techniques, based on a sample of 162 nonfinancial French firms indexed in CAC All-Tradable during the period from 2006 to 2012. The empirical results show that leverage increases encourage managers to engage in earnings management. Our findings prove that the diversification strategy provides the needed context for this accounting practice to be possible in highly diversified firms. In addition, the results indicate that diversification moderates the relationship between leverage and accrual-based earnings management by changing the nature and the sign of this relationship.
Keywords: diversification, earnings management, leverage, panel-estimation techniques
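A hedged sketch of a panel regression with an interaction term, which is one common way to test such moderation; the file name, column names, and two-way fixed-effects specification are assumptions, not the authors' exact model:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel layout: one row per firm-year with discretionary accruals,
# leverage, and a diversification measure (e.g., an entropy index).
df = pd.read_csv("firm_panel.csv")   # assumed columns: firm, year, accruals, leverage, divers

# Firm and year fixed effects via dummies; the interaction term tests whether
# diversification moderates the leverage / earnings-management relationship.
model = smf.ols("accruals ~ leverage * divers + C(firm) + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["firm"]})
print(model.params[["leverage", "divers", "leverage:divers"]])
```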
Procedia PDF Downloads 150
27639 Comparing Student Performance on Paper-Based versus Computer-Based Formats of Standardized Tests
Authors: Jin Koo
Abstract:
During the coronavirus pandemic, there has been a further increase in demand for computer-based tests (CBT), and they have now become an important test mode. The main purpose of this study is to investigate the comparability of student scores obtained from computer-based formats of a standardized test in the two subject areas of reading and mathematics. Also, this study investigates whether there is an interaction effect between test mode (CBT versus paper-based tests, PBT) and gender/ability level in each subject area. The test used in this study is a multiple-choice standardized test for students in grades 8-11. For this study, data were collected during four test administrations (2015-16, 2017-18, and 2020-21). This research used a one-factor between-subjects ANOVA to compare the PBT and CBT groups’ test means for each subject area (reading and mathematics). Also, two-factor between-subjects ANOVAs were conducted to investigate examinee characteristics: gender (male and female), ethnicity (African-American, Asian, Hispanic, multi-racial, and White), and ability level (low, average, and high-ability groups). The author found that students’ test scores in the two subject areas varied across CBT and PBT by gender and ability level, meaning that gender, ethnicity, and ability level were related to the score difference. These results will be discussed in relation to current testing systems. In addition, this study’s results will open up to school teachers and test developers the possible influence that gender, ethnicity, and ability level have on a student’s score based on whether they take the CBT or PBT.
Keywords: ability level, computer-based, gender, paper-based, test
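A brief sketch of the one-factor and two-factor between-subjects ANOVAs described above, using statsmodels; the file and column names are placeholders for the actual score data:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per examinee with scale score,
# test mode (CBT / PBT), gender, and ability level.
scores = pd.read_csv("reading_scores.csv")   # assumed columns: score, mode, gender, ability

# One-factor model: does the test mode alone shift mean scores?
one_way = smf.ols("score ~ C(mode)", data=scores).fit()
print(sm.stats.anova_lm(one_way, typ=2))

# Two-factor model: mode x gender interaction (repeat with ethnicity or ability level).
two_way = smf.ols("score ~ C(mode) * C(gender)", data=scores).fit()
print(sm.stats.anova_lm(two_way, typ=2))
```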
Procedia PDF Downloads 100
27638 Content-Based Color Image Retrieval Based on the 2-D Histogram and Statistical Moments
Authors: El Asnaoui Khalid, Aksasse Brahim, Ouanan Mohammed
Abstract:
In this paper, we are interested in the problem of finding similar images in a large database. For this purpose, we propose a new algorithm based on a combination of 2-D histogram intersection in the HSV space and statistical moments. The proposed histogram is based on a 3x3 window and not only on the intensity of the pixel. This approach can overcome the drawback of the conventional 1-D histogram, which ignores the spatial distribution of pixels in the image, while the statistical moments are used to escape the effects of the discretisation of the color space that is intrinsic to the use of histograms. We compare the performance of our new algorithm to various state-of-the-art methods and show that it has several advantages. It is fast, consumes little memory, and requires no learning. To validate our results, we apply this algorithm to search for similar images in different image databases.
Keywords: 2-D histogram, statistical moments, indexing, similarity distance, histograms intersection
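An illustrative Python sketch of a similarity measure combining 2-D hue-saturation histogram intersection with color moments; the weighting scheme is assumed, and the paper's 3x3-window refinement of the histogram is omitted here:

```python
import numpy as np

def hs_histogram(hsv_img, bins=16):
    """Normalised 2-D hue-saturation histogram of an HSV image (H, S scaled to [0, 1])."""
    h, s = hsv_img[..., 0].ravel(), hsv_img[..., 1].ravel()
    hist, _, _ = np.histogram2d(h, s, bins=bins, range=[[0, 1], [0, 1]])
    return hist / hist.sum()

def histogram_intersection(h1, h2):
    return np.minimum(h1, h2).sum()          # 1.0 means identical distributions

def colour_moments(hsv_img):
    """Mean and standard deviation per channel, used to soften quantisation effects."""
    flat = hsv_img.reshape(-1, 3)
    return np.concatenate([flat.mean(axis=0), flat.std(axis=0)])

def similarity(img_a, img_b, alpha=0.7):
    # alpha is an assumed weight between the two cues, not taken from the paper.
    inter = histogram_intersection(hs_histogram(img_a), hs_histogram(img_b))
    moment_dist = np.linalg.norm(colour_moments(img_a) - colour_moments(img_b))
    return alpha * inter + (1 - alpha) / (1.0 + moment_dist)

# Random stand-ins for two HSV images with channel values in [0, 1].
rng = np.random.default_rng(0)
a, b = rng.random((64, 64, 3)), rng.random((64, 64, 3))
print(round(similarity(a, b), 3))
```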
Procedia PDF Downloads 457
27637 Seismic Performance of Concrete Moment Resisting Frames in Western Canada
Authors: Ali Naghshineh, Ashutosh Bagchi
Abstract:
Performance-based seismic design concepts are increasingly being adopted in various jurisdictions. While the National Building Code of Canada (NBCC) is not fully performance-based, it provides some features of a performance-based code, such as displacement control and objective-based solutions. Performance evaluation is an important part of a performance-based design. In this paper, the seismic performance of a set of code-designed 4-, 8- and 12-story moment-resisting concrete frames located in Victoria, BC, in the western part of Canada, has been studied at different hazard levels, namely SLE (Service Level Event), DLE (Design Level Event) and MCE (Maximum Considered Event). The seismic performance of these buildings has been evaluated based on FEMA 356 and ATC 72 procedures and nonlinear time history analysis. Pushover analysis has been used to investigate the different performance levels of these buildings and adjust their design based on the corresponding target displacements. Since pushover analysis ignores higher mode effects, nonlinear dynamic time history analysis using a set of ground motion records has been performed. Different types of ground motion records, such as crustal and subduction earthquake records, have been used for the dynamic analysis to determine their effects. Results obtained from pushover analysis on inter-story drift, displacement, shear and overturning moment are compared to those from the dynamic analysis.
Keywords: seismic performance, performance-based design, concrete moment resisting frame, crustal earthquakes, subduction earthquakes
Procedia PDF Downloads 264
27636 The Composer’s Hand: An Analysis of Arvo Pärt’s String Orchestral Work, Psalom
Authors: Mark K. Johnson
Abstract:
Arvo Pärt has composed over 80 text-based compositions based on nine different languages. But prior to 2015, it was not publicly known what texts the composer used in composing a number of his non-vocal works, nor the language of those texts. Because of this lack of information, few if any musical scholars have illustrated in any detail how textual structure applies to any of Pärt’s instrumental compositions. However, in early 2015, the Arvo Pärt Centre in Estonia published In Principio, a compendium of the texts Pärt has used to derive many of the parameters of his text-based compositions. This paper provides the first detailed analysis of the relationship between structural aspects of the Church Slavonic Eastern Orthodox text of Psalm 112 and the musical parameters that Pärt used when composing the string orchestral work Psalom. It demonstrates that Pärt’s text-based compositions are carefully crafted works, and that evidence of the presence of the ‘invisible’ hand of the composer can be found within every aspect of the underpinning structures, at the more elaborate middle ground level, and even within surface aspects of these works. Based on the analysis of Psalom, it is evident that the text Pärt selected for Psalom informed many of his decisions regarding the musical structures, parameters and processes that he deployed in composing this non-vocal text-based work. Many of these composerly decisions in relation to these various aspects cannot be fathomed without access to, and an understanding of, the text associated with the work.
Keywords: Arvo Pärt, minimalism, psalom, text-based process music
Procedia PDF Downloads 234
27635 Deep Learning-Based Object-Classes Semantic Classification of Arabic Texts
Authors: Imen Elleuch, Wael Ouarda, Gargouri Bilel
Abstract:
We propose in this paper a deep learning-based approach to classify text in order to enrich an Arabic ontology based on the object classes of Gaston Gross. Those object classes are defined by taking into account the syntactic and semantic features of the treated language. Thus, our proposed approach is a hybrid one. In fact, it is based on the one hand on the object classes, which represent a knowledge-based approach to text classification, and on the other hand it uses a deep learning approach that applies a word-embedding-based method to classify text. We have applied our proposed approach to a corpus constructed from an Arabic dictionary. The obtained semantic classification of text will enrich the Arabic object-classes ontology: new classes can be added to the ontology, or the features that characterize each object class can be expanded and updated. The obtained results are compared to a similar work that treats the same object with a classical linguistic approach for the semantic classification of text. This comparison highlights our hybrid proposed approach, which can be improved by broadening the dataset used in the deep learning process.
Keywords: deep-learning approach, object-classes, semantic classification, Arabic
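A minimal sketch of a word-embedding text classifier of the kind described, using Keras; the vocabulary size, sequence length, number of object classes, and the random stand-in data are all assumptions:

```python
import numpy as np
import tensorflow as tf

# Assumed setup: sentences are already tokenised into integer id sequences and each
# sentence is labelled with one of Gaston Gross's object classes.
VOCAB_SIZE, MAX_LEN, NUM_CLASSES = 20_000, 60, 8   # assumed sizes

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 128),       # learned word embeddings
    tf.keras.layers.GlobalAveragePooling1D(),          # sentence representation
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-in data; in practice the dictionary-derived corpus would be used.
x = np.random.randint(1, VOCAB_SIZE, size=(512, MAX_LEN))
y = np.random.randint(0, NUM_CLASSES, size=(512,))
model.fit(x, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(x[:1], verbose=0).argmax())        # predicted object class id
```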
Procedia PDF Downloads 88
27634 Understanding English Language in Career Development of Academics in Non-English Speaking HEIs: A Systematic Literature Review
Authors: Ricardo Pinto Mario Covele, Patricio V. Langa, Patrick Swanzy
Abstract:
The English language has been recognized as a universal medium of instruction in academia, especially in Higher Education Institutions (HEIs), hence exerting enormous influence within the context of research and publication. By extension, the English language has been embraced by scholars from non-English speaking countries. The purpose of this review was to synthesize the discussions using four databases. Discussion of the English language in the career development of academics, particularly in non-English speaking universities, remains largely less visible. This paper seeks to fill this gap and to improve the visibility of the English language in the career development of academics, focusing on non-English speaking universities, by undertaking a systematic literature review. More specifically, the paper addresses language policy, the English language learning model as a second language, the sociolinguistic field and career development, the methods used, as well as the main findings. This review analyzed 75 relevant resources sourced from the Western Cape’s Library, Scopus, Google Scholar, and Web of Science databases from November 2020 to July 2021, using the PQRS framework as an analytical lens. The paper’s findings demonstrate that, while higher education continues to face challenges of English language usage, literature targeting non-English speaking universities remains less discussed than it is often described. The findings also demonstrate the dominance of English language policy, both for knowledge production and for the dissemination of literature, challenging emerging scholars from non-English speaking HEIs. Hence, the paper argues for the need to reconsider the context of non-English language speakers in research on the English language in the career development of academics, both as empirical fields and as emerging knowledge producers. More importantly, the study reveals two bodies of literature: (1) the instrumentalist approach to English language learning and (2) the intercultural approach to the English language for career opportunities, classified as appropriate for explaining the English language learning process and how it is perceived in relation to scholars’ academic careers in HEIs.
Keywords: English language, public and private universities, language policy, career development, non-English speaking countries
Procedia PDF Downloads 154
27633 Methods for Enhancing Ensemble Learning or Improving Classifiers of This Technique in the Analysis and Classification of Brain Signals
Authors: Seyed Mehdi Ghezi, Hesam Hasanpoor
Abstract:
This scientific article explores enhancement methods for ensemble learning with the aim of improving the performance of classifiers in the analysis and classification of brain signals. The research approach in this field consists of two main parts, each with its own strengths and weaknesses. The choice of approach depends on the specific research question and available resources. By combining these approaches and leveraging their respective strengths, researchers can enhance the accuracy and reliability of classification results, consequently advancing our understanding of the brain and its functions. The first approach focuses on utilizing machine learning methods to identify the best features among the vast array of features present in brain signals. The selection of features varies depending on the research objective, and different techniques have been employed for this purpose. For instance, the genetic algorithm has been used in some studies to identify the best features, while optimization methods have been utilized in others to identify the most influential features. Additionally, machine learning techniques have been applied to determine the influential electrodes in classification. Ensemble learning plays a crucial role in identifying the best features that contribute to learning, thereby improving the overall results. The second approach concentrates on designing and implementing methods for selecting the best classifier or utilizing meta-classifiers to enhance the final results in ensemble learning. In a different section of the research, a single classifier is used instead of multiple classifiers, employing different sets of features to improve the results. The article provides an in-depth examination of each technique, highlighting their advantages and limitations. By integrating these techniques, researchers can enhance the performance of classifiers in the analysis and classification of brain signals. This advancement in ensemble learning methodologies contributes to a better understanding of the brain and its functions, ultimately leading to improved accuracy and reliability in brain signal analysis and classification.
Keywords: ensemble learning, brain signals, classification, feature selection, machine learning, genetic algorithm, optimization methods, influential features, influential electrodes, meta-classifiers
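A short scikit-learn sketch combining the two approaches discussed: univariate feature selection followed by a stacking ensemble with a meta-classifier; the feature counts and base learners are illustrative choices, and the brain-signal features are simulated:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Stand-in for epoched EEG feature vectors (e.g., band powers per electrode).
X, y = make_classification(n_samples=300, n_features=120, n_informative=15, random_state=0)

# Step 1: keep only the most informative features (first approach).
# Step 2: combine several base classifiers through a meta-classifier (second approach).
ensemble = make_pipeline(
    SelectKBest(f_classif, k=30),
    StackingClassifier(
        estimators=[("svm", SVC(probability=True)),
                    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
                    ("lr", LogisticRegression(max_iter=1000))],
        final_estimator=LogisticRegression(max_iter=1000)),
)
print(cross_val_score(ensemble, X, y, cv=5).mean())   # cross-validated accuracy
```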
Procedia PDF Downloads 76
27632 Hardware Co-Simulation Based Direct Torque Control for Induction Motor Drive
Authors: Hanan Mikhael Dawood, Haider Salim, Jafar Al-Wash
Abstract:
This paper presents a Proportional-Integral (PI) controller to improve the system performance, giving better torque and flux response. In addition, it reduces the undesirable torque ripple. The conventional DTC controller approach for induction machines, based on an improved torque and stator flux estimator, is implemented using Xilinx System Generator (XSG) for the MATLAB/Simulink environment through Xilinx blocksets. The design was achieved in VHDL based on a MATLAB/Simulink simulation model. The hardware-in-the-loop results are obtained considering the implementation of the proposed model on the Xilinx NEXYS2 Spartan 3E1200 FG320 Kit.
Keywords: induction motor, Direct Torque Control (DTC), Xilinx FPGA, motor drive
Procedia PDF Downloads 622
27631 Developed Text-Independent Speaker Verification System
Authors: Mohammed Arif, Abdessalam Kifouche
Abstract:
Speech is a very convenient way of communication between people and machines. It conveys information about the identity of the talker. Since speaker recognition technology is increasingly securing our everyday lives, the objective of this paper is to develop two automatic text-independent speaker verification (TI SV) systems using low-level spectral features and machine learning methods. (i) The first system is based on a support vector machine (SVM), which has been widely used in voice signal processing for speaker recognition, i.e., verifying the identity of the speaker based on voice characteristics; and (ii) the second is based on a Gaussian Mixture Model (GMM) with a Universal Background Model (UBM), which combines information from different resources to complement the SVM-based system.
Keywords: speaker verification, text-independent, support vector machine, Gaussian mixture model, cepstral analysis
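A simplified sketch of GMM-UBM scoring with a log-likelihood-ratio decision; the cepstral features are replaced by random stand-ins, and full MAP adaptation of the speaker model is omitted:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Stand-ins for cepstral feature frames (e.g., MFCC vectors); real features would
# come from the enrolment and test recordings.
rng = np.random.default_rng(0)
background = rng.normal(0.0, 1.0, size=(5000, 13))     # pooled impostor speech
target_enrol = rng.normal(0.5, 1.0, size=(800, 13))    # claimed speaker's speech
test_utterance = rng.normal(0.5, 1.0, size=(300, 13))

ubm = GaussianMixture(n_components=32, covariance_type="diag", random_state=0)
ubm.fit(background)                                     # Universal Background Model

speaker = GaussianMixture(n_components=32, covariance_type="diag", random_state=0)
speaker.fit(target_enrol)                               # speaker model (MAP adaptation is the usual refinement)

# Average log-likelihood ratio per frame; accept the claim when it exceeds a tuned threshold.
llr = speaker.score(test_utterance) - ubm.score(test_utterance)
print("accept" if llr > 0.0 else "reject", round(llr, 3))
```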
Procedia PDF Downloads 58
27630 A Neural Approach for Color-Textured Images Segmentation
Authors: Khalid Salhi, El Miloud Jaara, Mohammed Talibi Alaoui
Abstract:
In this paper, we present a neural approach for unsupervised segmentation of natural color-texture images, based on both Kohonen maps and mathematical morphology, using a combination of the texture and color information of the image. Namely, fractal features based on the fractal dimension are selected to represent the texture information, and the color features are represented in the RGB color space. These features are then used to train the Kohonen network, which represents the underlying probability density function; the segmentation of this map is made by the morphological watershed transformation. The performance of our color-texture segmentation approach is compared first to color-based or texture-based methods only, and then to the k-means method.
Keywords: segmentation, color-texture, neural networks, fractal, watershed
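A minimal NumPy sketch of the Kohonen map training step on combined color-plus-fractal feature vectors; the grid size, learning-rate schedule, and random features are assumptions, and the watershed segmentation of the trained map is not shown:

```python
import numpy as np

def train_som(features, grid=(8, 8), epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Kohonen self-organising map: each node keeps a prototype in feature space.

    features: n x d array, e.g., RGB colour values plus a fractal-dimension term.
    """
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.random((rows, cols, features.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                 # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 0.5     # shrinking neighbourhood
        for x in rng.permutation(features):
            dist = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(dist.argmin(), dist.shape)    # best-matching unit
            grid_dist = np.linalg.norm(coords - np.array(bmu), axis=2)
            h = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))     # neighbourhood function
            weights += lr * h[..., None] * (x - weights)
    return weights

# Stand-in feature vectors (3 colour channels + 1 texture term), values in [0, 1].
feats = np.random.default_rng(1).random((500, 4))
som = train_som(feats)
print(som.shape)   # the trained map is what the watershed step would then segment
```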
Procedia PDF Downloads 348
27629 Improved Clothing Durability as a Lifespan Extension Strategy: A Framework for Measuring Clothing Durability
Authors: Kate E Morris, Mark Sumner, Mark Taylor, Amanda Joynes, Yue Guo
Abstract:
Garment durability, which encompasses physical and emotional factors, has been identified as a critical ingredient in producing clothing with increased lifespans, battling overconsumption, and subsequently tackling the catastrophic effects of climate change. Eco-design for Sustainable Products Regulation (ESPR) and Extended Producer Responsibility (EPR) schemes have been suggested and will be implemented across Europe and the UK, which might require brands to declare a garment’s durability credentials to be able to sell in that market. There is currently no consistent method of measuring the overall durability of a garment. Measuring the physical durability of garments is difficult, and current assessment methods lack objectivity and reliability or don’t reflect the complex nature of durability for different garment categories. This study presents a novel and reproducible methodology for testing and ranking the absolute durability of 5 commercially available garment types: Formal Trousers, Casual Trousers, Denim Jeans, Casual Leggings and Underwear. A total of 112 garments from 21 UK brands were assessed. Due to variations in end use, different factors were considered across the different garment categories when evaluating durability. A physical testing protocol was created, tailored to each category, to dictate the necessary test results needed to measure the absolute durability of the garments. Multiple durability factors were used to modulate the ranking, as opposed to previous studies which only reported on single factors to evaluate durability. The garments in this study were donated by the signatories of the Waste Resource Action Programme’s (WRAP) Textile 2030 initiative as part of their strategy to reduce the environmental impact of UK fashion. This methodology presents a consistent system for brands and policymakers to follow to measure and rank various garment types’ physical durability. Furthermore, with such a methodology, the durability of garments can be measured and new standards for improving durability can be created to enhance utilisation and improve the sustainability of the clothing on the market.
Keywords: circularity, durability, garment testing, ranking
Procedia PDF Downloads 37
27628 Dynamics of Agricultural Information and Effect on Income of Melon Farmers in Enugu Ezike Agricultural Zone of Enugu State, Nigeria
Authors: Iwuchukwu J. C., Ekeh G. Madukwe, M. C., Asadu A. N.
Abstract:
Melon is of significant importance because it is easy to plant, early maturing, has a low nutrient requirement and is high yielding. Yet many melon farmers in the study area are either diversifying or abandoning this enterprise, probably because of a lack of agricultural knowledge/information and a consequent reduction in output and income. The study was therefore carried out to assess the effects of agricultural information on the income of melon farmers in the Enugu-Ezike Agricultural Zone of Enugu State, Nigeria. Three blocks, nine circles and ninety melon farmers who were purposively selected constituted the sample for the study. Data were collected with an interview schedule. Percentages and charts were used to present some of the data, while some were analysed with mean scores and correlation. The findings reveal that the average annual income of these respondents from melon was about seven thousand five hundred Naira (approximately forty-five Dollars), while their total average monthly income (income from melon and other sources) was about one thousand two hundred Naira (approximately seven Dollars). About 42% and 62% of the respondents, in their respective order, did not receive information on agricultural matters and on melon production. Among the minority that received information on melon production, most sourced it from neighbours/friends/relatives. The majority of the respondents needed information on how to plant melon through interpersonal contact (face to face), using the Igbo language as the medium of communication and an extension agent as teacher or resource person. The study also reveals a significant and positive relationship between the number of times respondents received information on agriculture and their total monthly income. There was also a strong, positive and significant relationship between the number of times respondents received information on melon and their annual income from melon production. The study therefore recommends that governmental and non-governmental organizations/institutions should strengthen these farmers’ access to information on agriculture and on melon specifically so as to boost their output and income.
Keywords: farmers, income, information, melon
Procedia PDF Downloads 246
27627 Lexicon-Based Sentiment Analysis for Stock Movement Prediction
Authors: Zane Turner, Kevin Labille, Susan Gauch
Abstract:
Sentiment analysis is a broad and expanding field that aims to extract and classify opinions from textual data. Lexicon-based approaches are based on the use of a sentiment lexicon, i.e., a list of words each mapped to a sentiment score, to rate the sentiment of a text chunk. Our work focuses on predicting stock price change using a sentiment lexicon built from financial conference call logs. We present a method to generate a sentiment lexicon based upon an existing probabilistic approach. By using a domain-specific lexicon, we outperform traditional techniques and demonstrate that domain-specific sentiment lexicons provide higher accuracy than generic sentiment lexicons when predicting stock price change.
Keywords: computational finance, sentiment analysis, sentiment lexicon, stock movement prediction
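A toy Python sketch of lexicon-based scoring of a conference-call excerpt; the lexicon entries, the excerpt, and the decision threshold are invented for illustration and are not drawn from the authors' financial lexicon:

```python
# Hypothetical domain lexicon: word -> sentiment score in [-1, 1].
finance_lexicon = {"growth": 0.8, "beat": 0.7, "strong": 0.6,
                   "decline": -0.7, "headwind": -0.6, "miss": -0.8}

def chunk_sentiment(text, lexicon):
    # Average the scores of the lexicon words found in the text chunk.
    tokens = text.lower().split()
    scores = [lexicon[t] for t in tokens if t in lexicon]
    return sum(scores) / len(scores) if scores else 0.0

call_excerpt = "Revenue growth was strong despite one headwind in logistics"
score = chunk_sentiment(call_excerpt, finance_lexicon)
predicted_move = "up" if score > 0.05 else "down" if score < -0.05 else "flat"
print(round(score, 2), predicted_move)
```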
Procedia PDF Downloads 128
27626 Lexicon-Based Sentiment Analysis for Stock Movement Prediction
Authors: Zane Turner, Kevin Labille, Susan Gauch
Abstract:
Sentiment analysis is a broad and expanding field that aims to extract and classify opinions from textual data. Lexicon-based approaches are based on the use of a sentiment lexicon, i.e., a list of words each mapped to a sentiment score, to rate the sentiment of a text chunk. Our work focuses on predicting stock price change using a sentiment lexicon built from financial conference call logs. We introduce a method to generate a sentiment lexicon based upon an existing probabilistic approach. By using a domain-specific lexicon, we outperform traditional techniques and demonstrate that domain-specific sentiment lexicons provide higher accuracy than generic sentiment lexicons when predicting stock price change.
Keywords: computational finance, sentiment analysis, sentiment lexicon, stock movement prediction
Procedia PDF Downloads 170
27625 A New Verification Based Congestion Control Scheme in Mobile Networks
Authors: P. K. Guha Thakurta, Shouvik Roy, Bhawana Raj
Abstract:
A congestion control scheme for mobile networks is proposed in this paper through a verification-based model. The model proposed in this work is represented through performance metrics such as buffer occupancy, latency and packet loss rate. Based on pre-defined values, each metric is expressed in terms of three different states. A Markov chain based model for the proposed work is introduced to monitor the occurrence of the corresponding state transitions. Thus, the estimation of the network status is obtained in terms of the performance metrics. In addition, the improved performance of our proposed model over existing works is shown with experimental results.
Keywords: congestion, mobile networks, buffer, delay, call drop, markov chain
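An illustrative Python sketch of the buffer-occupancy metric expressed as three states with a Markov transition matrix; the thresholds and transition probabilities are assumed values, not those of the proposed scheme:

```python
import numpy as np

STATES = ["normal", "loaded", "congested"]

# Assumed transition matrix estimated from observed state changes for one metric;
# rows are the current state, columns the next state.
P_buffer = np.array([[0.80, 0.15, 0.05],
                     [0.20, 0.60, 0.20],
                     [0.05, 0.35, 0.60]])

def classify_buffer(occupancy):
    """Map buffer occupancy (fraction full) into one of three states; thresholds assumed."""
    return 0 if occupancy < 0.5 else 1 if occupancy < 0.8 else 2

def predict_next_state(current_occupancy, transition=P_buffer):
    state = classify_buffer(current_occupancy)
    probs = transition[state]
    return STATES[int(np.argmax(probs))], dict(zip(STATES, probs.round(2)))

print(predict_next_state(0.85))   # currently congested -> most likely next state
```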
Procedia PDF Downloads 441
27624 Implementing Activity-Based Costing in Architectural Aluminum Projects: Case Study and Lessons Learned
Authors: Amer Momani, Tarek Al-Hawari, Abdallah Alakayleh
Abstract:
This study explains how to construct an actionable activity-based costing and management system to accurately track and account for the total costs of architectural aluminum projects. Two ABC models were proposed to accomplish this purpose. First, a learning and development model was introduced to examine how to apply an ABC model in an architectural aluminum firm for the first time and to become familiar with ABC concepts. Second, an actual ABC model was built on the basis of the results of the previous model to accurately trace the actual costs incurred on each project in a year, and to be able to provide a quote with the best trade-off between competitiveness and profitability. The validity of the proposed model was verified on a local architectural aluminum company.
Keywords: activity-based costing, activity-based management, construction, architectural aluminum
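A small Python sketch of the basic ABC allocation step, computing cost-driver rates and tracing them to a project; the activities, driver volumes, and costs are hypothetical, not the case-study firm's figures:

```python
# Illustrative activity-based costing sketch with invented activity pools.
activities = {               # annual activity cost pools (currency units) and driver volumes
    "cutting":      {"cost": 120_000, "driver_volume": 8_000},   # machine hours
    "assembly":     {"cost": 200_000, "driver_volume": 10_000},  # labour hours
    "installation": {"cost": 150_000, "driver_volume": 5_000},   # site hours
}

# Cost-driver rate = activity pool cost / total driver volume.
rates = {name: pool["cost"] / pool["driver_volume"] for name, pool in activities.items()}

def project_cost(driver_usage, direct_material):
    """Total traced cost of one project = direct material + sum(rate * driver usage)."""
    overhead = sum(rates[activity] * qty for activity, qty in driver_usage.items())
    return direct_material + overhead

curtain_wall_job = {"cutting": 220, "assembly": 340, "installation": 180}
print(rates)
print(project_cost(curtain_wall_job, direct_material=48_000))
```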
Procedia PDF Downloads 102
27623 Interpreting Form Based Code in Historic Residential Corridor
Authors: Diljan C. K.
Abstract:
Every location on the planet has a history and culture that give it its own identity and character, making it distinct from others. In today's urbanised world, it is fashionable to remould a place's original character and impression in a contemporary style. The new character and impression of places show a complete detachment from their roots. The heritage and cultural values of the place are replaced by new impressions, and as a result, places eventually lose their identity and character and cannot be sustained. In this situation, form-based coding acts as a tool in the urban design process, helping to arrive at solutions that strongly bind individuals to their neighbourhood and are closely related to culture through the physical spaces they are associated with. Form-based code was created by the pioneers of new urbanism in 1987 in the United States of America. Since then, it has been used in various projects inside and outside the USA at varied scales, from the design of a single building to the design of a whole community. This research makes an effort to interpret form-based code in historic corridors to establish the association of physical form and space with the public realm and to uphold context and culture. Many historic corridors are undergoing a tremendous transformation in their physical form that disregards their culture and context. This will lead to a loss of identity in form and function. Taking the case of Valiyashala in Trivandrum, which is transforming its form and risks losing its identity, form-based code would be a suitable tool to strengthen its historical value. The study concludes by analysing the existing code (KMBR) of Valiyashala and form-based code to find the requirements of a form-based code for Valiyashala.
Keywords: form based code, urban conservation, heritage, historic corridor
Procedia PDF Downloads 109
27622 Correlates of Income Generation of Small-Scale Fish Processors in Abeokuta Metropolis, Ogun State, Nigeria
Authors: Ayodeji Motunrayo Omoare
Abstract:
Economically, fish provides an important source of food and income for both men and women, especially for many households in the developing world, and fishing has an important social and cultural position in riverine communities. However, fish is highly susceptible to deterioration. Consequently, this study was carried out on the correlates of income generation of small-scale women fish processors in Abeokuta metropolis, Ogun State, Nigeria. Eighty small-scale women fish processors were randomly selected from five communities as the sample for this study. Collected data were analyzed using both descriptive and inferential statistics. The results showed that the mean age of the respondents was 31.75 years, with an average household size of 4 people, while 47.5% of the respondents had primary education. Most (86.3%) of the respondents were married and had spent more than 11 years in fish processing. The respondents were predominantly of the Yoruba tribe (91.2%). The majority (71.3%) of the respondents used a traditional kiln for processing their fish, while 23.7% used hot vegetable oil to fry their fish. Also, the results revealed that respondents sourced capital for fish processing activities from personal savings (48.8%), cooperatives (27.5%), friends and family (17.5%) and microfinance banks (6.2%). The respondents generated an average income of ₦7,000.00 from roasted fish, ₦3,500.00 from dried fish, and ₦5,200.00 from fried fish daily. However, inadequate processing equipment (95.0%), non-availability of credit facilities from microfinance banks (85.0%), poor electricity supply (77.5%), inadequate extension service support (70.0%), and fuel scarcity (68.7%) were major constraints to fish processing in the study area. Results of chi-square analysis showed that there was a significant relationship between personal characteristics (χ2 = 36.83, df = 9), processing methods (χ2 = 15.88, df = 3) and income generated at the p < 0.05 level of significance. It can be concluded that a significant relationship existed between processing methods and income generated. The study therefore recommends that modern processing equipment should be made available to the respondents at a subsidized price by agro-allied companies.
Keywords: correlates, income, fish processors, women, small-scale
Procedia PDF Downloads 247
27621 A Review on Light Shafts Rendering for Indoor Scenes
Authors: Hatam H. Ali, Mohd Shahrizal Sunar, Hoshang Kolivand, Mohd Azhar Bin M. Arsad
Abstract:
Rendering light shafts is one of the important topics in computer gaming and interactive applications. The methods and models that are used to generate light shafts play a crucial role in making a scene more realistic in computer graphics. This article discusses the image-based shadows and geometric-based shadows that contribute to generating volumetric shadows and light shafts, depending on ray tracing, radiosity, and ray marching techniques. The main aim of this study is to provide researchers with background on the progress of light scattering methods so as to make it available for them to determine the technique best suited to their goals. It is also hoped that our classification helps researchers find solutions to the shortcomings of each method.
Keywords: shaft of lights, realistic images, image-based, and geometric-based
Procedia PDF Downloads 279
27620 A Resource Based View: Perspective on Acquired Human Resource towards Competitive Advantage
Authors: Monia Hassan Abdulrahman
Abstract:
The resource-based view is built on many theories and diverse perspectives; we extend this view by placing emphasis on human resources and addressing the tools required to sustain competitive advantage. Drawing on several theories and judgments, assumptions were established to determine clearly whether resource possession alone suffices for the sustainability of competitive advantage, or whether necessary accommodations are required for better performance. New practices were indicated in terms of the resources used in firms; these practices were implemented on human resources in particular, and results were developed in compliance with the stated assumptions. Such results drew attention to the significance of practices that enhance the human resources that have a core responsibility for maintaining the resource-based view of an organization, leading the way to gaining competitive advantage.
Keywords: competitive advantage, resource based view, human resources, strategic management
Procedia PDF Downloads 391
27619 Cryptanalysis of ID-Based Deniable Authentication Protocol Based On Diffie-Hellman Problem on Elliptic Curve
Authors: Eun-Jun Yoon
Abstract:
Deniable authentication protocol is a new security authentication mechanism which enables a receiver to identify the true source of a given message, but not to prove the identity of the sender to a third party. In 2013, Kar proposed a secure ID-based deniable authentication protocol whose security is based on the computational infeasibility of solving the Elliptic Curve Diffie-Hellman Problem (ECDHP). Kar claimed that the proposed protocol achieves the properties of deniable authentication, mutual authentication, and message confidentiality. However, this paper points out that, contrary to its claims, Kar's protocol still suffers from sender spoofing attacks and message modification attacks.
Keywords: deniable authentication, elliptic curve cryptography, Diffie-Hellman problem, cryptanalysis
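For context, a short sketch of the elliptic-curve Diffie-Hellman exchange whose hardness (ECDHP) underlies such protocols, using the Python cryptography package (recent versions); this illustrates the primitive only, not Kar's protocol or the attacks described:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

sender_key = ec.generate_private_key(ec.SECP256R1())
receiver_key = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private scalar with the other side's public point;
# recovering the shared point from the public points alone is the ECDH problem.
shared_s = sender_key.exchange(ec.ECDH(), receiver_key.public_key())
shared_r = receiver_key.exchange(ec.ECDH(), sender_key.public_key())
assert shared_s == shared_r

# Derive a symmetric session key from the shared secret (demo parameters only).
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"deniable-auth-demo").derive(shared_s)
print(session_key.hex())
```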
Procedia PDF Downloads 332
27618 Exploratory Characterization of Antibacterial Efficacy of Synthesized Nanoparticles on Staphylococcus Isolates from Hospital Specimens in Saudi Arabia
Authors: Reham K. Sebaih, Afaf I. Shehata, Awatif A. Hindi, Tarek Gheith, Amal A. Hazzani, Anas Al-Orjan
Abstract:
Staphylococci spp. are ubiquitous gram-positive bacteria often associated with infections, especially nosocomial infections, and with antibiotic resistance. The study of pathogenic bacteria and their use as a tool in nanobiology and molecular genetics is among the latest research trends in the modern characterization and definition of different multiresistant bacteria, including Staphylococci. Staphylococci are widespread all over the world and particularly in Saudi Arabia. The present study was conducted to evaluate the effect of five different types of nanoparticles (biosynthesized zinc oxide, and spherical and rod-shaped silver and gold nanoparticles) and their antibacterial impact on Staphylococcus species. Ninety-six isolates of Staphylococcus species (Staphylococcus aureus, Staphylococcus epidermidis, and MRSA) were collected from different sources during the period between March 2011G and June 2011G. All isolates were obtained from inpatient and outpatient departments at the Royal Commission Hospital in Yanbu Industrial, Saudi Arabia. A higher percentage was isolated from males (55%) than from females (45%). Among males, the isolates were Staphylococcus epidermidis (47%), Staphylococcus aureus (28%), and methicillin-resistant Staphylococcus aureus (MRSA) (25%); among females, Staphylococcus aureus was the most frequent (47%), followed by MRSA (30%) and Staphylococcus epidermidis (23%). Staphylococcus aureus from wound swabs accounted for the highest percentage (51.42%), followed by vaginal swabs (25.71%), while Staphylococcus epidermidis was found at higher percentages in blood (37.14%) and wound swabs (34.21%) relative to other sources. The highest percentage of MRSA (80.77%) was isolated from wound swabs, while isolates from nostrils accounted for 19.23%. Staphylococcus species were isolated in the highest percentage from the hospital emergency department, with Staphylococcus aureus (59.37%), MRSA (28.13%) and Staphylococcus epidermidis (12.5%), respectively. To evaluate the antibacterial properties of zinc oxide, silver, and gold nanoparticles as alternatives to conventional antibacterial agents, the Staphylococci isolates from hospital sources were screened. All isolates of Staphylococcus species were sensitive to the gold and silver rod nanoparticles. Zinc oxide nanoparticles gave a sensitivity impact range of 52% and 48%, whereas the gold and silver spherical nanoparticles did not show any effect on the Staphylococci species. Zinc oxide nanoparticles gave a bactericidal impact on 25% and a bacteriostatic impact on 75% of the Staphylococci species. To detect the association of nanoparticles with the Staphylococci isolates, imaging by scanning electron microscope (SEM) of some isolates showing a bacteriostatic response to zinc oxide nanoparticles (Staphylococcus aureus, Staphylococcus epidermidis and MRSA) showed overlapping bacterial cells with reduced numbers and the appearance of appendages with deformities in external shape. Molecular analysis was applied by multiplex polymerase chain reaction (PCR) for the identification of genes within Staphylococcal pathogens. A multiplex PCR method was developed using six primer pairs to detect different genes, using 50 bp and 100 bp DNA ladder markers.
The molecular gene typing ranged from 93 bp to 326 bp for Staphylococcus aureus and MRSA using TSST-1, mecA, femA and eta, while the bands ranged from 546 bp to 682 bp for Staphylococcus epidermidis using icaAB and atlE. Sixteen isolates of Staphylococcus aureus and MRSA were positive for the femA gene at 132 bp, which allowed the use of this gene as an internal positive control; fifteen isolates of Staphylococcus aureus and MRSA were positive for the mecA gene at 163 bp, the gene responsible for methicillin resistance. Two isolates of Staphylococcus aureus and MRSA were positive for the TSST-1 gene at 326 bp, which is responsible for toxic shock syndrome in some Staphylococcus species, while none were positive for the eta gene at 102 bp, which is responsible for exfoliative toxins. Six isolates of Staphylococcus epidermidis were positive for the atlE gene at 682 bp, which is responsible for initial adherence, and three isolates of Staphylococcus epidermidis were positive for the icaAB gene at 546 bp, which mediates the formation of biofilm. In conclusion, this study demonstrates the ability of gene detection to discriminate between infecting Staphylococcus strains; considered as biological tests, these assays may potentiate the clinical criteria used for the diagnosis of septicemia or catheter-related infections.
Keywords: multiplex polymerase chain reaction, toxic shock syndrome, Staphylococcus aureus, nosocomial infections
Procedia PDF Downloads 339
27617 Prediction of Unsteady Heat Transfer over Square Cylinder in the Presence of Nanofluid by Using ANN
Authors: Ajoy Kumar Das, Prasenjit Dey
Abstract:
Heat transfer due to forced convection of a copper-water based nanofluid has been predicted by an Artificial Neural Network (ANN). The present nanofluid is formed by mixing copper nanoparticles in water; the volume fractions considered here are 0% to 15%, and the Reynolds number is kept constant at 100. The back-propagation algorithm is used to train the network. The present ANN is trained with input and output data obtained from numerical simulations performed in the finite volume based Computational Fluid Dynamics (CFD) commercial software Ansys Fluent. The numerical simulation based results are compared with the back-propagation based ANN results. It is found that the forced convection heat transfer of the water based nanofluid can be predicted correctly by the ANN. It is also observed that the back-propagation ANN can predict the heat transfer characteristics of the nanofluid very quickly compared to the standard CFD method.
Keywords: forced convection, square cylinder, nanofluid, neural network
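A hedged sketch of a small neural-network regressor trained on (volume fraction, time) inputs at the fixed Reynolds number, standing in for the CFD-trained ANN; the surrogate data and network sizes are assumptions, not the paper's setup:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in training table: (volume fraction, time) -> average Nusselt number.
# In the paper the targets come from the Fluent simulations; values here are synthetic.
rng = np.random.default_rng(0)
phi = rng.uniform(0.0, 0.15, size=400)            # nanoparticle volume fraction
t = rng.uniform(0.0, 100.0, size=400)             # nondimensional time
nu = 4.0 + 8.0 * phi + 0.5 * np.sin(0.3 * t)      # made-up surrogate response
X, y = np.column_stack([phi, t]), nu

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                                 random_state=0))     # back-propagation-trained MLP
ann.fit(X_tr, y_tr)
print("R^2 on held-out samples:", round(ann.score(X_te, y_te), 3))
```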
Procedia PDF Downloads 321