Search results for: process model
7946 Integrated Models of Reading Comprehension: Understanding to Impact Teaching: The Teacher’s Central Role
Authors: Sally A. Brown
Abstract:
Over the last 30 years, researchers have developed models or frameworks to provide a more structured understanding of the reading comprehension process. Cognitive information processing models and social cognitive theories both provide frameworks to inform reading comprehension instruction. The purpose of this paper is to (a) provide an overview of the historical development of reading comprehension theory, (b) review the literature framed by cognitive information processing, social cognitive, and integrated reading comprehension theories, and (c) demonstrate how these frameworks inform instruction. As integrated models of reading can guide the interpretation of various factors related to student learning, an integrated framework designed by the researcher will be presented. Results indicated that features of cognitive processing and social cognitivism theory, represented in the integrated framework, highlight the importance of the role of the teacher. This model can aid teachers not only in improving reading comprehension instruction but also in identifying areas of challenge for students.
Keywords: Explicit instruction, integrated models of reading comprehension, reading comprehension, teacher’s role.
7945 Factors Affecting Slot Machine Performance in an Electronic Gaming Machine Facility
Authors: Etienne Provencal, David L. St-Pierre
Abstract:
A facility exploiting only electronic gaming machines (EGMs) opened in 2007 in Quebec City, Canada under the name of Salons de Jeux du Québec (SdjQ). This facility is one of the first worldwide to rely on that business model. This paper models the performance of such EGMs. The interest from a managerial point of view is to identify the variables that can be controlled or influenced so that a comprehensive model can help improve the overall performance of the business. The EGM individual performance model contains eight different variables under study (Game Title, Progressive Jackpot, Bonus Round, Minimum Coin-in, Maximum Coin-in, Denomination, Slant Top and Position). Using data from Quebec City’s SdjQ, a linear regression analysis explains 90.80% of the variation in EGM performance. Moreover, results show a behavior slightly different from that of a casino. The addition of GameTitle as a factor to predict EGM performance is one of the main contributions of this paper. The choice of the game (GameTitle) is very important. Games in better positions do not perform significantly better than games located elsewhere on the gaming floor. Progressive jackpots have a positive and significant effect on the individual performance of EGMs. The impact of BonusRound on the dependent variable is significant but negative. The effect of Denomination is significant but weakly negative. As expected, the Language of an EGM does not impact its individual performance. This paper highlights some possible improvements by indicating which features are performing well. Recommendations are given to increase the performance of the EGMs.
Keywords: EGM, linear regression, model prediction, slot operations.
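A regression of the kind described above can be sketched briefly. The data file and column names below are assumptions for illustration only, not the SdjQ dataset or the authors' model specification.
```python
# Minimal sketch (hypothetical file and column names, not the authors' code):
# fitting an OLS model of individual EGM performance on the eight variables
# listed in the abstract, treating the discrete ones as categorical factors.
import pandas as pd
import statsmodels.formula.api as smf

egm = pd.read_csv("egm_performance.csv")  # assumed: one row per machine

model = smf.ols(
    "performance ~ C(game_title) + C(progressive) + C(bonus_round)"
    " + min_coin_in + max_coin_in + C(denomination) + C(slant_top) + C(position)",
    data=egm,
).fit()

print(model.summary())               # coefficient signs and significance
print("R-squared:", model.rsquared)  # the abstract reports about 0.908
```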
7944 Adaptive Algorithm to Predict the QoS of Web Processes and Workflows
Authors: Jorge Cardoso
Abstract:
Workflow Management Systems (WfMS) allow organizations to streamline and automate business processes and reengineer their structure. One important requirement for this type of system is the management and computation of the Quality of Service (QoS) of processes and workflows. Currently, a range of Web process and workflow languages exist. Each language can be characterized by the set of patterns it supports. Developing and implementing a suitable and generic algorithm to compute the QoS of processes that have been designed using different languages is a difficult task. This is because some patterns are specific to particular process languages and new patterns may be introduced in future versions of a language. In this paper, we describe an adaptive algorithm implemented to cope with these two problems. The algorithm is called adaptive since it can be dynamically changed as the patterns of a process language also change.
Keywords: quality of service, web processes, workflows, web services
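The abstract does not give the algorithm's internals; as background, a generic pattern-based QoS reduction for sequential and parallel patterns can be sketched as follows. This is an assumption-laden illustration of QoS aggregation over workflow patterns, not the paper's adaptive algorithm.
```python
# Minimal sketch (assumptions, not the paper's algorithm): aggregating workflow
# QoS by reducing common patterns. Time adds in sequence and takes the maximum
# across parallel branches; cost adds; reliability multiplies.
from dataclasses import dataclass

@dataclass
class QoS:
    time: float         # expected duration
    cost: float         # expected cost
    reliability: float  # probability of successful completion

def sequence(*tasks: QoS) -> QoS:
    r = 1.0
    for t in tasks:
        r *= t.reliability
    return QoS(sum(t.time for t in tasks), sum(t.cost for t in tasks), r)

def parallel(*branches: QoS) -> QoS:
    r = 1.0
    for b in branches:
        r *= b.reliability
    return QoS(max(b.time for b in branches), sum(b.cost for b in branches), r)

# Example: two tasks in sequence, followed by two parallel branches.
print(sequence(QoS(2, 5, 0.99), QoS(3, 4, 0.98),
               parallel(QoS(4, 2, 0.97), QoS(1, 1, 0.99))))
```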
7943 Interaction Effect of DGAT1 and Composite Genotype of Beta-Kappa Casein on Economic Milk Production Traits in Crossbred Holstein
Authors: A. Molee, N. Duanghaklang, P. Mernkrathoke
Abstract:
The objective was to determine the single-gene and interaction effects of the composite genotype of beta-kappa casein and the DGAT1 gene on milk yield (MY) and milk composition, namely the content of milk fat (%FAT), milk protein (%PRO), solids-not-fat (%SNF), and total solids (%TS), in crossbred Holstein cows. Two hundred and thirty-one cows were genotyped with PCR-RFLP for DGAT1, and composite genotype data of beta-kappa casein from previous work were used. Two models, (1) and (2), were used to estimate the single-gene effect and the interaction effect on the traits, respectively. Significant interaction effects were detected for all traits. Most traits showed a consistent pattern of significance when models (1) and (2) were compared, except for the effect of the composite genotype of beta-kappa casein on %FAT and the effect of DGAT1 on MY, for which a significant difference was detected only in model (1). The results suggest that when all traits need to be optimized simultaneously, the interaction effect should be taken into account.
Keywords: Composite genotype of beta-kappa casein, DGAT1 gene, milk composition, milk yield.
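The two-model comparison described above can be illustrated with a short sketch. The data file, column names and statsmodels formulas below are assumptions for illustration only, not the authors' analysis.
```python
# Minimal sketch (hypothetical column names and data file, not the authors' code):
# comparing a single-gene model (1) with an interaction model (2) for milk yield.
import pandas as pd
import statsmodels.formula.api as smf

cows = pd.read_csv("crossbred_holstein.csv")  # assumed genotype and trait records

m1 = smf.ols("MY ~ C(DGAT1) + C(casein_composite)", data=cows).fit()  # model (1): single-gene effects
m2 = smf.ols("MY ~ C(DGAT1) * C(casein_composite)", data=cows).fit()  # model (2): adds the interaction

print(m1.summary())
print(m2.summary())  # significance of the DGAT1 x composite-genotype interaction term
```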
7942 The Research and Application of M/M/1/N Queuing Model with Variable Input Rates, Variable Service Rates and Impatient Customers
Authors: Quanru Pan
Abstract:
How a business should set its service speeds to make the biggest profit is a problem worth studying, and it is discussed in this paper using queuing theory. An M/M/1/N queuing model with variable input rates, variable service rates and impatient customers is established, and the following quantities are derived: the stationary distribution of the model; the relationship between the stationary distribution and the probability that n customers remain in the system when a customer leaves (not including the leaving customer); the busy period of the system; the average operating cycle; the loss probability for customers that do not enter the system when they arrive; the mean number of customers who leave the system out of impatience; the loss probability for customers that cannot join the queue due to the limited capacity of the system; and many other indicators. This paper also shows that the following conclusion is not correct: the more customers the business serves, the more profit it will get. Finally, this paper points out the appropriate service speeds the business should keep to make the biggest profit.
Keywords: Variable input rates, variable service rates, impatient customers, profit maximization.
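As a small illustration of how such a finite-capacity model is solved, the sketch below computes the stationary distribution of a birth-death queue with state-dependent arrival and service rates. The rate functions are illustrative assumptions, not the paper's model.
```python
# Minimal sketch (illustrative rates, not the paper's model): stationary
# distribution of a finite-capacity birth-death queue with state-dependent
# arrival rates lam(n) and service rates mu(n), from the balance equations
# p(n) = p(0) * prod_{k=1..n} lam(k-1)/mu(k), normalized to sum to 1.
def stationary_distribution(lam, mu, N):
    weights = [1.0]
    for n in range(1, N + 1):
        weights.append(weights[-1] * lam(n - 1) / mu(n))
    total = sum(weights)
    return [w / total for w in weights]

# Example: arrivals slow down and service speeds up as the queue fills,
# a crude stand-in for variable rates and impatient customers.
lam = lambda n: 4.0 / (1 + 0.2 * n)
mu = lambda n: 5.0 + 0.5 * n
p = stationary_distribution(lam, mu, N=10)
print("stationary probabilities:", [round(x, 4) for x in p])
print("probability the system is full:", round(p[-1], 4))
```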
7941 Face Detection in Color Images using Color Features of Skin
Authors: Fattah Alizadeh, Saeed Nalousi, Chiman Savari
Abstract:
Because of increasing demands for security in today's society, and because of the growing attention paid to machine vision, biometric research, pattern recognition and data retrieval in color images, face detection has found more and more applications. In this article we present a scientific approach for modeling human skin color and offer an algorithm that detects faces within color images by combining skin features with thresholds determined from the model. The proposed model is based on statistical data in different color spaces. Using the specified color thresholds, the algorithm first divides image pixels into two groups, skin pixels and non-skin pixels, and then decides which areas belong to a face based on geometric features of the face. Two main results emerged from this research: first, the proposed model can easily be applied to different databases and color spaces to establish proper thresholds; second, the algorithm adapts itself to runtime conditions, and its results demonstrate desirable progress in comparison with similar approaches.
Keywords: Face detection, skin color modeling, color, colorful images, face recognition.
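The skin/non-skin split described above can be sketched as follows. The YCbCr bounds below are commonly quoted values from the skin-detection literature, not the thresholds derived in this paper, and the image file name is a placeholder.
```python
# Minimal sketch (commonly quoted YCbCr bounds, not the thresholds derived in the
# paper): splitting an image into skin and non-skin pixels before geometric filtering.
import numpy as np
import cv2

img = cv2.imread("photo.jpg")                      # placeholder input image (BGR)
ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)

lower = np.array([0, 133, 77], dtype=np.uint8)     # (Y, Cr, Cb) lower bounds
upper = np.array([255, 173, 127], dtype=np.uint8)  # (Y, Cr, Cb) upper bounds
skin_mask = cv2.inRange(ycrcb, lower, upper)

# Candidate face regions would then be filtered by geometric features
# (area, aspect ratio) of the connected components of the skin mask.
num, labels, stats, _ = cv2.connectedComponentsWithStats(skin_mask)
print("candidate skin regions:", num - 1)
```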
7940 High Level Synthesis of Kahn Process Networks (KPN) for Streaming Applications
Authors: Attiya Mahmood, Syed Ali Abbas, Shoab A. Khan
Abstract:
Streaming applications usually consist of stages, running in parallel or in series, that incrementally transform a stream of input data. It is a design challenge to break such an application into distinguishable blocks and then to map them onto independent hardware processing elements. This requires a generic controller that automatically maps such a stream of data onto independent processing elements without dependencies or manual intervention. In this paper, a Kahn Process Network (KPN) for such streaming applications is designed and developed and mapped onto an MPSoC. The design includes a generic C-based compiler that takes the mapping specifications as input from the user, automates these design constraints, and automatically generates synthesized, optimized RTL code for the specified application.
7939 Application of Medium High Hydrostatic Pressure in Preserving Textural Quality and Safety of Pineapple Compote
Authors: Nazim Uddin, Yohiko Nakaura, Kazutaka Yamamoto
Abstract:
Compote (fruit in syrup) of pineapple (Ananas comosus L. Merrill) is expected to have high market potential as one of the convenient ready-to-eat (RTE) foods worldwide. High hydrostatic pressure (HHP) in combination with low temperature (LT) was applied to the processing of pineapple compote, as was medium HHP (MHHP) in combination with medium-high temperature (MHT), since both processes can enhance liquid impregnation and inactivate microbes. The MHHP+MHT (55 or 65 °C) process, as well as the HHP+LT process, successfully inactivated the microbes in the compote to a non-detectable level. Although the compotes processed by MHHP+MHT or HHP+LT lost their fresh texture in a similar manner to those processed solely by heat, the results indicated that texture degradation by heat was suppressed under MHHP. The degassing process reduced hardness, while calcium (Ca) contributed to retaining hardness in the MHT and MHHP+MHT processes. Electrical impedance measurements supported the damage due to degassing and heat. The color, Brix, and appearance were not significantly affected by the processing methods. MHHP+MHT and HHP+LT processes may be applicable to produce high-quality, safe RTE pineapple compotes. Further studies on the optimization of packaging and storage conditions will be indispensable for commercialization.
Keywords: Compote of pineapple, ready-to-eat, medium high hydrostatic pressure, postharvest loss, and texture.
7938 An Adaptive Model for Blind Image Restoration using Bayesian Approach
Authors: S.K. Satpathy, S.K. Nayak, K. K. Nagwanshi, S. Panda, C. Ardil
Abstract:
Image restoration involves the elimination of noise. Filtering techniques have been adopted to restore images over the last five decades. In this paper, we consider the problem of restoring an image degraded by a blur function and corrupted by random noise. A method for reducing additive noise in images by explicit analysis of local image statistics is introduced and compared to other noise reduction methods. The proposed method, which makes use of an a priori noise model, has been evaluated on various types of images. Bayesian-based algorithms and image processing techniques are described and substantiated with experiments using MATLAB.
Keywords: Image restoration, probability density function (PDF), neural networks, Bayesian classifier.
7937 Strategic Priority of Green ICT Policy in Korea: Applying Analytic Hierarchy Process
Authors: Yong Ho Shim, Ki Youn Kim, Ji Yeon Cho, Jin Kyung Park, Bong Gyou Lee
Abstract:
This study considers the priorities of primary goals for increasing the policy efficiency of Green ICT. Recently, several studies have been published that address how IT is linked to climate change. However, most of the previous studies are limited to Green ICT industrial status and policy directions. This paper presents Green ICT policy-making processes systematically. As a result of the analysis of Korean Green ICT policy, the following emerged as important goals for Green ICT policy: eco-friendliness, technology evolution, economic efficiency, energy efficiency, and stable supply of energy. This is an initial study analyzing Green ICT policy, and it provides an academic framework that can be used as a guideline for establishing Green ICT policy.
Keywords: AHP (Analytic Hierarchy Process), case study, Green ICT, policy priority.
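The AHP step behind such a priority ranking can be sketched briefly. The pairwise judgment matrix below is an illustrative assumption, not the study's survey data.
```python
# Minimal sketch (illustrative pairwise judgments, not the study's survey data):
# deriving AHP priority weights for five Green ICT goals from a reciprocal
# pairwise comparison matrix via its principal eigenvector.
import numpy as np

goals = ["eco-friendliness", "technology evolution", "economic efficiency",
         "energy efficiency", "stable supply of energy"]

A = np.array([
    [1,   3,   2,   2,   4],
    [1/3, 1,   1/2, 1/2, 2],
    [1/2, 2,   1,   1,   2],
    [1/2, 2,   1,   1,   3],
    [1/4, 1/2, 1/2, 1/3, 1],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                    # normalized priority weights

ci = (eigvals.real[k] - len(A)) / (len(A) - 1)  # consistency index
for g, wi in zip(goals, w):
    print(f"{g}: {wi:.3f}")
print("consistency index:", round(ci, 3))
```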
7936 Time Dependent Biodistribution Modeling of 177Lu-DOTATOC Using Compartmental Analysis
Authors: M. Mousavi-Daramoroudi, H. Yousefnia, F. Abbasi-Davani, S. Zolghadri
Abstract:
In this study, 177Lu-DOTATOC was prepared under optimized conditions (radiochemical purity: > 99%, radionuclidic purity: > 99%). The percentage of injected dose per gram (%ID/g) was calculated for organs up to 168 h post injection. A compartmental model was applied to mathematically describe the behaviour of the drug in tissue at different times. The biodistribution data showed significant excretion of the radioactivity through the kidneys. The adrenal glands and pancreas, as major expression sites of the somatostatin receptor (SSTR), showed significant uptake. A pharmacokinetic model of 177Lu-DOTATOC was obtained by compartmental analysis, which describes the behaviour of the complex.
Keywords: Biodistribution, compartmental modeling, 177Lu, octreotide.
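A compartmental description of this kind can be sketched as a small ODE system. The rate constants below are assumed placeholder values, not the fitted 177Lu-DOTATOC parameters.
```python
# Minimal sketch (illustrative rate constants, not the fitted 177Lu-DOTATOC
# parameters): a two-compartment model, central (blood) and peripheral (tissue),
# with first-order transfer and elimination, integrated over 168 h.
import numpy as np
from scipy.integrate import odeint

k12, k21, k_el = 0.15, 0.05, 0.02     # 1/h, assumed values

def model(y, t):
    c, p = y                          # activity in central and peripheral compartments
    dc = -k12 * c + k21 * p - k_el * c
    dp = k12 * c - k21 * p
    return [dc, dp]

t = np.linspace(0, 168, 337)          # hours post injection
y = odeint(model, [100.0, 0.0], t)    # 100% of injected dose starts in blood

print("central at 168 h: %.2f %%ID" % y[-1, 0])
print("peripheral at 168 h: %.2f %%ID" % y[-1, 1])
```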
7935 Using Speech Emotion Recognition as a Longitudinal Biomarker for Alzheimer’s Disease
Authors: Yishu Gong, Liangliang Yang, Jianyu Zhang, Zhengyu Chen, Sihong He, Xusheng Zhang, Wei Zhang
Abstract:
Alzheimer’s disease (AD) is a progressive neurodegenerative disorder that affects millions of people worldwide and is characterized by cognitive decline and behavioral changes. People living with Alzheimer’s disease often find it hard to complete routine tasks. However, there are limited objective assessments that aim to quantify the difficulty of certain tasks for AD patients compared to non-AD people. In this study, we propose to use speech emotion recognition (SER), especially the frustration level, as a potential biomarker for quantifying the difficulty patients experience when describing a picture. We build an SER model using data from the IEMOCAP dataset and apply the model to the DementiaBank data to detect the AD/non-AD group difference and perform longitudinal analysis to track AD disease progression. Our results show that the frustration level detected by the SER model can possibly be used as a cost-effective tool for objective tracking of AD progression, in addition to the Mini-Mental State Examination (MMSE) score.
Keywords: Alzheimer’s disease, Speech Emotion Recognition, longitudinal biomarker, machine learning.
7934 Performance Analysis of Reconstruction Algorithms in Diffuse Optical Tomography
Authors: K. Uma Maheswari, S. Sathiyamoorthy, G. Lakshmi
Abstract:
Diffuse Optical Tomography (DOT) is a non-invasive imaging modality used in clinical diagnosis for early detection of carcinoma cells in brain tissue. It is a form of optical tomography which produces a reconstructed image of human soft tissue using near-infrared light. It comprises two steps, called the forward model and the inverse model. The forward model describes light propagation in a biological medium. The inverse model uses the scattered light to recover the optical parameters of human tissue. DOT suffers from severe ill-posedness due to its incomplete measurement data, so accurate analysis of this modality is very complicated. To overcome this problem, optical properties of the soft tissue, such as the absorption coefficient, scattering coefficient, and optical flux, are processed using the standard Levenberg-Marquardt regularization technique. Reconstruction algorithms such as the Split Bregman and Gradient Projection for Sparse Reconstruction (GPSR) methods are used to reconstruct the image of human soft tissue for tumour detection. Among these algorithms, the Split Bregman method provides better performance than the GPSR algorithm. Parameters such as the signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), relative error (RE) and CPU time for reconstructing images are analyzed to assess performance.
Keywords: Diffuse optical tomography, ill-posedness, Levenberg-Marquardt method, Split Bregman, gradient projection for sparse reconstruction.
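The evaluation metrics named in this abstract can be computed as in the sketch below. The definitions are common conventions and may differ from the exact formulas used in the paper; the demo image is synthetic.
```python
# Minimal sketch (common metric definitions; conventions vary across papers):
# SNR, CNR and relative error for a reconstructed image against a reference,
# given boolean masks marking the inclusion (tumour) and background regions.
import numpy as np

def snr(recon, reference):
    noise = recon - reference
    return 10 * np.log10(np.sum(reference**2) / np.sum(noise**2))

def cnr(recon, roi_mask, background_mask):
    mu_roi, mu_bg = recon[roi_mask].mean(), recon[background_mask].mean()
    sd_roi, sd_bg = recon[roi_mask].std(), recon[background_mask].std()
    return abs(mu_roi - mu_bg) / np.sqrt(0.5 * (sd_roi**2 + sd_bg**2))

def relative_error(recon, reference):
    return np.linalg.norm(recon - reference) / np.linalg.norm(reference)

if __name__ == "__main__":
    ref = np.zeros((32, 32)); ref[10:20, 10:20] = 1.0        # synthetic phantom
    rec = ref + np.random.default_rng(0).normal(0, 0.05, ref.shape)
    roi = ref > 0.5
    print("SNR:", round(snr(rec, ref), 2), "dB")
    print("CNR:", round(cnr(rec, roi, ~roi), 2))
    print("RE:", round(relative_error(rec, ref), 4))
```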
7933 The Sustainability of Public Debt in Taiwan
Authors: Chiung-Ju Huang
Abstract:
This study examines whether Taiwan’s public debt is sustainable by utilizing an unrestricted two-regime threshold autoregressive (TAR) model with an autoregressive unit root. The empirical results show that Taiwan’s public debt appears as a nonlinear series and is stationary in regime 1 but not in regime 2. This result implies that while Taiwan’s public debt was mostly sustainable over the 1996 to 2013 period examined in the study, it may no longer be sustainable in the most recent two years as the public debt ratio has increased cumulatively to 3.618%.
Keywords: Nonlinearity, public debt, sustainability, threshold autoregressive model.
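A two-regime threshold autoregression of the kind used here can be sketched compactly. The series below is synthetic and the threshold choice is simplified; the paper's data, unit-root testing and threshold search are not reproduced.
```python
# Minimal sketch (synthetic series; not the paper's data or estimation procedure):
# a two-regime TAR(1) fit, regime chosen by whether the lagged debt ratio is
# below or above a threshold, each regime estimated by OLS.
import numpy as np

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(0, 0.3, 200)) + 30      # stand-in for a debt-ratio series

y_lag, y_now = y[:-1], y[1:]
threshold = np.median(y_lag)                     # simplified threshold choice

for name, mask in [("regime 1 (low)", y_lag <= threshold),
                   ("regime 2 (high)", y_lag > threshold)]:
    X = np.column_stack([np.ones(mask.sum()), y_lag[mask]])
    beta, *_ = np.linalg.lstsq(X, y_now[mask], rcond=None)
    print(f"{name}: intercept={beta[0]:.3f}, AR(1) coefficient={beta[1]:.3f}")
```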
7932 Analyzing the Market Growth in API Economy Using Time-Evolving Model
Authors: Hiroki Yoshikai, Shin’ichi Arakawa, Tetsuya Takine, Masayuki Murata
Abstract:
API (Application Programming Interface) economy is expected to create new value by converting corporate services such as information processing and data provision into APIs and using these APIs to connect services. Understanding the dynamics of an API economy market under the strategies of its participants is crucial to fully maximizing the value of the API economy. To capture the behavior of a market in which the number of participants changes over time, we present a time-evolving market model for a platform in which API providers, who provide APIs to service providers, participate in addition to service providers and consumers. We then use the market model to clarify the role API providers play in expanding market participation and forming ecosystems. The results show that the platform with API providers increased the number of market participants by 67% and decreased the cost to develop services by 25% compared to the platform without API providers. Furthermore, during the expansion phase of the market, the profits of participants are found to be mostly the same when 70% of the revenue from consumers is distributed to service providers and API providers. It is also found that, when the market is mature, the profits of the service provider and API provider decrease significantly due to competition between them, while the profit of the platform increases.
Keywords: API Economy, ecosystem, platform, API providers.
7931 Associated Map and Inter-Purchase Time Model for Multiple-Category Products
Authors: Ching-I Chen
Abstract:
The continued rise of e-commerce is the main driver of the rapid growth of global online purchasing. Consumers can buy nearly everything they want on a single occasion through online shopping. Purchase behavior models that focus on a single product category are insufficient to describe online shopping behavior, so the analysis of multi-category purchases is becoming increasingly popular. For example, market basket analysis explores customers' tendency to buy associated product categories together. The information derived from market basket analysis facilitates cross-selling strategies and product recommendation systems.
To detect the association between different product categories, we use market basket analysis with the multidimensional scaling technique to build an associated map which describes how likely multiple product categories are to be bought at the same time. In addition, we build an inter-purchase time model for associated products to describe how likely a product is to be bought after its associated product is bought. We classify the inter-purchase time behaviors of multi-category products into nine types and use a mixture regression model to integrate those behaviors under our assumptions about purchase sequences. Our sample data are from comScore, which provides a panelist-level database that captures detailed browsing and buying behavior of internet users across the United States. We find that the inter-purchase time from books to movies is shorter than the inter-purchase time from movies to books. Based on the model analysis and empirical results, this research finally proposes managerial applications and recommendations.
Keywords: Multiple-category purchase behavior, inter-purchase time, market basket analysis.
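The associated map construction can be sketched as below: co-purchase counts are turned into dissimilarities and embedded with multidimensional scaling. The baskets are toy examples, not the comScore data, and the dissimilarity transform is one simple choice.
```python
# Minimal sketch (toy baskets, not the comScore data): building an "associated
# map" by turning category co-purchase frequencies into dissimilarities and
# embedding them in 2D with multidimensional scaling.
import numpy as np
from sklearn.manifold import MDS

categories = ["books", "movies", "music", "games"]
baskets = [{"books", "movies"}, {"movies", "games"}, {"books", "music"},
           {"books", "movies", "music"}, {"games"}, {"movies", "games"}]

n = len(categories)
co = np.zeros((n, n))
for b in baskets:
    for i, ci in enumerate(categories):
        for j, cj in enumerate(categories):
            if i != j and ci in b and cj in b:
                co[i, j] += 1

dissimilarity = 1.0 / (1.0 + co)          # frequently co-bought pairs end up close
np.fill_diagonal(dissimilarity, 0.0)

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)
for c, (x, yv) in zip(categories, coords):
    print(f"{c}: ({x:.2f}, {yv:.2f})")
```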
7930 Case-Based Reasoning Application to Predict Geological Features at Site C Dam Construction Project
Authors: S. Behnam Malekzadeh, I. Kerr, T. Kaempffer, T. Harper, A. Watson
Abstract:
The Site C Hydroelectric dam is currently being constructed in north-eastern British Columbia on sub-horizontal sedimentary strata that dip approximately 15 meters from one bank of the Peace River to the other. More than 615 pressure sensors (Vibrating Wire Piezometers) have been installed on bedding planes (BPs) since construction began, with over 80 more planned before project completion. These pressure measurements are essential for monitoring the stability of the rock foundation during and after construction and for dam safety purposes. BPs are identified by their clay gouge infilling, which varies in thickness from less than 1 mm to 20 mm and can be challenging to identify as the core drilling process often disturbs or washes away the gouge material. Without the use of depth predictions from nearby boreholes, stratigraphic markers, and downhole geophysical data, it is difficult to confidently identify BP targets for the sensors. In this paper, a Case-Based Reasoning (CBR) method was used to develop an empirical model called Bedding Plane Elevation Prediction (BPEP) to help geologists and geotechnical engineers predict geological features and BPs at new locations in a fast and accurate manner. To develop the CBR model, a database was built from 64 pressure sensors already installed on key bedding planes BP25, BP28, and BP31 on the Right Bank, including BP elevations and coordinates. Thirteen (20%) of the most recent cases were selected to validate and evaluate the accuracy of the developed model, with similarity defined as the distance between previous cases and new cases used to predict the depth of significant BPs. The average difference between actual and predicted BP elevations for the above BPs was ±55 cm; 69% of predicted elevations were within ±79 cm of actual BP elevations, and 100% of predicted elevations for new cases were within a ±99 cm range. Eventually, the actual results will be used to extend the database and improve BPEP so that it performs as a learning machine and predicts more accurate BP elevations for future sensor installations.
Keywords: Case-Based Reasoning, CBR, geological feature, geology, piezometer, pressure sensor, core logging, dam construction.
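The retrieval step of such a CBR predictor can be sketched as follows. The coordinates and elevations are made-up values, not Site C data, and the distance-weighted average is one simple reuse strategy.
```python
# Minimal sketch (made-up coordinates, not Site C data): the retrieval step of a
# case-based reasoning predictor. Similarity is the horizontal distance between
# a new location and stored cases; the BP elevation is predicted from the
# k nearest cases.
import numpy as np

# stored cases: (easting, northing, BP elevation in m)
cases = np.array([
    [100.0, 200.0, 411.2],
    [150.0, 240.0, 410.8],
    [300.0, 180.0, 409.5],
    [320.0, 260.0, 409.9],
])

def predict_elevation(x, y, k=2):
    d = np.hypot(cases[:, 0] - x, cases[:, 1] - y)
    nearest = np.argsort(d)[:k]
    weights = 1.0 / (d[nearest] + 1e-9)          # closer cases count more
    return np.average(cases[nearest, 2], weights=weights)

print("predicted BP elevation:", round(predict_elevation(170.0, 230.0), 2), "m")
```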
7929 Simulation Model for Predicting Dengue Fever Outbreak
Authors: Azmi Ibrahim, Nor Azan Mat Zin, Noraidah Sahari Ashaari
Abstract:
Dengue fever is prevalent in Malaysia, with numerous cases, including deaths, recorded over the years. Public education on the prevention of the disease has been carried out through various means, in addition to legal enforcement to eradicate the breeding grounds of Aedes mosquitoes, the dengue vector. Hence, other means need to be explored, such as predicting the seasonal peak period of dengue outbreaks and identifying related climate factors contributing to the increase in the number of mosquitoes. A simulation model can be employed for this purpose. In this study, we created a system dynamics simulation to predict the spread of dengue outbreaks in Hulu Langat, Selangor, Malaysia. The prototype was developed using STELLA 9.1.2 software. The main data inputs are rainfall, temperature and dengue cases. Data analysis from the graphs showed that dengue cases can be predicted accurately using these two main variables, rainfall and temperature. However, the model will be further tested over a longer time period to ensure its accuracy, reliability and efficiency as a prediction tool for dengue outbreaks.
Keywords: Dengue fever, prediction, system dynamics, simulation.
7928 Simulating Dynamics of Thoracolumbar Spine Derived from LifeMOD under Haptic Forces
Authors: K. T. Huynh, I. Gibson, W. F. Lu, B. N. Jagdish
Abstract:
In this paper, the construction of a detailed spine model is presented using the LifeMOD Biomechanics Modeler. The detailed spine model is obtained by refining the spine segments in the cervical, thoracic and lumbar regions into individual vertebra segments, using bushing elements to represent the intervertebral discs, and building the various ligamentous soft tissues between vertebrae. In the sagittal plane of the spine, a constant force is applied from posterior to anterior during simulation to determine the dynamic characteristics of the spine. The force magnitude is gradually increased in subsequent simulations. Based on these recorded dynamic properties, graphs of the displacement-force relationship are established as polynomial functions using the least-squares method and imported into a haptic-integrated graphic environment. A thoracolumbar spine model with complex vertebral geometry, digitized from a resin spine prototype, is utilized in this environment. By using the haptic technique, surgeons can touch as well as apply forces to the spine model through haptic devices to observe the motion of the spine, which is computed from the displacement-force relationship graphs. This study provides a preliminary picture of our ongoing work towards building and simulating bio-fidelity scoliotic spine models in a haptic-integrated graphic environment whose dynamic properties are obtained from LifeMOD. These models can help surgeons examine the kinematic behavior of scoliotic spines and propose possible surgical plans before spine correction operations.
Keywords: Haptic interface, LifeMOD, spine modeling.
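The least-squares polynomial fit mentioned above can be sketched with NumPy. The force and displacement samples are synthetic placeholders, not LifeMOD output.
```python
# Minimal sketch (synthetic samples, not LifeMOD output): fitting a cubic
# displacement-force relationship by least squares, as used to drive the
# haptic environment described in the abstract.
import numpy as np

force = np.linspace(0, 50, 11)                        # N, applied posterior-anterior
displacement = (0.04 * force + 0.0006 * force**2
                + np.random.default_rng(1).normal(0, 0.05, force.size))

coeffs = np.polyfit(force, displacement, deg=3)       # least-squares polynomial fit
poly = np.poly1d(coeffs)

print("fitted coefficients:", np.round(coeffs, 5))
print("displacement at 35 N:", round(poly(35.0), 3))  # lookup used during haptic rendering
```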
7927 Optimal Control Strategies for Speed Control of Permanent-Magnet Synchronous Motor Drives
Authors: Roozbeh Molavi, Davood A. Khaburi
Abstract:
The permanent magnet synchronous motor (PMSM) is very useful in many applications, and vector control is a popular way of controlling it. In this paper, an optimal vector control for the PMSM is first designed and the results are compared with conventional vector control. Then, the measurements are assumed to be noisy and the linear quadratic Gaussian (LQG) methodology is used to filter the noise. The results of the noisy optimal vector control and the filtered optimal vector control are compared to each other. The nonlinearity of the PMSM and the presence of the inverter in its control circuit make the system nonlinear and time-variant. By deriving an average model, the system becomes nonlinear and time-invariant, and the nonlinear system is then converted to a linear system by linearizing the model around the average values. This model is used to optimize the vector control, and then the two optimal vector controls are compared to each other. Simulation results show that the performance and noise robustness of the control system have been greatly improved.
Keywords: Kalman filter, linear quadratic Gaussian (LQG), linear quadratic regulator (LQR), permanent-magnet synchronous motor (PMSM).
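For a linearized, time-invariant model of this kind, an optimal state-feedback gain can be obtained as in the sketch below. The system and weighting matrices are placeholders, not the PMSM average model.
```python
# Minimal sketch (placeholder matrices, not the linearized PMSM average model):
# computing the LQR state-feedback gain u = -Kx by solving the continuous-time
# algebraic Riccati equation.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])      # assumed linearized plant
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])          # state weighting
R = np.array([[0.1]])             # control effort weighting

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # K = R^{-1} B^T P
print("LQR gain K:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```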
7926 Optimization and Determination of Process Parameters in Thin Film SOI Photo-BJMOSFET
Authors: Hai-Qing Xie, Yun Zeng, Yong-Hong Yan, Guo-Liang Zhang, Tai-Hong Wang
Abstract:
We propose a photo-BJMOSFET (Bipolar Junction Metal-Oxide-Semiconductor Field Effect Transistor) fabricated on an SOI film. An ITO film is adopted as the gate electrode of the device to reduce light absorption. The I-V characteristics of the photo-BJMOSFET obtained in the dark (dark current) and under 570 nm illumination (photocurrent) are studied further to achieve a high photo-to-dark-current contrast ratio. Two variables in the calculation were the channel length, which was set to six different values (L = 2, 4, 6, 8, 10, and 12 μm), and the film thickness, which was set to three different values (dsi = 100, 200 and 300 nm). The results indicate that the greatest photo-to-dark-current contrast ratio is achieved with L = 10 μm and dsi = 200 nm at VGK = 0.6 V.
Keywords: Photo-to-dark-current contrast ratio, Photo-current, Dark-current, Process parameter
7925 A Game-Based Product Modelling Environment for Non-Engineer
Authors: Guolong Zhong, Venkatesh Chennam Vijay, Ilias Oraifige
Abstract:
In the last 20 years, Knowledge Based Engineering (KBE) has shown its advantages in product development in different engineering areas, such as automation, mechanical, civil and aerospace engineering, in terms of digital design automation and cost reduction by automating repetitive design tasks through capturing, integrating, utilising and reusing the existing knowledge required in various aspects of product design. However, in the early design stages, the descriptive information of a product is discrete and unorganized, while knowledge exists in various forms rather than as pure data. Thus, it is crucial to have an integrated product model which can represent the entire product information and its associated knowledge at the beginning of the product design. One of the shortcomings of existing product models is the lack of the required knowledge representation in various aspects of product design and its mapping to an interoperable schema. To overcome the limitations of existing product models and methodologies, two key factors are considered. First, the product model must have well-defined classes that can represent the entire product information and its associated knowledge. Second, the product model needs to be represented in an interoperable schema to ensure a steady data exchange between different product modelling platforms and CAD software. This paper introduces a method to provide a general product model, as a generative representation of a product consisting of geometry information and non-geometry information, through a product modelling framework. The proposed method for capturing knowledge from designers through a knowledge file provides a simple and efficient way of collecting and transferring knowledge. Further, the knowledge schema provides a clear view of, and format for, the data that need to be gathered in order to achieve a unified knowledge exchange between different platforms. This study uses a game-based platform to make the product modelling environment accessible to non-engineers. The paper then tests a use case based on the proposed game-based product modelling environment to validate its effectiveness among non-engineers.
Keywords: Game-based learning, knowledge based engineering, product modelling, design automation.
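One way such a product model, combining geometry information, non-geometry knowledge and an interoperable schema, might be represented is sketched below. The class and field names are illustrative assumptions, not the paper's schema.
```python
# Minimal sketch (illustrative class and field names, not the paper's schema):
# a product model that keeps geometry and non-geometry knowledge together and
# can be serialized to an interoperable format such as JSON.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class Geometry:
    shape: str
    dimensions_mm: dict

@dataclass
class KnowledgeItem:
    rule: str            # e.g. a design rule captured from the designer
    rationale: str

@dataclass
class ProductModel:
    name: str
    geometry: Geometry
    knowledge: list = field(default_factory=list)

    def to_schema(self) -> str:
        return json.dumps(asdict(self), indent=2)   # interoperable exchange format

bracket = ProductModel(
    name="bracket",
    geometry=Geometry(shape="L-profile",
                      dimensions_mm={"length": 120, "width": 40, "thickness": 5}),
    knowledge=[KnowledgeItem(rule="thickness >= 4 mm", rationale="stiffness requirement")],
)
print(bracket.to_schema())
```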
7924 An Exploratory Study in Nursing Education: Factors Influencing Nursing Students’ Acceptance of Mobile Learning
Authors: R. Abdulrahman, A. Eardley, A. Soliman
Abstract:
The proliferation in the development of mobile learning (m-learning) has played a vital role in the rapidly growing electronic learning market. This relatively new technology can help to encourage the development of learning and to aid knowledge transfer in a number of areas by familiarizing students with innovative information and communications technologies (ICT). M-learning plays a substantial role in the deployment of learning methods for nursing students by using the Internet and portable devices to access learning resources ‘anytime and anywhere’. However, acceptance of m-learning by students is critical to the successful use of m-learning systems. Thus, there is a need to study the factors that influence students’ intention to use m-learning. This paper addresses this issue. It outlines the outcomes of a study that evaluates the unified theory of acceptance and use of technology (UTAUT) model as applied to the subject of user acceptance in relation to m-learning activity in nurse education. The model integrates the significant components across eight prominent user acceptance models and therefore introduces a standard measure with core determinants of user behavioural intention. The research model extends the UTAUT in the context of m-learning acceptance by modifying the original structure of UTAUT and adding individual innovativeness (II) and quality of service (QoS). The paper goes on to add the factors of previous experience (of using mobile devices in similar applications) and the nursing students’ readiness (to use the technology) as influences on their behavioural intention to use m-learning. This study uses a technique called ‘convenience sampling’, which involves student volunteers as participants, in order to collect numerical data. A quantitative method of data collection was selected and involves an online survey using a questionnaire form. This form contains 33 questions to measure the six constructs, using a 5-point Likert scale. A total of 42 respondents participated, all from the Nursing Institute at the Armed Forces Hospital in Saudi Arabia. The gathered data were then tested using a research model that employs structural equation modelling (SEM), including confirmatory factor analysis (CFA). The results of the CFA show that the UTAUT model has the ability to predict student behavioural intention and to adapt m-learning activity to specific learning activities. It also demonstrates satisfactory, dependable and valid scales of the model constructs. This suggests further analysis to confirm the model as a valuable instrument for evaluating the user acceptance of m-learning activity.
Keywords: Mobile learning, nursing institute, unified theory of acceptance and use of technology model.
7923 Prediction Model for Leukemia Diseases Based on Data Mining Classification Algorithms with Best Accuracy
Authors: Fahd Sabry Esmail, M. Badr Senousy, Mohamed Ragaie
Abstract:
In recent years, there has been an explosion in the rate of using technologies that help discover diseases. For example, DNA microarrays allow us for the first time to obtain a "global" view of the cell. They have great potential to provide accurate medical diagnoses and to help find the right treatment and cure for many diseases. Various classification algorithms can be applied to such microarray datasets to devise methods that can predict the occurrence of leukemia. In this study, we compared the classification accuracy and response time of eleven decision tree methods and six rule classifier methods using five performance criteria. The experimental results show that Random Tree produces better results and takes the lowest time to build the model among the tree classifiers. Among the classification rule algorithms, the nearest-neighbor-like algorithm (NNge) is the best due to its high accuracy and the lowest model-building time.
Keywords: Data mining, classification techniques, decision tree, classification rule, leukemia diseases, microarray data.
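A comparison of this kind can be sketched as below. It uses scikit-learn classifiers on synthetic data as stand-ins for the WEKA Random Tree and NNge classifiers and the leukemia microarray data used in the study.
```python
# Minimal sketch (scikit-learn stand-ins on synthetic data, not the WEKA
# classifiers or leukemia microarray set used in the study): comparing
# classification accuracy and model-building time.
import time
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=400, n_features=50, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("decision tree", DecisionTreeClassifier(random_state=0)),
                  ("random forest", RandomForestClassifier(random_state=0)),
                  ("nearest neighbours", KNeighborsClassifier())]:
    start = time.perf_counter()
    clf.fit(X_tr, y_tr)
    build_time = time.perf_counter() - start
    print(f"{name}: accuracy={clf.score(X_te, y_te):.3f}, build time={build_time*1000:.1f} ms")
```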
7922 Comparison of Welding Fumes Exposure during Standing and Sitting Welder’s Position
Authors: Azian Hariri, M. Z. M Yusof, A. M. Leman
Abstract:
An experimental study was conducted to assess welders' personal exposure to welding fumes during an aluminum metal inert gas (MIG) welding process. The welding process was carried out by a welding machine attached to a Computer Numerical Control (CNC) workbench. A dummy welder was used to replicate a welder during welding work and was fitted with sampling pumps and filter cassettes for welding fume sampling. Direct-reading instruments to measure air velocity, humidity, temperature and particulate matter with a diameter of 10 µm or less (PM10) were located behind the dummy welder, parallel to the collar level, to ensure that the measured welding fume exposure was not influenced by other factors. Welding fume exposure in the standing and sitting positions, with and without the use of local exhaust ventilation (LEV), was investigated. Welding fume samples were then digested and analyzed using inductively coupled plasma mass spectrometry (ICP-MS) according to the ASTM D7439-08 method. The results of the study showed that welding fume exposure during sitting was lower than during standing. LEV helped reduce aluminum and lead exposure to acceptable levels in the standing position; however, in the sitting position the reduction in exposure was smaller. It can be concluded that the welder's position and the correct positioning of the LEV should be considered for effective exposure reduction.
Keywords: ICP-MS, MIG process, personal sampling, welding fumes exposure.
7921 Sustainable Building Technologies for Post-Disaster Temporary Housing: Integrated Sustainability Assessment and Life Cycle Assessment
Authors: S. M. Amin Hosseini, Oriol Pons, Albert de la Fuente
Abstract:
After natural disasters, displaced people (DP) require large numbers of housing units, which have to be erected quickly due to emergency pressures. These tight timeframes can multiply the environmental impacts of construction. These negative impacts worsen the already high energy consumption and pollution caused by the building sector. Indeed, post-disaster housing, which is often carried out without pre-planning, usually causes high negative environmental impacts, besides other economic and social impacts. Therefore, it is necessary to establish a suitable strategy to deal with this problem, one which also takes into account the instability of its causes, such as the changing ratio between rural and urban populations. To this end, this study presents a model that assists decision-makers in choosing the most suitable building technology for post-disaster housing units. This model focuses on the sustainability of the alternatives and the satisfaction of the stakeholders. Four building technologies have been analyzed to determine the most sustainable technology and to validate the presented model. The DP of the 2003 Bam earthquake had their temporary housing units (THUs) built using these four technologies: autoclaved aerated concrete blocks (AAC), concrete masonry unit (CMU), pressed reeds panel (PR), and 3D sandwich panel (3D). The results of this analysis confirm that PR and CMU obtain the highest sustainability indexes. However, the second-life scenario of the THUs could have a considerable impact on the results.
Keywords: Sustainability, post-disaster temporary housing, integrated value model for sustainability assessment (MIVES), life cycle assessment (LCA).
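An aggregation of the kind used to rank such alternatives can be sketched with a simple weighted value index. The weights and normalized indicator values below are made up for illustration and are not the study's MIVES requirement tree or results.
```python
# Minimal sketch (made-up indicators and weights, not the study's MIVES tree):
# a weighted value-function aggregation giving each building technology a
# sustainability index between 0 and 1.
weights = {"economic": 0.3, "environmental": 0.4, "social": 0.3}

# indicator values already normalized to [0, 1] by value functions (assumed)
alternatives = {
    "AAC": {"economic": 0.6, "environmental": 0.7, "social": 0.5},
    "CMU": {"economic": 0.8, "environmental": 0.6, "social": 0.7},
    "PR":  {"economic": 0.7, "environmental": 0.9, "social": 0.6},
    "3D":  {"economic": 0.5, "environmental": 0.5, "social": 0.6},
}

for name, scores in alternatives.items():
    index = sum(weights[k] * scores[k] for k in weights)
    print(f"{name}: sustainability index = {index:.2f}")
```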
7920 Forecasting the Istanbul Stock Exchange National 100 Index Using an Artificial Neural Network
Authors: Birol Yildiz, Abdullah Yalama, Metin Coskun
Abstract:
Many studies have shown that Artificial Neural Networks (ANNs) are widely used for forecasting financial markets, because many financial and economic variables are nonlinear and an ANN can flexibly model linear or nonlinear relationships among variables. The purpose of this study was to employ an ANN model to predict the direction of the Istanbul Stock Exchange National 100 Index (ISE National-100). As a result of this study, the model forecasts the direction of the ISE National-100 with an accuracy of 74.51%.
Keywords: Artificial neural networks, Istanbul Stock Exchange, nonlinear modeling.
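Direction forecasting with a small feed-forward network can be sketched as below. The return series is synthetic, not the ISE National-100 data, and the lag structure is an assumption.
```python
# Minimal sketch (synthetic returns, not the ISE National-100 data): predicting
# next-day index direction from lagged returns with a small feed-forward ANN.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
returns = rng.normal(0, 0.01, 1200)                       # stand-in daily returns

lags = 5
X = np.column_stack([returns[i:len(returns) - lags + i] for i in range(lags)])
y = (returns[lags:] > 0).astype(int)                      # 1 = index goes up

X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False, test_size=0.25)
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0).fit(X_tr, y_tr)
print("directional accuracy:", round(clf.score(X_te, y_te), 3))
```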
7919 Lattice Boltzmann Simulation of Binary Mixture Diffusion Using Modern Graphics Processors
Authors: Mohammad Amin Safi, Mahmud Ashrafizaadeh, Amir Ali Ashrafizaadeh
Abstract:
A highly optimized implementation of binary mixture diffusion with no initial bulk velocity on graphics processors is presented. The lattice Boltzmann model is employed for simulating the binary diffusion of oxygen and nitrogen into each other with different initial concentration distributions. Simulations have been performed using the latest proposed lattice Boltzmann model that satisfies both the indifferentiability principle and the H-theorem for multi-component gas mixtures. Contemporary numerical optimization techniques, such as memory alignment and increasing the multiprocessor occupancy, are exploited along with some novel optimization strategies to enhance the computational performance on graphics processors using the C for CUDA programming language. A speedup of more than two orders of magnitude over single-core processors is achieved on a variety of Graphics Processing Unit (GPU) devices, ranging from conventional graphics cards to advanced, high-end GPUs, while the numerical results are in excellent agreement with the available analytical and numerical data in the literature.
Keywords: Lattice Boltzmann model, graphics processing unit, binary mixture diffusion, 2D flow simulations, optimized algorithm.
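The core relax-and-stream structure of a lattice Boltzmann diffusion solver is sketched below for a single species in one dimension with NumPy on the CPU; the paper's multi-component 2D model and CUDA optimizations are not reproduced.
```python
# Minimal sketch (single species, D1Q3 lattice, NumPy on the CPU; not the
# paper's multi-component 2D model or its CUDA implementation): lattice
# Boltzmann relaxation and streaming for pure diffusion.
import numpy as np

nx, steps = 200, 500
tau = 0.8                               # relaxation time; D = (tau - 0.5) / 3
w = np.array([2/3, 1/6, 1/6])           # D1Q3 weights for velocities 0, +1, -1

rho = np.ones(nx) * 0.1
rho[nx // 2 - 5: nx // 2 + 5] = 1.0     # initial concentration blob
f = w[:, None] * rho[None, :]           # start from equilibrium

for _ in range(steps):
    rho = f.sum(axis=0)
    feq = w[:, None] * rho[None, :]
    f += (feq - f) / tau                # BGK collision
    f[1] = np.roll(f[1], 1)             # stream right-moving population
    f[2] = np.roll(f[2], -1)            # stream left-moving population

rho = f.sum(axis=0)
print("total mass conserved:", round(rho.sum(), 6))
print("peak concentration after diffusion:", round(rho.max(), 4))
```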
7918 Energy Budget Equation of Superfluid HVBK Model: LES Simulation
Authors: M. Bakhtaoui, L. Merahi
Abstract:
The reliability of the filtered HVBK model is investigated here via large eddy simulations (LES) of freely decaying isotropic superfluid turbulence. For homogeneous turbulence at very high Reynolds numbers, comparison of the terms in the spectral kinetic energy budget equation indicates that, in the energy-containing range, the production and energy transfer effects become significant, except for dissipation. In the inertial range, where the two fluids are perfectly locked, the mutual friction may be neglected with respect to the other terms. The LES results for the other terms of the energy balance are also presented.
Keywords: Superfluid turbulence, HVBK, Energy budget, Large Eddy Simulation.
7917 Identification of an Unstable Nonlinear System: Quadrotor
Authors: Mauricio Peña, Adriana Luna, Carol Rodríguez
Abstract:
In this article, we begin from a multi-parameter unstable nonlinear model of a quadrotor. We design a controller to stabilize the device and guarantee its attitude, starting from a system linearized at the equilibrium point of null Euler angles (hover), which yields a controller with limited capability at small rotation angles of the vehicle in three dimensions. To overcome this obstacle, we propose identifying models at different angles by means of simulations, together with the design of a controller specifically implemented for the identification task, which in future work will allow the development of controllers for fast and agile Euler-angle maneuvers of the quadrotor.
Keywords: Quadrotor, model, control, identification.