Search results for: measure valued process
17000 Modelling and Control of Milk Fermentation Process in Biochemical Reactor
Authors: Jožef Ritonja
Abstract:
The biochemical industry is one of the most important modern industries. Biochemical reactors are crucial devices of the biochemical industry. The essential bioprocess carried out in bioreactors is the fermentation process. A thorough insight into the fermentation process and the knowledge of how to control it are essential for the effective use of bioreactors to produce products of high quality and in sufficient quantity. The development of the control system starts with the determination of a mathematical model that describes the steady-state and dynamic properties of the controlled plant satisfactorily and is suitable for the development of the control system. The paper analyses the fermentation process in bioreactors thoroughly, using existing mathematical models. Most existing mathematical models do not allow the design of a control system for controlling the fermentation process in batch bioreactors. Due to this, a mathematical model was developed and presented that allows the development of a control system for batch bioreactors. Based on the developed mathematical model, a control system was designed to ensure the optimal response of the biochemical quantities in the fermentation process. Due to the time-varying and non-linear nature of the controlled plant, a conventional control system with a proportional-integral-derivative controller with constant parameters does not provide the desired transient response. An improved adaptive control system was proposed to improve the dynamics of the fermentation. The use of adaptive control is suggested because the parameter variations of the fermentation process are very slow. The developed control system was tested in the production of dairy products in a laboratory bioreactor. The carbon dioxide concentration was chosen as the controlled variable, since it correlates well with the other quantities significant for the quality of the fermentation process. The level of the carbon dioxide concentration gives important information about the fermentation process. The obtained results showed that the designed control system provides a minimum error between the reference and actual values of the carbon dioxide concentration during the transient response and in the steady state. The recommended control system makes reference signal tracking much more efficient than the currently used conventional control systems, which are based on linear control theory. The proposed control system represents a very effective solution for the improvement of the milk fermentation process.
Keywords: biochemical reactor, fermentation process, modelling, adaptive control
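To make the adaptation idea concrete, the sketch below simulates a first-order plant with a slowly drifting gain (a stand-in for the time-varying fermentation dynamics, not the authors' model) and adapts a feedforward gain with the classic MIT rule so that the output tracks a reference model. The plant constants, adaptation gain, and drift profile are all assumed for illustration.

```python
# Minimal model-reference adaptive control sketch (MIT rule), illustrative only.
import numpy as np

dt, T = 0.1, 600.0
t = np.arange(0.0, T, dt)
a_m, b_m = 0.5, 0.5            # reference model: dy_m/dt = -a_m*y_m + b_m*r
gamma = 0.05                   # adaptation gain (assumed)
r = np.ones_like(t)            # reference CO2 level (normalized)

y, y_m, theta = 0.0, 0.0, 0.2  # plant output, model output, adaptive gain
for k, tk in enumerate(t):
    b_p = 1.0 + 0.5 * np.sin(2 * np.pi * tk / T)  # slowly varying plant gain
    u = theta * r[k]                              # adaptive control law
    y += dt * (-0.5 * y + b_p * u)                # plant integration step
    y_m += dt * (-a_m * y_m + b_m * r[k])         # reference model step
    e = y - y_m                                   # tracking error
    theta += dt * (-gamma * e * y_m)              # MIT-rule parameter update

print(f"final tracking error: {abs(y - y_m):.4f}")
```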
Procedia PDF Downloads 129
16999 Development and Validation of a Quantitative Measure of Engagement in the Analysing Aspect of Dialogical Inquiry
Authors: Marcus Goh Tian Xi, Alicia Chua Si Wen, Eunice Gan Ghee Wu, Helen Bound, Lee Liang Ying, Albert Lee
Abstract:
The Map of Dialogical Inquiry provides a conceptual look at the underlying nature of future-oriented skills. According to the Map, learning is learner-oriented, with conversational time shifted from teachers to learners, who play a strong role in deciding what and how they learn. For example, in courses operating on the principles of Dialogical Inquiry, learners were able to leave the classroom with a deeper understanding of the topic, broader exposure to differing perspectives, and stronger critical thinking capabilities, compared to traditional approaches to teaching. Despite its contributions to learning, the Map is grounded in a qualitative approach, both in its development and in its application for providing feedback to learners and educators. Studies hinge on open-ended responses by Map users, which can be time-consuming and resource-intensive. The present research is motivated by this gap in practicality and aims to develop and validate a quantitative measure of the Map. In addition, a quantifiable measure may also strengthen applicability by making learning experiences trackable and comparable. The Map outlines eight learning aspects that learners should holistically engage. This research focuses on the Analysing aspect of learning. According to the Map, Analysing has four key components: liking or engaging in logic, using interpretative lenses, seeking patterns, and critiquing and deconstructing. Existing scales of constructs (e.g., critical thinking, rationality) related to these components were identified, from which the current scale could adapt items. Specifically, items were phrased beginning with an “I”, followed by an action phrase, to fulfil the purpose of assessing learners' engagement with Analysing either in general or in classroom contexts. Paralleling standard scale development procedure, the 26-item Analysing scale was administered to 330 participants alongside existing scales with varying levels of association to Analysing, to establish construct validity. Subsequently, the scale was refined and its dimensionality, reliability, and validity were determined. Confirmatory factor analysis (CFA) revealed whether scale items loaded onto the four factors corresponding to the components of Analysing. To refine the scale, items were systematically removed via an iterative procedure, according to their factor loadings and the results of likelihood ratio tests at each step. Eight items were removed this way. The Analysing scale is better conceptualised as unidimensional, rather than comprising the four components identified by the Map, for three reasons: 1) the covariance matrix of the model specified for the CFA was not positive definite, 2) correlations among the four factors were high, and 3) exploratory factor analyses did not yield an easily interpretable factor structure of Analysing. Regarding validity, since the Analysing scale had higher correlations with conceptually similar scales than with conceptually distinct scales, with minor exceptions, construct validity was largely established. Overall, the satisfactory reliability and validity of the scale suggest that the current procedure can result in a valid and easy-to-use measure for each aspect of the Map.
Keywords: analytical thinking, dialogical inquiry, education, lifelong learning, pedagogy, scale development
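As a hedged illustration of the dimensionality question discussed above (not the authors' data or their CFA code), the snippet below fits one-factor and four-factor models to simulated Likert-style responses and compares their average log-likelihoods; when a single latent factor dominates, the extra factors add little.

```python
# Illustrative dimensionality check on simulated responses, not the study's data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n, items = 330, 18
latent = rng.normal(size=(n, 1))                       # one dominant factor
loadings = rng.uniform(0.5, 1.0, size=(1, items))
X = latent @ loadings + rng.normal(scale=0.8, size=(n, items))

for k in (1, 4):
    fa = FactorAnalysis(n_components=k).fit(X)
    print(f"{k}-factor model, average log-likelihood: {fa.score(X):.3f}")
```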
Procedia PDF Downloads 91
16998 Internet-Of-Things and Ergonomics, Increasing Productivity and Reducing Waste: A Case Study
Authors: V. Jaime Contreras, S. Iliana Nunez, S. Mario Sanchez
Abstract:
Inside a manufacturing facility, we can find innumerable automatic and manual operations, all of which are relevant to the production process. Some of these processes add more value to the products than others. Manual operations tend to add value to the product since they can be found in the final assembly area or final operations of the process, areas where a mistake or accident can increase the cost of waste exponentially. To reduce or mitigate these costly mistakes, one approach is to rely on automation to eliminate the operator from the production line, which requires a hefty investment and the development of specialized machinery. In our approach, the center of the solution is the operator, supported through sufficient and adequate instrumentation, real-time reporting and ergonomics. Efficiency and reduced cycle time can be achieved through the integration of Internet-of-Things (IoT) ready technologies into assembly operations to enhance the ergonomics of the workstations. Augmented reality visual aids, RFID-triggered personalized workstation dimensions, and real-time data transfer and reporting can help achieve these goals. In this case study, a standard work cell is used for real-life data acquisition, together with simulation software to extend the data points beyond the test cycle. Three comparison scenarios run in the work cell. Each scenario introduces one dimension of the ergonomics to measure its impact independently. Furthermore, the separate tests determine the limitations of the technology and provide a reference for operating costs and the investment required. With the ability to monitor costs, productivity, cycle time and scrap/waste in real time, the ROI (return on investment) can be determined at the different levels of integration. This case study helps to show that ergonomics in assembly lines can make a significant impact when IoT technologies are introduced. Ergonomics can effectively reduce waste and increase productivity with minimal investment compared with setting up a custom machine.
Keywords: augmented reality visual aids, ergonomics, real-time data acquisition and reporting, RFID triggered workstation dimensions
Procedia PDF Downloads 214
16997 Development and Validation of an Instrument Measuring the Coping Strategies in Situations of Stress
Authors: Lucie Côté, Martin Lauzier, Guy Beauchamp, France Guertin
Abstract:
Stress causes deleterious effects at the physical, psychological and organizational levels, which highlights the need to use effective coping strategies to deal with it. Several coping models exist, but they do not integrate the different strategies in a coherent way, nor do they take into account the new research on emotional coping and acceptance of the stressful situation. To fill these gaps, an integrative model incorporating the main coping strategies was developed. This model arises from a review of the scientific literature on coping and from a qualitative study carried out among workers with low or high levels of stress, as well as from an analysis of clinical cases. The model allows one to understand under what circumstances the strategies are effective or ineffective and to learn how one might use them more wisely. It includes Specific Strategies in controllable situations (Modification of the Situation and Resignation-Disempowerment), Specific Strategies in non-controllable situations (Acceptance and Stubborn Relentlessness), as well as so-called General Strategies (Wellbeing and Avoidance). This study undertakes and presents the process of development and validation of an instrument to measure coping strategies based on this model. An initial pool of items was generated from the conceptual definitions, and three expert judges validated the content. Of these, 18 items were selected for a short-form questionnaire. A sample of 300 students and employees from a Quebec university was used for the validation of the questionnaire. Concerning the reliability of the instrument, the indices observed following the inter-rater agreement (Krippendorff’s alpha) and the calculation of the coefficients for internal consistency (Cronbach's alpha) are satisfactory. To evaluate the construct validity, a confirmatory factor analysis using Mplus supports the existence of a model with six factors. The results of this analysis also suggest that this configuration is superior to other alternative models. The correlations show that the factors are only loosely related to each other. Overall, the analyses carried out suggest that the instrument has good psychometric qualities and demonstrate the relevance of further work to establish predictive validity and reconfirm its structure. This instrument will help researchers and clinicians better understand and assess coping strategies to cope with stress and thus prevent mental health issues.
Keywords: acceptance, coping strategies, stress, validation process
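A minimal sketch of the internal-consistency index mentioned above: Cronbach's alpha computed from a respondents-by-items score matrix. The 18-item responses are simulated stand-ins, not the study's sample.

```python
# Cronbach's alpha from an (n_respondents, n_items) matrix; simulated data.
import numpy as np

def cronbach_alpha(X):
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
common = rng.normal(size=(300, 1))            # shared "coping" factor
scores = np.clip(np.rint(3 + common + rng.normal(scale=0.9, size=(300, 18))), 1, 5)
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```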
Procedia PDF Downloads 339
16996 Percentage Change in the Selected Skinfold Measurements of Male Students of University of Delhi Due to Progressive and Constant Load of Physical Training
Authors: Seema Kaushik
Abstract:
Skinfold measurements provide considerably meaningful and consistent information about subcutaneous fat and its distribution. Physical activities in the form of conditioning and/or training lead to various structural, functional and mechanical changes, and numerous training programmes exist for the improvement of physical fitness; however, most studies are conducted on foreign soil with foreign populations as samples, which may or may not be applicable to Indian conditions. Moreover, there is not even a single training/conditioning programme that caters to the needs of male students of the University of Delhi with regard to various skinfold thickness measurements. Hence, the present study aimed at studying the effect of progressive and constant load training on selected skinfold measurements of male students of the University of Delhi in the form of percentage change. The sample for the study comprised 90 males in three groups of 30 each (mean age = 20.04±0.49 years). The variables included triceps, sub-scapular, supra-iliac and calf skinfolds. The experimental design adopted for the study was a multi-group repeated-measures design. Three different groups were measured four times repeatedly at intervals of 6 weeks, on completion of each of the three meso-cycles. Standard landmarks and protocols were followed to measure the selected variables. Mean, standard deviation and percentage were computed to analyze the data statistically. The study concluded that both the progressive and constant load of physical training bring changes in the skinfold thickness measurements of male students of the University of Delhi.
Keywords: constant load, progressive load, physical training, skinfold measurements
Procedia PDF Downloads 322
16995 3D Liver Segmentation from CT Images Using a Level Set Method Based on a Shape and Intensity Distribution Prior
Authors: Nuseiba M. Altarawneh, Suhuai Luo, Brian Regan, Guijin Tang
Abstract:
Liver segmentation from medical images poses more challenges than analogous segmentations of other organs. This contribution introduces a liver segmentation method from a series of computed tomography images. Overall, we present a novel method for segmenting the liver by coupling density matching with shape priors. Density matching is a tracking method that operates by maximizing the Bhattacharyya similarity measure between the photometric distribution of an estimated image region and a model photometric distribution. Density matching controls the direction of the evolution process and slows down the evolving contour in regions with weak edges. The shape prior improves the robustness of density matching and discourages the evolving contour from exceeding the liver’s boundaries at regions with weak boundaries. The model is implemented using a modified distance regularized level set (DRLS) model. The experimental results show that the method achieves a satisfactory result. By comparing it with the original DRLS model, it is evident that the proposed model is more effective in addressing the over-segmentation problem. Finally, we gauge the performance of our model using metrics comprising accuracy, sensitivity and specificity.
Keywords: Bhattacharyya distance, distance regularized level set (DRLS) model, liver segmentation, level set method
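For readers unfamiliar with the similarity measure that steers the contour, the sketch below computes the Bhattacharyya coefficient between two intensity histograms standing in for the region and model photometric distributions; the Gaussian intensities and binning are assumptions made purely for illustration.

```python
# Bhattacharyya similarity between two intensity histograms (illustrative).
import numpy as np

def bhattacharyya_coefficient(p, q):
    p = p / p.sum()                   # normalize to probability distributions
    q = q / q.sum()
    return np.sum(np.sqrt(p * q))     # 1.0 means identical distributions

rng = np.random.default_rng(42)
region = rng.normal(100, 15, 10_000)  # intensities inside the evolving contour
model = rng.normal(105, 15, 10_000)   # model (prior) liver intensities
bins = np.linspace(0, 255, 65)
p, _ = np.histogram(region, bins=bins)
q, _ = np.histogram(model, bins=bins)
bc = bhattacharyya_coefficient(p + 1e-9, q + 1e-9)
print(f"coefficient: {bc:.3f}, Bhattacharyya distance: {-np.log(bc):.3f}")
```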
Procedia PDF Downloads 313
16994 Music Genre Classification Based on Non-Negative Matrix Factorization Features
Authors: Soyon Kim, Edward Kim
Abstract:
In order to retrieve information from the massive stream of songs in the music industry, music search by title, lyrics, artist, mood, and genre has become more important. Despite the subjectivity and controversy over the definition of music genres across different nations and cultures, automatic genre classification systems that facilitate the process of music categorization have been developed. Manual genre selection by music producers is provided as statistical data for designing automatic genre classification systems. In this paper, an automatic music genre classification system utilizing non-negative matrix factorization (NMF) is proposed. Short-term characteristics of the music signal can be captured based on timbre features such as the mel-frequency cepstral coefficient (MFCC), decorrelated filter bank (DFB), octave-based spectral contrast (OSC), and octave band sum (OBS). Long-term time-varying characteristics of the music signal can be summarized with (1) statistical features such as the mean, variance, minimum, and maximum of the timbre features and (2) modulation spectrum features such as the spectral flatness measure, spectral crest measure, spectral peak, spectral valley, and spectral contrast of the timbre features. Not only these conventional basic long-term feature vectors, but also NMF-based feature vectors are proposed to be used together for genre classification. In the training stage, NMF basis vectors were extracted for each genre class. The NMF features were calculated in the log spectral magnitude domain (NMF-LSM) as well as in the basic feature vector domain (NMF-BFV). For NMF-LSM, the entire full-band spectrum was used. However, for NMF-BFV, only the low-band spectrum was used, since the high-frequency modulation spectrum of the basic feature vectors did not contain important information for genre classification. In the test stage, using the set of pre-trained NMF basis vectors, the genre classification system extracted the NMF weighting values of each genre as the NMF feature vectors. A support vector machine (SVM) was used as the classifier. The GTZAN multi-genre music database was used for training and testing. It is composed of 10 genres and 100 songs for each genre. To increase the reliability of the experiments, 10-fold cross validation was used. For a given input song, an extracted NMF-LSM feature vector was composed of 10 weighting values that corresponded to the classification probabilities for the 10 genres. An NMF-BFV feature vector also had a dimensionality of 10. Combined with the basic long-term features such as the statistical features and modulation spectrum features, the NMF features provided increased accuracy with a slight increase in feature dimensionality. The conventional basic features by themselves yielded 84.0% accuracy, but the basic features with NMF-LSM and NMF-BFV provided 85.1% and 84.2% accuracy, respectively. The basic features required a dimensionality of 460, but NMF-LSM and NMF-BFV required dimensionalities of 10 and 10, respectively. Combining the basic features, NMF-LSM and NMF-BFV together with the SVM with a radial basis function (RBF) kernel produced a significantly higher classification accuracy of 88.3% with a feature dimensionality of 480.
Keywords: mel-frequency cepstral coefficient (MFCC), music genre classification, non-negative matrix factorization (NMF), support vector machine (SVM)
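A hedged sketch of the NMF-feature pipeline described above: per-genre NMF bases are learned in a training stage, each song is then described by its activation (weighting) against every genre basis, and an RBF-kernel SVM is cross-validated on the resulting 10-dimensional features. The random non-negative "spectra" below merely stand in for the GTZAN features, and a single basis vector per genre is an assumed simplification.

```python
# Toy NMF-feature pipeline for genre classification; simulated stand-in data.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_genres, songs_per_genre, dim = 10, 100, 60
X = rng.random((n_genres * songs_per_genre, dim))   # stand-in non-negative features
y = np.repeat(np.arange(n_genres), songs_per_genre)

# training stage: one NMF basis per genre (the paper uses sets of basis vectors)
bases = [NMF(n_components=1, init="random", max_iter=500, random_state=0).fit(X[y == g])
         for g in range(n_genres)]

# test stage: each song's NMF feature = its weighting against every genre basis
nmf_feats = np.column_stack([b.transform(X).ravel() for b in bases])

scores = cross_val_score(SVC(kernel="rbf"), nmf_feats, y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.3f}")
```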
Procedia PDF Downloads 303
16993 Option Pricing Theory Applied to the Service Sector
Authors: Luke Miller
Abstract:
This paper develops an option pricing methodology to value strategic pricing strategies in the services sector. More specifically, this study provides a unifying taxonomy of current service sector pricing practices, frames these pricing decisions as strategic real options, demonstrates accepted option valuation techniques to assess service sector pricing decisions, and suggests future research areas where pricing decisions and real options overlap. Enhancing revenue in the service sector requires proactive decision making in a world of uncertainty. In an effort to strategically price service products, revenue enhancement necessitates a careful study of the service costs, customer base, competition, legalities, and shared economies with the market. Pricing decisions involve the quality of inputs, manpower, and best practices to maintain superior service. These decisions further hinge on identifying relevant pricing strategies and understanding how these strategies impact a firm’s value. A relatively new area of research applies option pricing theory to investments in real assets and is commonly known as real options. The real options approach is based on the premise that many corporate decisions to invest or divest in assets are simply an option wherein the firm has the right to make an investment without any obligation to act. The decision maker, therefore, has more flexibility, and the value of this operating flexibility should be taken into consideration. The real options framework has already been applied to numerous areas including manufacturing, inventory, natural resources, research and development, strategic decisions, technology, and stock valuation. Additionally, numerous surveys have identified a growing need for the real options decision framework within all areas of corporate decision-making. Despite the wide applicability of real options, no study has been carried out linking service sector pricing decisions and real options. This is surprising given that the service sector comprises 80% of US employment and Gross Domestic Product (GDP). Identifying real options as a practical tool to value different service sector pricing strategies is believed to have a significant impact on firm decisions. This paper identifies and discusses four distinct pricing strategies available to the service sector from an options perspective: (1) cost-based profit margin, (2) increased customer base, (3) platform pricing, and (4) buffet pricing. Within each strategy lie several pricing tactics available to the service firm. These tactics can be viewed as options the decision maker has to best manage a strategic position in the market. To demonstrate the effectiveness of including flexibility in the pricing decision, a series of pricing strategies were developed and valued using a real options binomial lattice structure. The option pricing approach discussed in this study allows service firms to directly incorporate market-driven perspectives into the decision process, thus synchronizing service operations with organizational economic goals.
Keywords: option pricing theory, real options, service sector, valuation
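To show what "valued using a real options binomial lattice structure" can look like in practice, here is a compact Cox-Ross-Rubinstein lattice valuing the flexibility to adopt a pricing strategy; the project value, strike, volatility, and rate are invented placeholders rather than figures from the paper, and the payoff is European-style for simplicity.

```python
# CRR binomial lattice for a simple (European-style) real option; toy numbers.
import math

def real_option_binomial(V0, K, r, sigma, T, steps):
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))      # up factor
    d = 1 / u                                # down factor
    p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = math.exp(-r * dt)
    # payoffs of exercising the strategy at the terminal nodes
    values = [max(V0 * u**j * d**(steps - j) - K, 0.0) for j in range(steps + 1)]
    for _ in range(steps):                   # backward induction
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

# e.g., the option to launch platform pricing: project value 100, launch cost 95
print(f"option value: {real_option_binomial(100, 95, 0.05, 0.30, 2.0, 200):.2f}")
```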
Procedia PDF Downloads 355
16992 Sustainability Rating System for Infrastructure Projects in UAE
Authors: Amrutha Venugopal, Rabee Rustum
Abstract:
In spite of huge investments and the vital role infrastructure plays in the economy of the UAE, the country has not yet developed an assessment scheme to measure the sustainability of infrastructure projects/development. The aim of this study was to develop a sustainability rating system for infrastructure projects in the UAE using weighted indicator scoring. The identification of the list of 66 indicators was done by content analysis. The sources for the content analysis were government guidelines, research literature and sustainability rating systems for infrastructure projects, namely BCA Green Mark for Infrastructure (Singapore), ISCA (Australia) and Envision (USA). These indicators were shortlisted based on their relevance in the UAE. A mixture of qualitative and quantitative research methods was utilized to find the weightings to be applied to the indicators and to find suggestive measures to improve infrastructure sustainability in the region. Interviews and surveys were conducted with a good mix of experts from the industry. The data collected from the interviews were collated to provide suggestive measures for improving infrastructure sustainability. The collected survey data were analyzed using statistical analysis techniques to find the indicator weightings. The indicator list was shortened by 75% to minimize the effort and investment in the process. The weighting of the deleted indicators was distributed among the critical clusters identified by Pareto analysis. Finally, a simple Microsoft Excel tool was developed as the rating tool, using the calculated weightings for the indicators.
Keywords: infrastructure, rating system, suggestive measures, sustainability, UAE
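At its core, weighted indicator scoring is a weighted sum over indicator scores. The toy example below illustrates the mechanics with invented cluster weights and assessor scores; it does not reproduce the 66 indicators or the survey-derived weightings of this study.

```python
# Weighted indicator scoring with invented placeholder weights and scores.
weights = {"energy": 0.25, "water": 0.20, "materials": 0.20,
           "ecology": 0.15, "community": 0.20}          # must sum to 1
scores = {"energy": 4, "water": 3, "materials": 5,      # 1-5 scale from assessors
          "ecology": 2, "community": 4}

rating = sum(weights[k] * scores[k] for k in weights)
print(f"weighted sustainability score: {rating:.2f} / 5")
```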
Procedia PDF Downloads 305
16991 Changes in Textural Properties of Zucchini Slices with Deep-Fat-Frying
Authors: E. Karacabey, Ş. G. Özçelik, M. S. Turan, C. Baltacıoğlu, E. Küçüköner
Abstract:
Changes in the textural properties of zucchini slices under the effects of frying conditions were investigated. Frying time, frying temperature and slice thickness were the process variables of interest. Slice thickness was studied at three levels (2, 3, and 4 mm). The frying process was performed at two temperature levels (160 and 180 °C), each for six different process time periods (1, 2, 3, 5, 8 and 10 min). Sunflower oil was used as the frying oil. Before frying, zucchini slices were thermally processed in boiling water for 90 seconds to inactivate at least 80% of the plant’s enzymes. After this thermal process, the zucchini slices were fried in an industrial fryer at the specified temperature and time pairs. Fried slices were subjected to textural profile analysis (TPA) to determine their textural properties. To this extent, the hardness, elasticity, cohesion, chewiness and firmness values of the slices were determined. Statistical analysis indicated significant variations in the studied textural properties with process conditions (p < 0.05). Hardness and firmness were determined for fresh and thermally processed zucchini slices to compare them with each other. Differences in the hardness and firmness of fresh, thermally processed and fried slices were found to be significant (p < 0.05). This project (113R015) has been supported by TUBITAK.
Keywords: sunflower oil, hardness, firmness, slice thickness, frying temperature, frying time
Procedia PDF Downloads 444
16990 A Software Product Engineering Process for Commercial Success in Start-Up and Cases
Authors: Javed Ahsan
Abstract:
Software engineers strive for technical sophistication with a dream of finding commercial success in their start-up business. But they may find their highly technically sophisticated software products failing in industry, in competition with less sophisticated products. This is because of not maintaining a clear focus on complementing and leading commercial success through technical sophistication. This can be achieved through the software engineering specific product development process suggested in this paper. This process is about evolving a software product through specific phases and iterations until commercial triumph falls at the software engineer’s feet.
Keywords: software, product, engineering, commercialization, start-up, competitiveness, industry
Procedia PDF Downloads 356
16989 Optimal Design for SARMA(P,Q)L Process of EWMA Control Chart
Authors: Yupaporn Areepong
Abstract:
The main goal of this paper is to study Statistical Process Control (SPC) with an Exponentially Weighted Moving Average (EWMA) control chart when observations are serially correlated. The characteristic of a control chart is the Average Run Length (ARL), which is the average number of samples taken before an action signal is given. Ideally, the ARL of an in-control process should be sufficiently large; this is called ARL0. Conversely, the ARL should be small when the process is out of control; this is called the Average Delay Time (ARL1), the mean time to a true alarm. We find explicit formulas of the ARL for the EWMA control chart for Seasonal Autoregressive and Moving Average (SARMA) processes with Exponential white noise. The results for the ARL obtained from the explicit formula and from the integral equation are in good agreement. In particular, these formulas for evaluating ARL0 and ARL1 make it possible to obtain a set of optimal parameters, which depend on the smoothing parameter (λ) and the width of the control limit (H), for designing an EWMA chart with minimum ARL1.
Keywords: average run length, optimal parameters, exponentially weighted moving average (EWMA), control chart
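For intuition, the sketch below implements the EWMA statistic z_i = λx_i + (1-λ)z_{i-1} with time-varying control limits of width H and reports the first alarm after a simulated mean shift. The i.i.d. normal noise here is a simplification: the paper's formulas target serially correlated SARMA observations with exponential white noise.

```python
# EWMA control chart sketch; assumed lambda, H, and i.i.d. normal data.
import numpy as np

def ewma_chart(x, lam=0.1, H=2.7, mu0=0.0, sigma=1.0):
    """Return the EWMA statistic and the first out-of-control index (or None)."""
    z = np.empty(len(x), dtype=float)
    z_prev = mu0
    for i, xi in enumerate(x):
        z_prev = lam * xi + (1 - lam) * z_prev
        z[i] = z_prev
    i = np.arange(1, len(x) + 1)
    width = H * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
    out = np.flatnonzero(np.abs(z - mu0) > width)
    return z, (out[0] if out.size else None)

rng = np.random.default_rng(7)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(1.0, 1, 100)])  # shift at t=100
_, signal = ewma_chart(x)
print(f"first alarm at sample: {signal}")  # delay after the shift relates to ARL1
```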
Procedia PDF Downloads 560
16988 Bridging the Gap between M and E, and KM: Towards the Integration of Evidence-Based Information and Policy Decision-Making
Authors: Xueqing Ivy Chen, Christo De Coning
Abstract:
It is clear from practice that a gap exists between Result-Based Monitoring and Evaluation (RBME) as a discipline and Knowledge Management (KM). Whereas various government departments have institutionalised these functions, KM and M&E have functioned in isolation from each other in a practical sense in the public sector. It is therefore necessary to explore the relationship between KM and M&E and the necessity for integration, so that a convergence of these disciplines can be established. An integration of KM and M&E will lead to the integration and improvement of evidence-based information and policy decision-making. M&E and KM process models are available, but the complementarity between specific process steps of these process models is not exploited. A need exists to clarify the relationships between these functions in order to ensure evidence-based information and policy decision-making. This paper departs from the well-known policy process models, such as the generic model, and considers recent work on the interface between policy, M&E and KM.
Keywords: result-based monitoring and evaluation, RBME, knowledge management, KM, evidence-based decision making, public policy, information systems, institutional arrangement
Procedia PDF Downloads 152
16987 Semantic Platform for Adaptive and Collaborative e-Learning
Authors: Massra M. Sabeima, Myriam lamolle, Mohamedade Farouk Nanne
Abstract:
Adapting the learning resources of an e-learning system to the characteristics of learners is an important aspect to consider when designing an adaptive e-learning system. However, this adaptation is not a simple process; it requires the extraction, analysis, and modeling of user information. This implies a good representation of the user's profile, which is the backbone of the adaptation process. Moreover, during the e-learning process, collaboration with similar users (same geographic province or knowledge context) is important. Productive collaboration motivates users to continue rather than abandon the course and increases the assimilation of learning objects. The contribution of this work is the following: we propose an adaptive e-learning semantic platform to recommend learning resources to learners, using an ontology to model the user profile and the course content; furthermore, we implement a multi-agent system able to progressively generate the learning graph (taking into account the user's progress and the changes that occur) for each user during the learning process, and to synchronize the users who collaborate on a learning object.
Keywords: adaptive learning, collaboration, multi-agent, ontology
Procedia PDF Downloads 176
16986 Synthesis of Methanol through Photocatalytic Conversion of CO₂: A Green Chemistry Approach
Authors: Sankha Chakrabortty, Biswajit Ruj, Parimal Pal
Abstract:
Methanol is one of the most important chemical products and intermediates. It can be used as a solvent, intermediate or raw material for a number of higher-valued products, fuels or additives. Over the last decade, the total global demand for methanol has increased drastically, which forces scientists to produce large amounts of methanol from renewable sources to meet the global demand in a sustainable way. Different types of non-renewable raw materials have been used for the synthesis of methanol on a large scale, which makes the process unsustainable. In these circumstances, the photocatalytic conversion of CO₂ into methanol under solar/UV excitation becomes a viable, sustainable production approach, which not only addresses the environmental crisis by recycling CO₂ to fuels but also reduces the amount of CO₂ in the atmosphere. The development of such a sustainable production approach for CO₂ conversion into methanol still remains a major challenge in current research, compared with conventional, energy-expensive processes. Against this backdrop, the development of environmentally friendly materials, like photocatalysts, has gained great importance for methanol synthesis. Scientists in this field are always concerned with finding an improved photocatalyst to enhance photocatalytic performance. Graphene-based hybrid and composite materials with improved properties could be better nanomaterials for the selective conversion of CO₂ to methanol under visible light (solar energy) or UV light. The present work relates to the synthesis of an improved heterogeneous graphene-based photocatalyst with improved catalytic activity and surface area. Graphene with an enhanced surface area is used as a coupling material for copper-loaded titanium oxide to improve the electron capture and transport properties, which substantially increases the photoinduced charge transfer and extends the lifetime of photogenerated charge carriers. A fast reduction method through H₂ purging was adopted to synthesize the improved graphene, whereas an ultrasonication-based sol-gel method was applied for the preparation of the graphene-coupled copper-loaded titanium oxide with enhanced properties. The prepared photocatalysts were exhaustively characterized using different characterization techniques. The effects of catalyst dose, CO₂ flow rate, reaction temperature and stirring time on the efficacy of the system, in terms of methanol yield and productivity, were studied. The study showed that the newly synthesized photocatalyst with an enhanced surface area results in a sustained productivity and yield of methanol of 0.14 g/Lh and 0.04 g/gcat, respectively, after 3 h of illumination under UV (250 W) at an optimum catalyst dosage of 10 g/L with a 1:2:3 (graphene:TiO₂:Cu) weight ratio.
Keywords: renewable energy, CO₂ capture, photocatalytic conversion, methanol
Procedia PDF Downloads 108
16985 Revolutionizing Gaming Setup Design: Utilizing Generative and Iterative Methods to Prop and Environment Design, Transforming the Landscape of Game Development Through Automation and Innovation
Authors: Rashmi Malik, Videep Mishra
Abstract:
The practice of generative design has become a transformative approach for efficiently generating multiple iterations of any design project. The conventional way of modeling game elements is very time-consuming and requires skilled artists. A 3D modeling tool like 3ds Max, Blender, etc., is traditionally used to create the game library, which takes a stipulated amount of time to model. This study is focused on using generative design tools to increase efficiency in game development at the stage of prop and environment generation. This involves procedural level generation and customized, regulated or randomized asset generation. The paper presents a system design approach using generative tools like Grasshopper (visual scripting) and other scripting tools to automate the process of game library modeling. The script enables the generation of multiple products from a single script, thus creating a system that lets designers/artists customize props and environments. The main goal is to measure the efficacy of the automated system in creating a wide variety of game elements, further reducing the need for manual content creation and integrating it into the workflows of AAA and indie games.
Keywords: iterative game design, generative design, gaming asset automation, generative game design
Procedia PDF Downloads 70
16984 Developing an Instrument to Measure Teachers’ Self-Efficacy of Teaching Innovation Skills
Authors: Huda S. Al-Azmi
Abstract:
There is a growing consensus that the adoption of teachers’ self-efficacy measurement tools helps to assess teachers’ abilities in specific areas in order to improve their skills. As a result, different instruments to assess teachers’ ability were developed by academics and practitioners. However, many of these instruments focused either on general teaching skills or were very specific to one subject. As such, these instruments do not offer a tool to measure the ability of teachers in teaching 21st century skills such as innovation skills. Teaching innovation skills helps to prepare students for lives and careers in the 21st century. The purpose of this study is to develop an instrument measuring teachers’ self-efficacy of teaching innovation skills related to the classroom context and evaluating teachers’ beliefs regarding their ability in teaching innovation skills. To reach this goal, the 16-item instrument measures four dimensions of innovation skills: creativity, critical thinking, communication, and collaboration. 211 secondary-school teachers filled out the survey for a quantitative analysis of the quality of the instrument. The instrument’s reliability and item analysis were measured by using jMetrik. The results showed that the mean of self-efficacy ranged from 3 to 3.6, without extremely high or low self-efficacy scores. The discrimination analysis revealed that one item recorded a negative correlation with the total, and three items recorded low correlations with the total. The reliabilities of the items ranged from 0.64 to 0.69, and the instrument needed a couple of revisions before practical use. The study concluded with the need to discard one item and revise five items to increase the quality of the instrument for future work.
Keywords: critical thinking, collaboration, innovation skills, self-efficacy
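As a sketch of the discrimination analysis reported above (jMetrik output is not reproduced; the 16-item responses below are simulated), the corrected item-total correlation flags items that correlate weakly or negatively with the rest of the scale.

```python
# Corrected item-total correlations on simulated 16-item survey responses.
import numpy as np

rng = np.random.default_rng(3)
ability = rng.normal(size=(211, 1))                 # latent self-efficacy
X = np.clip(np.rint(3 + ability + rng.normal(scale=1.0, size=(211, 16))), 1, 5)

total = X.sum(axis=1)
for j in range(X.shape[1]):
    rest = total - X[:, j]                          # exclude the item itself
    r = np.corrcoef(X[:, j], rest)[0, 1]
    flag = "  <- candidate for revision/discard" if r < 0.2 else ""
    print(f"item {j + 1:2d}: item-total r = {r:5.2f}{flag}")
```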
Procedia PDF Downloads 214
16983 Incorporating Priority Round-Robin Scheduler to Sustain Indefinite Blocking Issue and Prioritized Processes in Operating System
Authors: Heng Chia Ying, Charmaine Tan Chai Nie, Burra Venkata Durga Kumar
Abstract:
Process scheduling is the method of process management that determines which process the CPU will proceed with for the next task and how long it takes. Some issues have been found in process management, particularly for Priority Scheduling (PS) and Round Robin Scheduling (RR). The recommendation made for IPRRS is to combine the strengths of both into one algorithm, with each drawing on the other to compensate for its weaknesses. As a significant improvement on combined scheduling techniques, the Incorporating Priority Round-Robin Scheduler (IPRRS) provides an algorithm for both high- and low-priority tasks that mitigates the indefinite blocking issue faced in the priority scheduling algorithm and minimizes the average turnaround time (ATT) and average waiting time (AWT) of the RR scheduling algorithm. This paper will delve into the simple rules introduced by IPRRS and the enhancements that both PS and RR bring to the execution of processes in the operating system. Furthermore, it incorporates the best aspects of each algorithm to build the optimum algorithm for a certain case in terms of prioritized processes, ATT, and AWT.
Keywords: round robin scheduling, priority scheduling, indefinite blocking, process management, sustain, turnaround time
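Below is a toy sketch of the combined idea: round-robin service within priority levels plus an aging step that promotes waiting low-priority processes so they are not blocked indefinitely. It illustrates the concept only; it is not the authors' IPRRS algorithm, and the quantum, aging interval, and process set are invented.

```python
# Toy priority + round-robin scheduler with aging; illustrative only.
from collections import deque

def priority_rr(processes, quantum=2, aging_every=10):
    """processes: list of (name, burst, priority); lower value = higher priority."""
    queues = {p: deque() for p in sorted({pr for _, _, pr in processes})}
    top = min(queues)                                  # highest priority level
    for name, burst, pr in processes:
        queues[pr].append([name, burst])
    t, done = 0, {}
    while any(queues.values()):
        pr = min(p for p, q in queues.items() if q)    # highest non-empty level
        proc = queues[pr].popleft()
        run = min(quantum, proc[1])                    # round-robin time slice
        t += run
        proc[1] -= run
        if proc[1] == 0:
            done[proc[0]] = t                          # completion time
        else:
            queues[pr].append(proc)
        if t % aging_every == 0:                       # aging: promote waiters
            for p in sorted(queues):
                if p > top and queues[p]:
                    queues[top].append(queues[p].popleft())
    return done

procs = [("A", 5, 1), ("B", 9, 2), ("C", 4, 3)]        # all arrive at t=0
done = priority_rr(procs)
att = sum(done.values()) / len(done)
awt = sum(done[n] - b for n, b, _ in procs) / len(procs)
print(f"completion: {done}, ATT = {att:.2f}, AWT = {awt:.2f}")
```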
Procedia PDF Downloads 148
16982 The Modelling of Real Time Series Data
Authors: Valeria Bondarenko
Abstract:
We propose algorithms for the estimation of fBm parameters (volatility and Hurst exponent) and for the approximation of random time series by functionals of fBm. We prove the consistency of the estimators that constitute these algorithms and derive the optimal forecast of the approximated time series. The adequacy of the estimation, approximation, and forecasting algorithms is demonstrated by numerical experiments. In the process of creating the software, a system with a hierarchical structure was built. A comparative analysis of the proposed algorithms with other methods gives evidence of the advantage of the approximation method. The results can be used to develop methods for the analysis and modeling of time series describing economic, physical, biological and other processes.
Keywords: mathematical model, random process, Wiener process, fractional Brownian motion
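A compact sketch of one standard way to estimate the Hurst exponent, using the increment-variance scaling Var[B(t+τ) − B(t)] ∝ τ^(2H), applied to simulated ordinary Brownian motion (H = 0.5). This is an illustrative estimator, not necessarily the one proposed in the paper.

```python
# Hurst exponent from the scaling of increment standard deviations.
import numpy as np

def hurst_from_increments(x, lags=range(2, 20)):
    taus = np.array(list(lags))
    sd = np.array([np.std(x[l:] - x[:-l]) for l in taus])   # SD of lag-l increments
    slope, _ = np.polyfit(np.log(taus), np.log(sd), 1)      # log-log regression
    return slope                                            # slope estimates H

rng = np.random.default_rng(5)
bm = np.cumsum(rng.normal(size=10_000))   # ordinary Brownian motion, true H = 0.5
print(f"estimated H: {hurst_from_increments(bm):.2f}")
```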
Procedia PDF Downloads 358
16981 Study on the Carboxymethylation of Glucomannan from Porang
Authors: Fadilah Fadilah, Sperisa Distantina, Santi T. Wijayanti, Rahmawati Andayani
Abstract:
A chemical modification process of glucomannan from porang via carboxymethylation was conducted. The process was done in two stages: alkalization and carboxymethylation. The alkalization was done by adding NaOH solution to the medium containing glucomannan and then stirring it at ambient temperature for thirty minutes. The carboxymethylation was done by adding sodium monochloroacetate solution to the alkalization product. The carboxymethylation process was conducted for a certain time, and the product was then analyzed to determine the degree of substitution. In this research, the influence of the medium on the degree of substitution was studied. Three different media were used, namely water, 70% ethanol, and 90% ethanol. The results show that 70% ethanol was a better medium than the two others because it gave a higher degree of substitution. Using 70% ethanol as the medium, experiments studying the influence of temperature on the carboxymethylation stage were conducted. The results show that the degree of substitution at 65°C is higher than at 45°C.
Keywords: carboxymethylation, degree of substitution, ethanol medium, glucomannan
Procedia PDF Downloads 223
16980 Project Management at University: Towards an Evaluation Process around Cooperative Learning
Authors: J. L. Andrade-Pineda, J.M. León-Blanco, M. Calle, P. L. González-R
Abstract:
Enrollment in current Master's degree programs usually pursues the expertise required in real-life workplaces. The experience we present here concerns the learning process of "Project Management Methodology (PMM)" around a cooperative/collaborative mechanism aimed at affording students measurable learning goals and providing the teacher with the ability to focus on the weaknesses detected. We have designed a mixed summative/formative evaluation, which assures curriculum engagement while enriching the comprehension of PMM key concepts. In this experience we converted the students into active actors in the evaluation process itself and endowed ourselves as teachers with a flexible process in which, along with qualifications (scores), other attitudinal feedback arises. Despite the high level of self-affirmation in their discussions within the interactive assessment sessions, the students ultimately exhibited a great ability to review and correct wrong reasoning when that was the case.
Keywords: cooperative-collaborative learning, educational management, formative-summative assessment, leadership training
Procedia PDF Downloads 169
16979 [Keynote Talk]: Machining Parameters Optimization with Genetic Algorithm
Authors: Dejan Tanikić, Miodrag Manić, Jelena Đoković, Saša Kalinović
Abstract:
This paper deals with the determination of the optimum machining parameters, according to measured and modelled data of the cutting temperature and surface roughness, during the turning of AISI 4140 steel. High cutting temperatures are unwanted occurrences in the metal cutting process; they impact negatively on the quality of the machined part. The machining experiments were performed using different cutting regimes (cutting speed, feed rate and depth of cut) and different values of workpiece hardness, which cause different values of the measured cutting temperature as well as the measured surface roughness. The temperature and surface roughness data were then modelled using Response Surface Methodology (RSM). The obtained RSM models are used in the process of optimizing the cutting regimes using the Genetic Algorithm (GA) tool, which enables the metal cutting process to run under optimum conditions.
Keywords: genetic algorithms, machining parameters, response surface methodology, turning process
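The sketch below pairs an assumed quadratic response-surface model of cutting temperature with a small genetic algorithm (truncation selection, averaging crossover, uniform mutation) searching over the cutting regime; the RSM coefficients and parameter bounds are invented placeholders, not the fitted models for AISI 4140.

```python
# GA minimizing an assumed (placeholder) RSM for cutting temperature.
import random

def rsm_temperature(v, f, d):            # assumed quadratic response surface
    return 300 + 1.2*v + 80*f + 40*d + 0.002*v**2 + 15*f*d

BOUNDS = [(80, 250), (0.05, 0.4), (0.5, 3.0)]   # speed m/min, feed mm/rev, depth mm

def ga(pop_size=40, gens=100, mut=0.1):
    pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: rsm_temperature(*ind))    # lower temp = fitter
        survivors = pop[:pop_size // 2]                    # truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]    # averaging crossover
            for i, (lo, hi) in enumerate(BOUNDS):          # uniform mutation
                if random.random() < mut:
                    child[i] = random.uniform(lo, hi)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda ind: rsm_temperature(*ind))

best = ga()
print("best regime (v, f, d):", [round(x, 3) for x in best])
```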
Procedia PDF Downloads 188
16978 Signal Strength Based Multipath Routing for Mobile Ad Hoc Networks
Authors: Chothmal
Abstract:
In this paper, we present a route discovery process which uses the signal strength on a link as a parameter for its inclusion in the route discovery method. The proposed signal-to-interference and noise ratio (SINR) based multipath reactive routing protocol is named the SINR-MP protocol. The proposed SINR-MP routing protocol has the following two features: a) the SINR-MP protocol selects routes based on the SINR of the links during the route discovery process, so it selects routes which have a long lifetime and a low frame error rate for data transmission; and b) the SINR-MP protocol's route discovery process is multipath, discovering more than one SINR-based route between a given source-destination pair. The multiple routes selected by our SINR-MP protocol are node-disjoint in nature, which increases their robustness against link failures, as the failure of one route will not affect the other route. The secondary route is very useful in situations where the primary route is broken, because we can then use the secondary route without triggering a new route discovery process. Due to this, the network overhead caused by a route discovery process is avoided, which greatly increases network performance. The proposed SINR-MP routing protocol is implemented in the trial version of the network simulator called QualNet.
Keywords: ad hoc networks, quality of service, video streaming, H.264/SVC, multiple routes, video traces
Procedia PDF Downloads 249
16977 The Reuse of Household Waste in Natural Dyeing as a Tool for Upcycling
Authors: Juliana Bastos dos Santos, Francisca Dantas Mendes, Abdul Jabbar Mohammad Khatri, Adam Abdul Jabbar Khatri
Abstract:
This research aims to describe experimentation with color extraction from household waste for the application of the natural vegetable dyeing technique, as a more sustainable option for the upcycling process. Based on case study research, this article intends to record the process of collecting the materials, extracting the colors, and their applicability. The study aims to deepen knowledge about possible alternatives that generate less impact on the environment throughout the process of plant stamping and, also, to spread the concepts of sustainability in fashion. Therefore, this content becomes relevant for valuing an artisanal production process, reconnecting with ancestral knowledge. This article also intends to serve as a record of ancestral artisanal processes, based on the indigenous and African matrices that are pillars of Brazilian culture.
Keywords: natural dyeing, sustainability, organic residue, fashion, reuse
Procedia PDF Downloads 179
16976 Incorporation of Copper for Performance Enhancement in Metal-Oxides Resistive Switching Device and Its Potential Electronic Application
Authors: B. Pavan Kumar Reddy, P. Michael Preetam Raj, Souri Banerjee, Souvik Kundu
Abstract:
In this work, the fabrication and characterization of copper-doped zinc oxide (Cu:ZnO) based memristor devices with aluminum (Al) and indium tin oxide (ITO) metal electrodes are reported. The thin films of Cu:ZnO were synthesized using a low-cost, low-temperature chemical process. The Cu:ZnO was then deposited onto the ITO bottom electrodes using a spin-coating technique, whereas the Al top electrode was deposited using a physical vapor evaporation technique. An ellipsometer was employed to measure the Cu:ZnO thickness, which was found to be 50 nm. Several surface and materials characterization techniques were used to study the thin-film properties of Cu:ZnO. To ascertain the efficacy of Cu:ZnO for memristor applications, electrical characterizations such as current-voltage (I-V), data retention and endurance were obtained, all of which are critical parameters for next-generation memory. The I-V characteristic exhibits switching behavior with asymmetrical hysteresis loops. This work attributes the resistive switching to the positional drift of oxygen vacancies with respect to the Al/Cu:ZnO junction. Further, non-linear curve-fitting regression techniques were utilized to determine the equivalent circuit of the fabricated Cu:ZnO memristors. Efforts were also devoted to establishing its potential for different electronic applications.
Keywords: copper doped, metal-oxides, oxygen vacancies, resistive switching
Procedia PDF Downloads 162
16975 The Effect of Improvement Programs in the Mean Time to Repair and in the Mean Time between Failures on Overall Lead Time: A Simulation Using the System Dynamics-Factory Physics Model
Authors: Marcel Heimar Ribeiro Utiyama, Fernanda Caveiro Correia, Dario Henrique Alliprandini
Abstract:
The importance of the correct allocation of improvement programs has been of growing interest in recent years. Due to their limited resources, companies must ensure that their financial resources are directed to the correct workstations in order to be most effective and survive the strong competition. However, to the best of our knowledge, the literature on the allocation of improvement programs does not analyze this problem in depth when the flow shop process has two capacity constrained resources. This is a research gap which is studied in depth in this work. The purpose of this work is to identify the best strategy for allocating improvement programs in a flow shop with two capacity constrained resources. Data were collected from a flow shop process with seven workstations in an industrial control and automation company, which processes 13,690 units on average per month. The data were used to conduct a simulation with the System Dynamics-Factory Physics model. The main variables considered, due to their importance for lead time reduction, were the mean time between failures and the mean time to repair. Lead time reduction was the output measure of the simulations. Ten different strategies were created: (i) focused time to repair improvement, (ii) focused time between failures improvement, (iii) distributed time to repair improvement, (iv) distributed time between failures improvement, (v) focused time to repair and time between failures improvement, (vi) distributed time to repair and time between failures improvement, (vii) hybrid time to repair improvement, (viii) hybrid time between failures improvement, (ix) time to repair improvement strategy directed towards the two capacity constrained resources, (x) time between failures improvement strategy directed towards the two capacity constrained resources. The ten strategies tested are variations of the three main strategies for improvement programs, named focused, distributed and hybrid. Several comparisons of the effects of the ten strategies on lead time reduction were performed. The results indicated that, for the flow shop analyzed, the focused strategies delivered the best results. When it is not possible to make a large investment in the capacity constrained resources, companies should use hybrid approaches. An important contribution to academia is the hybrid approach, which proposes a new way to direct improvement efforts. In addition, the study of a flow shop with two strong capacity constrained resources (more than 95% utilization) is an important contribution to the literature. Another important contribution is the allocation problem with two CCRs and the possibility of having floating capacity constrained resources. The results provided the best improvement strategies considering the different strategies for allocating improvement programs and different positions of the capacity constrained resources. Finally, it is possible to state that both strategies, hybrid time to repair improvement and hybrid time between failures improvement, delivered better results than the respective distributed strategies. The main limitations of this study regard the flow shop analyzed. Future work can further investigate different flow shop configurations, such as a varying number of workstations, a different number of products, or even different positions of the two capacity constrained resources.
Keywords: allocation of improvement programs, capacity constrained resource, hybrid strategy, lead time, mean time to repair, mean time between failures
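A back-of-the-envelope way to see why allocation matters: availability A = MTBF/(MTBF + MTTR) scales each station's effective rate, and the line rate is limited by its slowest station, so an improvement pays off only if it moves the binding constraint. The numbers below are invented, and the calculation is far simpler than the System Dynamics-Factory Physics simulation used in the study.

```python
# Toy line-rate calculation: availability-scaled rates, bottleneck-limited.
def availability(mtbf, mttr):
    return mtbf / (mtbf + mttr)

def line_rate(stations):
    # each station: (base rate units/h, MTBF h, MTTR h); line = slowest station
    return min(rate * availability(mtbf, mttr) for rate, mtbf, mttr in stations.values())

base    = {"CCR1": (100, 20, 2.0), "CCR2": (100, 20, 2.0)}
focused = {"CCR1": (100, 20, 1.0), "CCR2": (100, 20, 2.0)}  # all effort on CCR1
spread  = {"CCR1": (100, 20, 1.5), "CCR2": (100, 20, 1.5)}  # effort split

print(f"baseline:           {line_rate(base):.1f} u/h")
print(f"focused on one CCR: {line_rate(focused):.1f} u/h")  # other CCR still binds
print(f"distributed:        {line_rate(spread):.1f} u/h")
```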
Procedia PDF Downloads 124
16974 Virtual Metrology for Copper Clad Laminate Manufacturing
Authors: Misuk Kim, Seokho Kang, Jehyuk Lee, Hyunchang Cho, Sungzoon Cho
Abstract:
In semiconductor manufacturing, virtual metrology (VM) refers to methods that predict the properties of a wafer based on machine parameters and sensor data of the production equipment, without performing the (costly) physical measurement of the wafer properties (Wikipedia). Additional benefits include the avoidance of human bias and the identification of important factors affecting the quality of the process, which allow improving the process quality in the future. It is, however, rare to find VM applied to other areas of manufacturing. In this work, we propose to apply VM to copper clad laminate (CCL) manufacturing. CCL is a core element of a printed circuit board (PCB), which is used in smartphones, tablets, digital cameras, and laptop computers. The manufacturing of CCL consists of three processes: treating, lay-up, and pressing. Treating, the most important process among the three, coats glass cloth with resin, heats it in a drying oven, and then produces prepreg for the lay-up process. In this process, three important quality factors are inspected: treated weight (T/W), minimum viscosity (M/V), and gel time (G/T). They are inspected manually, incurring heavy costs in terms of time and money, which makes the process a good candidate for VM application. We developed prediction models of the three quality factors T/W, M/V, and G/T, respectively, using process variables, raw material variables, and environment variables. The actual process data were obtained from a CCL manufacturer. A variety of variable selection methods and learning algorithms were employed to find the best prediction model. We obtained prediction models of M/V and G/T with high enough accuracy. They also provided us with information on "important" predictor variables, some of which the process engineers had already been aware of, and the rest of which they had not. The engineers were quite excited to find the new insights that the models revealed and set out to do further analysis on them to gain process control implications. T/W turned out not to be predictable with reasonable accuracy from the given factors. This very fact indicates that the factors currently monitored may not affect T/W; thus, an effort has to be made to find other factors that are not currently monitored, in order to understand the process better and improve its quality. In conclusion, the VM application to CCL's treating process was quite successful. The newly built quality prediction models allowed one to reduce the cost associated with actual metrology as well as reveal some insights into the factors affecting the important quality factors and into the level of our less-than-perfect understanding of the treating process.
Keywords: copper clad laminate, predictive modeling, quality control, virtual metrology
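As a hedged sketch of a VM-style quality model outside the paper's actual toolchain, the snippet below predicts a quality factor (a stand-in for M/V) from process and sensor variables with an L1-penalized linear model, then reads the surviving coefficients as the "important" predictors. The data are simulated, not the manufacturer's.

```python
# VM-style quality prediction with embedded variable selection; simulated data.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(11)
n, p = 500, 20
X = rng.normal(size=(n, p))                        # process/sensor variables
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=n)  # M/V proxy

model = make_pipeline(StandardScaler(), LassoCV(cv=5)).fit(X, y)
coefs = model[-1].coef_                            # coefficients after selection
important = np.flatnonzero(np.abs(coefs) > 0.1)
print("important predictor indices:", important)
print(f"R^2 on training data: {model.score(X, y):.3f}")
```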
Procedia PDF Downloads 350
16973 Benefit of Waste Collection Route Optimisation
Authors: Bojana Tot, Goran BošKović, Goran Vujić
Abstract:
Route optimisation is the process of planning one or multiple routes with the purpose of minimizing overall costs while achieving the highest possible performance under a set of given constraints. It combines routing, or route planning, which is the process of creating the most cost-effective route by minimizing the distance or travel time necessary to reach a set of planned stops, and route scheduling, which is the process of assigning an arrival and service time to each stop, with drivers being given shifts that adhere to their working hours. The objective of this paper is to present the benefits of implementing waste collection route optimisation and thus achieve economic efficiency for public utility companies, better service for citizens, and positive environmental and health outcomes.
Keywords: waste management, environment, collection route optimisation, GIS
Procedia PDF Downloads 161
16972 Inter-Communication-Management in Cases with Disabled Children (ICDC)
Authors: Dena A. Hussain
Abstract:
The objective of this project is to design an Information and Communication Technologies (ICT) tool based on a standardized platform to assist the work-integrated learning process of caretakers of disabled children. The tool should assist intercommunication between caretakers and improve the learning process through knowledge bridging between all the caretakers involved. Some children are born with disabilities, while others have special needs after an illness or accident. Special needs children often need help in their learning process and require tools and services in a different way. In some cases the child has multiple disabilities that affect several capabilities in different ways. These needs are to be transformed into different learning techniques that the staff or personnel (called caretakers in this project) caring for the child need to learn and adapt to. The caretakers involved are also required to learn new learning or training techniques and utilities specialized for the child's needs. In many cases the number of people caring for the child's development is rather large: the parents, specialist pedagogues, teachers, therapists, psychologists, personal assistants, etc. Each group of specialists has different objectives, and in some cases the merging of these specializations is unique. This makes synchronization between different caretakers difficult, often resulting in low-level cooperation. Better intercommunication between professions could improve not only the child's development but also the caretakers' methods and their knowledge of each other's work processes and their own professions. This introduces a unique work-integrated learning environment for all the personnel involved, merging learning and knowledge in the work environment while assisting the children's development process. Creating an iterative process generates a unique learning experience for all involved. Using a work-integrated platform will help encourage and support the process for all the teams involved. We believe that working with children who have special needs is a continuous learning/working process that is always integrated to achieve one main goal, which is to make a better future for all children.
Keywords: information and communication technologies (ICT), work integrated learning (WIL), sustainable learning, special needs children
Procedia PDF Downloads 294
16971 Cultural Analysis of Dowry System with Relation to Women Prestige in District Swabi
Authors: Ullah Aman
Abstract:
The practice of giving dowry was meant to assist a newly-wed couple to start their life together with ease; however, it has now turned into a commercial transaction in which monetary considerations receive priority over the personal merits of the bride. The present study was designed to explore the causes and consequences of the dowry system and to measure the association between dowry and women's prestige in district Swabi. A sample of 378 household (female) respondents was randomly selected through the proportional allocation method. The data were interpreted as frequencies and percentages, while the chi-square statistic was applied to measure the relationship between the dowry system and women's prestige. Results indicated that a majority of the respondents, 316 (83.6%), stated that family members are in favor of high dowry, and 302 (79.9%) of the sample disclosed that high dowry is seen as today's women's need. In addition, most of the respondents, 320 (84.7%), opined that low dowry leads to broken families in the society. Moreover, a strong association (p=0.000) was determined between high dowry and women's prestige. Similarly, a strongly significant relation (p=0.000) was found between women's prestige and low dowry, with low dowry mortifying women's prestige in our society. The study concluded that dowry, deeply rooted in the society, is a social evil which must be strictly banned in the country. It is a herculean task to completely eliminate the dowry system from the nation, but slowly and gradually efforts are being made in this direction. Dowry promotes intense conflicts and quarrels between families and inculcates greed in the society. The government and the builders of the social fabric should strictly work on banning this system for each and every class in Pakistan. Moreover, raising social awareness in the society to curb this malpractice is among the recommendations.
Keywords: dowry, women, prestige, causes
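The association test reported above is a standard chi-square test of independence on a contingency table; the 2×2 counts below are invented for illustration and do not reproduce the study's data.

```python
# Chi-square test of independence on an invented 2x2 contingency table.
from scipy.stats import chi2_contingency

# rows: high vs low dowry; columns: high vs low perceived women's prestige
table = [[150, 40],
         [60, 128]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```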
Procedia PDF Downloads 153