Search results for: sequential linear programming
3198 Two-Phase Sampling for Estimating a Finite Population Total in Presence of Missing Values
Authors: Daniel Fundi Murithi
Abstract:
Missing data is a real bane in many surveys. To overcome the problems caused by missing data, partial deletion and single imputation methods, among others, have been proposed. However, problems such as discarding usable data and inaccuracy in reproducing known population parameters and standard errors are associated with them. For regression and stochastic imputation, it is assumed that there is a variable with complete cases to be used as a predictor in estimating missing values in the other variable, and that the relationship between the two variables is linear, which might not be realistic in practice. In this project, we estimate the population total in the presence of missing values in two-phase sampling. Instead of regression or stochastic models, a nonparametric model-based regression model is used in imputing missing values. An empirical study showed that nonparametric model-based regression imputation is better at reproducing the variance of the population total estimate obtained when there were no missing values, compared to mean, median, regression, and stochastic imputation methods. Although regression and stochastic imputation were better than nonparametric model-based imputation at reproducing the population total estimates obtained when there were no missing values in one of the sample sizes considered, nonparametric model-based imputation may be used when the relationship between the outcome and predictor variables is not linear.
Keywords: finite population total, missing data, model-based imputation, two-phase sampling
3197 Experiments on Weakly-Supervised Learning on Imperfect Data
Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler
Abstract:
Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and an accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data; e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation
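A minimal sketch of the label-noise experiment described above, assuming an invented synthetic dataset, feature count, and 40% error rate (none of which are the study's clinical data): a linear SVM is trained on corrupted labels and evaluated against the clean ground truth.

```python
# Sketch of weakly-supervised learning on imperfectly labeled data.
# All parameters (sample size, error rate, features) are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X, y_true = make_classification(n_samples=5000, n_features=20, random_state=0)

# Corrupt 40% of the training labels to mimic an imperfect "gold standard".
error_rate = 0.40
X_tr, X_te, y_tr, y_te = train_test_split(X, y_true, test_size=0.3, random_state=0)
flip = rng.random(len(y_tr)) < error_rate
y_noisy = np.where(flip, 1 - y_tr, y_tr)

model = LinearSVC(C=1.0, max_iter=10000).fit(X_tr, y_noisy)
pred = model.predict(X_te)

print(f"label accuracy of the training data: {1 - error_rate:.2f}")
print(f"model accuracy on clean test labels: {accuracy_score(y_te, pred):.2f}")
# With enough roughly linearly separable data, the model can exceed the 60% label accuracy.
```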
3196 An Overbooking Model for Car Rental Service with Different Types of Cars
Authors: Naragain Phumchusri, Kittitach Pongpairoj
Abstract:
Overbooking is a very useful revenue management technique that can help reduce the costs caused by either undersales or oversales. In this paper, we propose an overbooking model for two types of cars that minimizes the total cost for a car rental service. With two types of cars, there is the possibility of upgrading the lower type to the upper type. This makes the model more complex than the single-car-type scenario. We have found that convexity can be proved in this case. Sensitivity analysis is conducted to observe the effects of relevant parameters on the optimal solution. Model simplification is proposed using multiple linear regression analysis, which can help estimate the optimal overbooking level using appropriate independent variables. The results show that the overbooking level from the multiple linear regression model is relatively close to the optimal solution (with an adjusted R-squared value of at least 72.8%). To evaluate the performance of the proposed model, the total cost was compared with the case where the decision maker uses a naïve method for the overbooking level. It was found that the total cost from the optimal solution is only 0.5 to 1 percent (on average) lower than the cost from the regression model, while it is approximately 67% lower than the cost obtained by the naïve method. This indicates that our proposed simplification method using regression analysis can perform effectively in estimating the overbooking level.
Keywords: overbooking, car rental industry, revenue management, stochastic model
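As a flavour of the cost comparison described above, the sketch below simulates the expected cost of a candidate overbooking level for a single car type under random no-shows and searches for the cost-minimizing level; all probabilities and cost figures are assumptions, and the paper's two-type upgrade logic is omitted.

```python
# Illustrative single-car-type overbooking sketch (assumed parameters; the paper's
# two-type model with upgrades is more involved).
import numpy as np

rng = np.random.default_rng(1)
capacity = 50          # cars available
p_show = 0.85          # probability a booking shows up (assumption)
c_under = 40.0         # cost per unused car (undersale)
c_over = 120.0         # cost per denied customer (oversale)

def expected_cost(bookings, n_sim=20000):
    shows = rng.binomial(bookings, p_show, size=n_sim)
    under = np.maximum(capacity - shows, 0)
    over = np.maximum(shows - capacity, 0)
    return np.mean(c_under * under + c_over * over)

levels = np.arange(capacity, capacity + 21)
costs = [expected_cost(b) for b in levels]
best = levels[int(np.argmin(costs))]
print(f"cost-minimizing number of bookings to accept: {best}")
```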
3195 A Framework for SQL Learning: Linking Learning Taxonomy, Cognitive Model and Cross Cutting Factors
Authors: Huda Al Shuaily, Karen Renaud
Abstract:
Databases comprise the foundation of most software systems, and system developers inevitably write code to query these databases. The de facto language for querying is SQL and this, consequently, is the default language taught by higher education institutions. There is evidence that learners find it hard to master SQL, harder than mastering other programming languages such as Java. Educators do not agree on explanations for this seeming anomaly, and further investigation may well reveal the reasons. In this paper, we report on our investigations into how novices learn SQL, the actual problems they experience when writing SQL, and the differences between expert and novice SQL query writers. We conclude by presenting a model of SQL learning that should better inform the design of instructional material to support the SQL learning process.
Keywords: pattern, SQL, learning, model
3194 FRP Bars Spacing Effect on Numerical Thermal Deformations in Concrete Beams under High Temperatures
Authors: A. Zaidi, F. Khelifi, R. Masmoudi, M. Bouhicha
Abstract:
In order to eradicate the degradation of reinforced concrete structures due to steel corrosion, construction professionals suggest using fiber reinforced polymers (FRP) for their excellent properties. Nevertheless, high temperatures may affect the bond between the FRP bar and the concrete, and consequently the serviceability of FRP-reinforced concrete structures. This paper presents a nonlinear numerical investigation, using the ADINA software, of the effect of the spacing between glass FRP (GFRP) bars embedded in concrete on circumferential thermal deformations and the distribution of radial thermal cracks in reinforced concrete beams subjected to high temperature variations up to 60 °C for asymmetrical problems. The thermal deformations predicted from the nonlinear finite element model, at the FRP bar/concrete interface and at the external surface of the concrete cover, were established as a function of the ratio of concrete cover thickness to FRP bar diameter (c/db) and the ratio of spacing between FRP bars in concrete to FRP bar diameter (e/db). Numerical results show that the circumferential thermal deformations at the external surface of the concrete cover are linear up to a cracking thermal load that varied from 32 to 55 °C as the ratio e/db varied from 1.3 to 2.3, respectively. However, for ratios e/db > 2.3 and c/db > 1.6, the thermal deformations at the external surface of the concrete cover exhibit linear behavior without any cracks observed on the specified surface. The numerical results are compared to those obtained from analytical models validated by experimental tests.
Keywords: concrete beam, FRP bars, spacing effect, thermal deformation
3193 Teaching Pragmatic Coherence in Literary Text: Analysis of Chimamanda Adichie’s Americanah
Authors: Joy Aworo-Okoroh
Abstract:
Literary texts are mirrors of real-life situations. Thus, authors choose the linguistic items that best encode their intended meanings and messages. However, words mean more than they seem: the meaning of words is not static; rather, it is dynamic, as they constantly enter into relationships within a context. Literary texts can only be meaningful if all pragmatic cues are identified and interpreted. Drawing upon Teun van Dijk's theory of local pragmatic coherence, it is established that words enter into relations in a text, and these relations account for sequential speech acts in the text. Comprehension of the text is dependent on the interpretation of these relations. To show the relevance of pragmatic coherence in literary text analysis, ten conversations were selected from Americanah in order to give a clear idea of the pragmatic relations used. The conversations were analysed to identify the speech act and epistemic relations inherent in them. A subtle analysis of the structure of the conversations was also carried out. It was discovered that justification is the most commonly used relation, and the meaning of the text depends on the interpretation of the pragmatic coherence of these instances. The study concludes that to teach literature in English effectively, pragmatic coherence should be incorporated, as words mean more than they say.
Keywords: pragmatic coherence, epistemic coherence, speech act, Americanah
3192 Inclined Convective Instability in a Porous Layer Saturated with Non-Newtonian Fluid
Authors: Rashmi Dubey
Abstract:
The study aims at investigating the onset of thermal convection in an inclined porous layer saturated with a non-Newtonian fluid. The layer is infinitely extended and has a finite width confined between two boundaries with constant pressure conditions, where the lower one is maintained at a higher temperature. Over the years, this area of research has attracted many scientists and researchers, for it has a plethora of applications in science and engineering, such as in civil engineering, geothermal sites, petroleum industries, etc. Considering the possibilities in a practical scenario, an inclined porous layer is considered, which can be used to develop a generalized model applicable to any inclination. Using the isobaric boundaries, the hydrodynamic boundary conditions are derived for the power-law model and are used to obtain the basic state flow. The convection in the basic state flow is driven by the thermal buoyancy in the flow system and is carried further by the hydrodynamic boundaries. A linear stability analysis followed by a normal-mode analysis is performed to investigate the onset of convection in the buoyancy-driven flow. The analysis shows that the convective instability is always initiated by the non-traveling modes for the Newtonian fluid, but prevails in the form of oscillatory modes up to a certain inclination of the porous layer. However, different behavior is observed for dilatant and pseudoplastic fluids.
Keywords: thermal convection, linear stability, porous media flow, inclined porous layer
3191 Integrated Formulation of Project Scheduling and Material Procurement Considering Different Discount Options
Authors: Babak H. Tabrizi, Seyed Farid Ghaderi
Abstract:
On-time availability of materials on construction sites plays an outstanding role in the successful achievement of a project's deliverables. Thus, this paper investigates the simultaneous formulation of project scheduling and material procurement through a mixed-integer programming model, aiming to minimize the penalty (or maximize the reward) for delivering the project and to minimize material holding, ordering, and procurement costs. We take both all-units and incremental discount possibilities into consideration to provide more flexibility on the procurement side with regard to real-world conditions. Finally, the applicability and efficiency of the mathematical model are tested on different numerical examples.
Keywords: discount strategies, material purchasing, project planning, project scheduling
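A hedged sketch of how an all-units discount can be embedded in a small mixed-integer purchasing model using the open-source PuLP library; the brackets, prices, and single-material focus are illustrative assumptions, and the scheduling side of the integrated model is not shown.

```python
# Minimal all-units discount purchasing sketch with PuLP (assumed data; the paper's
# integrated scheduling/procurement model and incremental-discount case are omitted).
import pulp

demand = 120                      # units of material required (assumption)
# (lower bound, upper bound, unit price) for each all-units discount bracket
brackets = [(0, 49, 10.0), (50, 99, 9.0), (100, 500, 8.0)]

prob = pulp.LpProblem("all_units_discount", pulp.LpMinimize)
y = [pulp.LpVariable(f"use_bracket_{i}", cat="Binary") for i in range(len(brackets))]
q = [pulp.LpVariable(f"qty_bracket_{i}", lowBound=0) for i in range(len(brackets))]

# Exactly one bracket is active, and the ordered quantity must fall inside it.
prob += pulp.lpSum(y) == 1
for i, (lo, hi, _) in enumerate(brackets):
    prob += q[i] >= lo * y[i]
    prob += q[i] <= hi * y[i]

prob += pulp.lpSum(q) >= demand                                        # meet the demand
prob += pulp.lpSum(p * q[i] for i, (_, _, p) in enumerate(brackets))   # total purchase cost

prob.solve()
print("total cost:", pulp.value(prob.objective))
```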
3190 An Excel-Based Educational Platform for Design Analyses of Pump-Pipe Systems
Authors: Mohamed M. El-Awad
Abstract:
This paper describes an educational platform for design analyses of pump-pipe systems using Microsoft Excel, its Solver add-in, and the associated VBA programming language. The paper demonstrates the capabilities of the Excel-based platform, which suits the iterative nature of the design process better than the use of design charts and data tables. While VBA is used to develop a user-defined function for determining the standard pipe diameter, Solver is used for optimising the pipe diameter of the pipeline and for determining the operating point of the selected pump.
Keywords: design analyses, pump-pipe systems, Excel, Solver, VBA
3189 Research of Possibilities to Influence the Metal Cross-Section Deformation during Cold Rolling with the Help of Local Deformation Zone Creation
Authors: A. Pesin, D. Pustovoytov, A. Kolesnik, M. Sverdlik
Abstract:
Rolling disturbances often arise which may lead to defects such as non-flatness, warpage, corrugation, etc. Numerous methods of compensating for such disturbances are well known. However, most of them preserve the initial form of transverse flow of the strip, such as convex, concave or asymmetric (for example, sphenoid). Sometimes, the inherited form (especially an asymmetric one) is undesirable. Technical solutions have been developed which include providing conditions for transverse metal flow in the deformation zone. It should be noted that greater reduction is followed by an increase in transverse flow, while less reduction causes a corresponding decrease in metal flow, so that differently deformed metal lengths remain approximately the same and the defects mentioned above are avoided. One of the solutions suggests sequentially deforming the strip from a rectangular cross-section profile into one with periodic rectangular grooves and back into a rectangular profile again. The work was carried out in the DEFORM 3D software package. Experimental rolling was performed on laboratory mill 150. Comparison of experimental and theoretical results demonstrated good correlation.
Keywords: FEM, cross-section deformation, mechanical engineering, applied mechanics
3188 Patient Reported Outcome Measures Post Implant Based Reconstruction Basildon Hospital
Authors: Danny Fraser, James Zhang
Abstract:
Aim of the study: Our study aims to identify any statistically significant evidence relating to PROMs for mastectomy and implant-based reconstruction in order to guide future surgical management. Method: The demographic, pre- and post-operative treatment and implant characteristics were collected for all patients at Basildon Hospital who underwent breast reconstruction from 2017-2023. We used the BREAST-Q psychosocial well-being, physical well-being, and satisfaction with breasts scales. An independent t-test was conducted for each group, and linear regression of age and implant size. Results: 69 patients were contacted, and 39 PROMs were returned. The mean age of patients was 57.6. 40% had smoked before, and 40.8% had a BMI > 30. 29 had pre-pectoral placement, and 40 had subpectoral placement. 17 had smooth implants, and 52 textured. Subpectoral placement was associated with higher psychosocial scores than pre-pectoral (75.7 vs. 61.9, p=0.046), and textured implants were associated with a lower physical score than the smooth surface (34.7 vs. 50.2, p=0.046). On linear regression, age was positively associated with psychosocial score (p=0.007). Conclusion: We present a large cohort of patients who underwent breast reconstruction. Understanding the PROMs of these procedures can guide clinicians, patients and policy makers to be more informed about the course of rehabilitation after these operations. Significance: We have found that, from a patient perspective, subpectoral implant placement was associated with a statistically significant improvement in psychosocial scores.
Keywords: breast surgery, mastectomy, breast implants, oncology
3187 Deconstructing Local Area Networks Using MaatPeace
Authors: Gerald Todd
Abstract:
Recent advances in random epistemologies and ubiquitous theory have paved the way for web services. Given the current status of linear-time communication, cyberinformaticians compellingly desire the exploration of link-level acknowledgements. In order to realize this purpose, we concentrate our efforts on disconfirming that DHTs and model checking are mostly incompatible.
Keywords: LAN, cyberinformatics, model checking, communication
3186 Electric Vehicle Fleet Operators in the Energy Market - Feasibility and Effects on the Electricity Grid
Authors: Benjamin Blat Belmonte, Stephan Rinderknecht
Abstract:
The transition to electric vehicles (EVs) stands at the forefront of innovative strategies designed to address environmental concerns and reduce fossil fuel dependency. As the number of EVs on the roads increases, so too does the potential for their integration into energy markets. This research dives deep into the transformative possibilities of using electric vehicle fleets, specifically electric bus fleets, not just as consumers but as active participants in the energy market. This paper investigates the feasibility and grid effects of electric vehicle fleet operators in the energy market. Our objective centers around a comprehensive exploration of the sector coupling domain, with an emphasis on the economic potential in both electricity and balancing markets. Methodologically, our approach combines data mining techniques with thorough pre-processing, pulling from a rich repository of electricity and balancing market data. Our findings are grounded in the actual operational realities of the bus fleet operator in Darmstadt, Germany. We employ a Mixed Integer Linear Programming (MILP) approach, with the bulk of the computations being processed on the High-Performance Computing (HPC) platform ‘Lichtenbergcluster’. Our findings underscore the compelling economic potential of EV fleets in the energy market. With electric buses becoming more prevalent, the considerable size of these fleets, paired with their substantial battery capacity, opens up new horizons for energy market participation. Notably, our research reveals that economic viability is not the sole advantage. Participating actively in the energy market also translates into pronounced positive effects on grid stabilization. Essentially, EV fleet operators can serve a dual purpose: facilitating transport while simultaneously playing an instrumental role in enhancing grid reliability and resilience. This research highlights the symbiotic relationship between the growth of EV fleets and the stabilization of the energy grid. Such systems could lead to both commercial and ecological advantages, reinforcing the value of electric bus fleets in the broader landscape of sustainable energy solutions. In conclusion, the electrification of transport offers more than just a means to reduce local greenhouse gas emissions. By positioning electric vehicle fleet operators as active participants in the energy market, there lies a powerful opportunity to drive forward the energy transition. This study serves as a testament to the synergistic potential of EV fleets in bolstering both economic viability and grid stabilization, signaling a promising trajectory for future sector coupling endeavors.
Keywords: electric vehicle fleet, sector coupling, optimization, electricity market, balancing market
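The sketch below gives a heavily simplified flavour of such an optimization: a small linear program (via PuLP) schedules overnight charging of a bus fleet against hourly electricity prices, subject to an energy requirement and a grid connection limit. Prices, fleet energy demand, and limits are invented, and the balancing-market side of the study is not modeled.

```python
# Simplified charging-schedule LP for an electric bus fleet (illustrative data only;
# the study's MILP additionally covers balancing-market participation and fleet operation).
import pulp

prices = [0.30, 0.28, 0.22, 0.18, 0.15, 0.17, 0.25, 0.32]  # EUR/kWh per hour (assumed)
energy_needed = 900.0   # kWh the fleet must charge overnight (assumed)
max_power = 250.0       # kW grid connection limit (assumed)

prob = pulp.LpProblem("fleet_charging", pulp.LpMinimize)
charge = [pulp.LpVariable(f"kWh_h{h}", lowBound=0, upBound=max_power)
          for h in range(len(prices))]

prob += pulp.lpSum(prices[h] * charge[h] for h in range(len(prices)))  # energy cost
prob += pulp.lpSum(charge) == energy_needed                            # demand met

prob.solve()
plan = [round(pulp.value(c), 1) for c in charge]
print("hourly charging plan (kWh):", plan)
print("total cost (EUR):", round(pulp.value(prob.objective), 2))
```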
3185 Simplified Linear Regression Model to Quantify the Thermal Resilience of Office Buildings in Three Different Power Outage Day Times
Authors: Nagham Ismail, Djamel Ouahrani
Abstract:
Thermal resilience in the built environment reflects a building's capacity to adapt to extreme climate changes. In hot climates, power outages in office buildings pose risks to the health and productivity of workers. It is therefore of interest to quantify the thermal resilience of office buildings by developing a user-friendly simplified model. This simplified model begins with creating an assessment metric of thermal resilience that measures the duration between the power outage and the point at which the thermal habitability condition is compromised, considering different power interruption times (morning, noon, and afternoon). In this context, energy simulations of an office building are conducted for Qatar's summer weather by changing different parameters related to (i) wall characteristics, (ii) glazing characteristics, (iii) load, (iv) orientation and (v) air leakage. The simulation results are processed using SPSS to derive linear regression equations, aiding stakeholders in evaluating the performance of commercial buildings during different power interruption times. The findings reveal the significant influence of glazing characteristics on thermal resilience, with the morning power outage scenario posing the most detrimental impact in terms of the shortest duration before thermal resilience is compromised.
Keywords: thermal resilience, thermal envelope, energy modeling, building simulation, thermal comfort, power disruption, extreme weather
3184 Economic Analysis of a Carbon Abatement Technology
Authors: Hameed Rukayat Opeyemi, Pericles Pilidis, Pagone Emmanuele, Agbadede Roupa, Allison Isaiah
Abstract:
Climate change represents one of the single most challenging problems facing the world today. According to the National Oceanic and Atmospheric Administration, atmospheric temperature has risen almost 25% since 1958, Arctic sea ice has shrunk 40% since 1959, and global sea levels have risen more than 5.5 cm since 1990. Power plants are the major culprits of GHG emission to the atmosphere. Several technologies have been proposed to reduce the amount of GHG emitted to the atmosphere from power plants, one of which is the less researched advanced zero-emission power plant. The advanced zero-emission power plant makes use of a mixed conductive membrane (MCM) reactor, also known as an oxygen transfer membrane (OTM), for oxygen transfer. The MCM employs a membrane separation process. The membrane separation process was first introduced in 1899 when Walter Hermann Nernst investigated electric current between metals and solutions. He found that when a dense ceramic is heated, a current of oxygen molecules moves through it. In the bid to curb the amount of GHG emitted to the atmosphere, the membrane separation process was applied to the field of power engineering in the low-carbon cycle known as the advanced zero-emission power plant (AZEP) cycle. The AZEP cycle was originally invented by Norsk Hydro, Norway and ABB Alstom Power (now known as Demag Delaval Industrial Turbomachinery AB), Sweden. The AZEP drew a lot of attention because of its ability to capture ~100% CO2; it also boasts about 30-50% cost reduction compared to other carbon abatement technologies, the penalty in efficiency is not as large as for its counterparts, and it offers almost zero NOx emissions due to very low nitrogen concentrations in the working fluid. The advanced zero-emission power plant differs from a conventional gas turbine in the sense that its combustor is substituted with the mixed conductive membrane (MCM) reactor. The MCM reactor is made up of the combustor, the low-temperature heat exchanger (LTHX, referred to by some authors as the air preheater), the mixed conductive membrane responsible for oxygen transfer, the high-temperature heat exchanger and, in some layouts, the bleed gas heat exchanger. Air is taken in by the compressor and compressed to a temperature of about 723 K and a pressure of 2 MPa. The membrane area needed for oxygen transfer is reduced by increasing the temperature of 90% of the air using the LTHX; the temperature is also increased to facilitate oxygen transfer through the membrane. The air stream enters the LTHX through the transition duct leading to the inlet of the LTHX. The temperature of the air stream is then increased to about 1150 K depending on the design-point specification of the plant and the efficiency of the heat exchanging system. The amount of oxygen transported through the membrane is directly proportional to the temperature of the air going through the membrane. The AZEP cycle was developed using the Fortran software, and economic analysis was conducted using Excel and Matlab, followed by an optimization case study of four possible layouts: the simple bleed gas heat exchange layout (100% CO2 capture), the bleed gas heat exchanger layout with flue gas turbine (100% CO2 capture), the pre-expansion reheating layout (sequential burning layout) - AZEP 85% (85% CO2 capture), and the pre-expansion reheating layout (sequential burning layout) with flue gas turbine - AZEP 85% (85% CO2 capture).
This paper discusses Monte Carlo risk analysis of these four possible layouts of the AZEP cycle.
Keywords: gas turbine, global warming, greenhouse gas, fossil fuel power plants
3183 Neuron Efficiency in Fluid Dynamics and Prediction of Groundwater Reservoirs' Properties Using Pattern Recognition
Authors: J. K. Adedeji, S. T. Ijatuyi
Abstract:
The application of a neural network using pattern recognition to study fluid dynamics and predict the properties of groundwater reservoirs has been used in this research. Conventional manual geophysical survey methods have proved inadequate in basement environments; hence, intelligent computing such as neural network prediction becomes inevitable. A non-linear neural network with an 8-bit XOR (exclusive OR) output configuration has been used in this research to predict the nature of groundwater reservoirs and the fluid dynamics of a typical basement crystalline rock. The control variables are the apparent resistivity of the weathered layer (p1), the fractured layer (p2), and the depth (h), while the dependent variable is the flow parameter (F=λ). The algorithm used in training the neural network is back-propagation, coded in the C++ language and run for 300 epochs. The neural network was able to map out the flow channels and detect how they behave to form viable storage within the strata. The neural network model showed that an important variable, gr (gravitational resistance), can be deduced from the elevation and the apparent resistivity pa. The model results from SPSS showed that the coefficients a, b and c are statistically significant, with a reduced standard error at 5%.
Keywords: gravitational resistance, neural network, non-linear, pattern recognition
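For readers unfamiliar with the training procedure named above, here is a minimal back-propagation example on the two-input XOR problem in NumPy; the study's network instead uses an 8-bit XOR output configuration, resistivity and depth inputs, and a C++ implementation, so the snippet below only illustrates the mechanics.

```python
# Tiny XOR network trained with back-propagation (illustrative only; the study's
# network uses an 8-bit XOR output configuration and hydrogeological inputs).
import numpy as np

rng = np.random.default_rng(42)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)     # 2 inputs -> 4 hidden units
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)     # 4 hidden units -> 1 output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 2.0

for epoch in range(300):                           # 300 epochs, as in the abstract
    h = sigmoid(X @ W1 + b1)                       # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)            # backward pass (squared-error loss)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

# Outputs drift toward [0, 1, 1, 0]; more epochs may be needed for a sharp fit.
print(np.round(out.ravel(), 2))
```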
3182 Achievement of High L-Cysteine Yield from Enzymatic Conversion Using Eutectic Mixtures of the Substrate ATC
Authors: Deokyeong Choe, Sung Hun Youn, Younggon Kim, Chul Soo Shin
Abstract:
L-Cysteine, a sulfur-containing amino acid, is often used in the pharmaceutical, cosmetic, food, and feed additive industries. This amino acid has usually been produced by acid hydrolysis of human hair and poultry feathers. There are many problems with this route, such as reluctance to use animal hair, low yields, and the formation of harmful waste material. As an alternative, the enzymatic conversion of D,L-2-amino-Δ2-thiazoline-4-carboxylic acid (ATC) to L-cysteine has been developed as an environmentally friendly method. However, the substrate solubility was too low for industrial use. In this study, high concentrations of eutectic substrate solutions were prepared to solve the problem. Eutectic melting occurred at 39 °C after mixing ATC and malonic acid at a molar ratio of 1:1. The characteristics of the eutectic mixtures were analyzed by FE-SEM, EDS mapping, and XPS. However, since sorbitol, MnSO4, and NaOH should be added as supplements to the substrate mixture for the activation and stabilization of the enzyme, strategies for the sequential addition of all five compounds, ATC, malonic acid, sorbitol, MnSO4, and NaOH, were established. As a result, eutectic substrate mixtures of 670 mM ATC were successfully formulated. After 6 h of enzymatic reaction, 550 mM L-cysteine was produced.
Keywords: D,L-2-amino-Δ2-thiazoline-4-carboxylic acid, enzymatic conversion, eutectic solution, L-cysteine
3181 A Mixed Methods Study Aimed at Exploring the Conceptualization of Orthorexia Nervosa on Instagram
Authors: Elena V. Syurina, Sophie Renckens, Martina Valente
Abstract:
Objective: The objective of this study was to investigate the nature of the conversation around orthorexia nervosa (ON) on Instagram. Methods: The study was conducted using mixed methods, combining a concurrent triangulation and sequential explanatory design. First, 3027 pictures posted on Instagram using #Orthorexia were analyzed. Then, a questionnaire about Instagram use related to ON was completed in full by 185 respondents. These two quantitative data sources were statistically analyzed and then triangulated. Finally, 9 interviews were conducted to investigate more deeply what is being said about ON on Instagram and what the motivations to post about it are. Results: Four main categories of pictures were found to be represented in Instagram posts about ON: ‘food’, ‘people’, ‘text’, and ‘other’. Savory and unprocessed food was most highly represented within the food category, and pictures of people were mostly pictures of the account holder. People who self-identify as having ON were more likely to post about ON, and they were significantly more likely to post about ‘food’, ‘people’ and ‘text’. The goal of the posts was to raise awareness around ON, as well as to provide support for people who believe they are suffering from it. Conclusion: Since the conversation around ON on Instagram is supportive, it could be beneficial to consider Instagram use in the treatment of ON. However, more research is needed on a larger scale.
Keywords: orthorexia nervosa, Instagram, social media, disordered eating
3180 Virtua-Gifted and Non-Gifted Students’ Motivation toward Virtual Flipped Learning from L2 Motivational Self-System Lense
Authors: Kamal Heidari
Abstract:
COVID-19 has had drastic effects on different areas of society, including education, in that it brought virtual education to the center of attention as an alternative to in-person education. In virtual education, the importance of flipped learning doubles, as students are supposed to take the main responsibility for the teaching/learning process, while teachers play a merely facilitative/monitoring role. Given the students' responsibility in virtual flipped learning, students' motivation plays a pivotal role in the effectiveness of this learning method. The L2 Motivational Self-System (L2MSS) is a recently proposed model elaborating on students' motivation based on three sub-components: the ideal L2 self, the ought-to L2 self, and the L2 learning experience. Drawing on an exploratory sequential mixed-methods research design, this study probed the effect of virtual flipped learning (via the SHAD platform) on the motivation of 112 gifted and non-gifted students based on the L2MSS. The study uncovered that, although virtual flipped learning improved both gifted and non-gifted students' motivation, it affected their motivation differentially. In other words, gifted students mostly referred to the ideal L2 self, while non-gifted ones referred to the ought-to L2 self and L2 learning experience aspects of motivation.
Keywords: virtual flipped learning, giftedness, motivation, L2MSS
3179 Object-Oriented Modeling Simulation and Control of Activated Sludge Process
Authors: J. Fernandez de Canete, P. Del Saz Orozco, I. Garcia-Moral, A. Akhrymenka
Abstract:
Object-oriented modeling is spreading in current simulation of wastewater treatment plants through the use of the individual components of the process and their relations to define the underlying dynamic equations. In this paper, we describe the use of the free-software OpenModelica simulation environment for the object-oriented modeling of an activated sludge process under feedback control. The performance of the controlled system was analyzed both under normal conditions and in the presence of disturbances. The object-oriented approach described here represents a valuable tool in teaching and provides practical insight into the wastewater process control field.
Keywords: object-oriented programming, activated sludge process, OpenModelica, feedback control
3178 Integration Process and Analytic Interface of different Environmental Open Data Sets with Java/Oracle and R
Authors: Pavel H. Llamocca, Victoria Lopez
Abstract:
The main objective of our work is the comparative analysis of environmental data from Open Data bases belonging to different governments. This means that data from various different sources have to be integrated. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and in this way the number of applications based on Open Data is increasing. However, each government has its own procedures for publishing its data, and this causes a variety of data set formats, because there are no international standards specifying the formats of data sets from Open Data bases. Due to this variety of formats, we must build a data integration process that is able to put together all kinds of formats. Some software tools have been developed to support the integration process, e.g. Data Tamer and Data Wrangler. The problem with these tools is that they need a data scientist to take part in the integration process as a final step. In our case, we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve an automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its Open Data bases relating to environmental indicators in real time. In the same way, other governments have published Open Data sets relating to the environment (such as Andalucia or Bilbao). However, all of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to perform and visualize analyses over the real-time data. Once the integration task is done, all the data from any government has the same format and the analysis process can be initiated in a computationally better way. So the tool presented in this work has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open up our software tool, as a second approach we also developed an implementation with the R language as a mature open-source technology. R is a really powerful open-source programming language that allows us to process and analyze a huge amount of data with high performance. There are also some R libraries for building a graphic interface, such as Shiny. A performance comparison between both implementations was made and no significant differences were found. In addition, our work provides an official real-time integrated data set of environmental data in Spain to any developer, so that they can build their own applications.
Keywords: open data, R language, data integration, environmental data
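A toy illustration, not the authors' Java/Oracle or R code, of the harmonization step such a pipeline must perform: two feeds publish similar environmental measurements under different column names, units, and date formats, and the script maps them to one schema. File contents and column layouts are invented.

```python
# Toy harmonization of two differently formatted open-data feeds into one schema
# (invented column names/values; the actual tool uses per-government Hadoop/Java/R procedures).
import pandas as pd

# Feed A: ISO dates, temperature in degrees Celsius
madrid = pd.DataFrame({"fecha": ["2016-01-01", "2016-01-02"],
                       "temperatura_c": [8.1, 9.4],
                       "no2_ug_m3": [42.0, 55.0]})
# Feed B: day/month/year dates, temperature in tenths of a degree
bilbao = pd.DataFrame({"date": ["01/01/2016", "02/01/2016"],
                       "temp_dC": [95, 101],
                       "no2": [31.0, 29.0]})

common = pd.concat([
    pd.DataFrame({"date": pd.to_datetime(madrid["fecha"]),
                  "temperature_c": madrid["temperatura_c"],
                  "no2_ug_m3": madrid["no2_ug_m3"],
                  "city": "Madrid"}),
    pd.DataFrame({"date": pd.to_datetime(bilbao["date"], dayfirst=True),
                  "temperature_c": bilbao["temp_dC"] / 10.0,
                  "no2_ug_m3": bilbao["no2"],
                  "city": "Bilbao"}),
], ignore_index=True)

print(common)
```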
3177 The Dynamics of a 3D Vibrating and Rotating Disc Gyroscope
Authors: Getachew T. Sedebo, Stephan V. Joubert, Michael Y. Shatalov
Abstract:
The conventional configuration of the vibratory disc gyroscope is based on in-plane non-axisymmetric vibrations of the disc with a prescribed circumferential wave number. Due to Bryan's effect, the vibrating pattern of the disc becomes sensitive to the axial component of inertial rotation of the disc. Rotation of the vibrating pattern relative to the disc is proportional to the inertial angular rate and is measured by sensors. In the present paper, the authors investigate the possibility of making a 3D sensor on the basis of both in-plane and bending vibrations of the disc resonator. We derive equations of motion for the disc vibratory gyroscope, where both in-plane and bending vibrations are considered. Hamilton's variational principle is used in setting up the equations of motion and the corresponding boundary conditions. The theory of thin shells with linear elasticity principles is used in formulating the problem, and the disc is assumed to be isotropic and to obey Hooke's law. The governing equation for a specific mode is converted to an ODE to determine the eigenfunction. The resulting ODE has an exact solution as a linear combination of Bessel and Neumann functions. We demonstrate how to obtain an explicit solution and hence the eigenvalues and corresponding eigenfunctions for an annular disc with a fixed inner boundary and a free outer boundary. Finally, the characteristic equations are obtained and the corresponding eigenvalues are calculated. The eigenvalues are used for the calculation of the tuning conditions of the 3D disc vibratory gyroscope.
Keywords: Bryan’s effect, bending vibrations, disc gyroscope, eigenfunctions, eigenvalues, tuning conditions
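For context, the radial part of such a mode problem takes the textbook form below; writing the general solution as a combination of Bessel and Neumann functions and imposing a fixed inner edge and a free outer edge leads to the characteristic (frequency) equation mentioned in the abstract. The notation is generic, not the authors'.

```latex
% Generic radial mode equation and its solution (illustrative notation, not the paper's)
\[
  r^2 R''(r) + r R'(r) + \left(\lambda^2 r^2 - n^2\right) R(r) = 0,
  \qquad
  R(r) = A\, J_n(\lambda r) + B\, Y_n(\lambda r),
\]
\[
  \text{fixed inner edge at } r=a \text{ and free outer edge at } r=b
  \;\Longrightarrow\;
  \det\begin{pmatrix} J_n(\lambda a) & Y_n(\lambda a)\\[2pt]
                      \mathcal{B}[J_n](\lambda b) & \mathcal{B}[Y_n](\lambda b) \end{pmatrix} = 0,
\]
% where \mathcal{B}[\cdot] denotes the free-edge boundary operator; the roots \lambda
% give the eigenvalues used for the tuning conditions.
```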
3176 The Use of Geographic Information System and Spatial Statistic for Analyzing Leukemia in Kuwait for the Period of 2006-2012
Authors: Muhammad G. Almatar, Mohammad A. Alnasrallah
Abstract:
This research focuses on the study of three main issues: 1) the temporal analysis of leukemia over a period of six years (2006-2012); 2) spatial analysis, investigating this phenomenon in Kuwaiti society spatially within the residential areas of the six governorates; 3) the use of Geographic Information System technology to investigate the hypothesis of the research and its variables using linear regression, to show the pattern of the linear relationship. The study depends on utilizing the map to understand the distribution of blood cancer in Kuwait. Several geodatabases were created for the number of patients and for air pollution. Spatial interpolation models were used to generate layers of air pollution in the study area. These geodatabases were tested over the six-year period to answer the question: is there a statistically significant relationship between the two main variables of the study, blood cancer and air pollution? To the best of the researchers' knowledge, this study is the first of its kind; the distribution of this disease has not been studied geographically at the level of regions in Kuwait over six years and in the specific areas described above. This study investigates the concentration of this type of disease. The study found that there is no significant relationship between the two variables studied, which may be due to the nature of the disease, which is often hereditary. On the other hand, this study reached a number of suggestions and recommendations that may be useful to decision-makers and those interested in the study of leukemia in Kuwait, by focusing on the study of genetic diseases, which may be a cause of leukemia rather than air pollution.
Keywords: Kuwait, GIS, cancer, geography
3175 Highly Accurate Tennis Ball Throwing Machine with Intelligent Control
Authors: Ferenc Kovács, Gábor Hosszú
Abstract:
The paper presents an advanced control system for tennis ball throwing machines to improve their accuracy with respect to the ball impact points. A further advantage of the system is a much easier calibration process, involving the intelligent automatic adjustment of the stroking parameters according to the ball elasticity, self-calibration, the use of a safety margin for very flat strokes, and the possibility of placing the machine at any position on the half court. The system applies mathematical methods to determine the exact ball trajectories and special approximation processes to reach all points on the targeted half court.
Keywords: control system, robot programming, robot control, sports equipment, throwing machine
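As a flavour of the trajectory mathematics involved, the sketch below solves the drag-free projectile problem for the launch speed needed to reach a target point at a given elevation angle; the actual machine must also account for drag, spin, and ball elasticity, and all numbers here are assumptions.

```python
# Drag-free launch-speed calculation for a target impact point (illustrative only;
# the real controller also compensates for drag, spin, and ball elasticity).
import math

g = 9.81
launch_h = 1.0                 # m, height of the throwing wheels (assumption)
angle = math.radians(8.0)      # elevation angle (assumption)
target_x = 18.0                # m, horizontal distance to the intended impact point
target_y = 0.0                 # m, impact at court level

# From x = v*cos(a)*t and y = h + v*sin(a)*t - g*t^2/2, eliminate t:
# y = h + x*tan(a) - g*x^2 / (2*v^2*cos^2(a))  ->  solve for v.
denom = 2.0 * math.cos(angle) ** 2 * (launch_h + target_x * math.tan(angle) - target_y)
v = math.sqrt(g * target_x ** 2 / denom)
t = target_x / (v * math.cos(angle))
print(f"required launch speed: {v:.1f} m/s, flight time: {t:.2f} s")
```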
3174 Knowledge of Strategies to Teach Reading Components Among Teachers of Hard of Hearing Students
Authors: Khalid Alasim
Abstract:
This study investigated Saudi Arabian elementary school teachers' knowledge of strategies to teach reading components to hard-of-hearing students. The study focused on four of the five reading components identified by the National Reading Panel (NRP, 2000): phonemic awareness, phonics, vocabulary, and reading comprehension, and explored the relationship between teachers' demographic characteristics and their knowledge of the strategies as well. An explanatory sequential mixed methods design was used that included two phases. The quantitative phase examined the knowledge of these Arabic reading components among 89 elementary school teachers of hard-of-hearing students, and the qualitative phase consisted of interviews with 10 teachers. The results indicated that the teachers have a great deal of knowledge (above the mean score) of strategies to teach reading components. Specifically, teachers' knowledge of strategies to teach the vocabulary component was the highest. The results also showed no significant association between teachers' demographic characteristics and their knowledge of strategies to teach reading components. The qualitative analysis revealed two themes: 1) teachers' lack of basic knowledge of strategies to teach reading components, and 2) the absence of in-service courses and training programs in reading for teachers.
Keywords: knowledge, reading, components, hard-of-hearing, phonology, vocabulary
3173 Optimal Control of Volterra Integro-Differential Systems Based on Legendre Wavelets and Collocation Method
Authors: Khosrow Maleknejad, Asyieh Ebrahimzadeh
Abstract:
In this paper, the numerical solution of the optimal control problem (OCP) for systems governed by Volterra integro-differential (VID) equations is considered. The method is developed by means of Legendre wavelet approximation and the collocation method. The properties of Legendre wavelets, together with the Gaussian integration method, are utilized to reduce the problem to the solution of a nonlinear programming problem. Some numerical examples are given to confirm the accuracy and ease of implementation of the method.
Keywords: collocation method, Legendre wavelet, optimal control, Volterra integro-differential equation
3172 Correlation between Resistance to Non-Specific Inhibitor and Mammalian Pathogenicity of an Egg Adapted H9N2 Virus
Authors: Chung-Young Lee, Se-Hee Ahn, Jun-Gu Choi, Youn-Jeong Lee, Hyuk-Joon Kwon, Jae-Hong Kim
Abstract:
A/chicken/Korea/01310/2001 (H9N2) (01310) was passaged 20 times through embryonated chicken eggs (ECEs) (01310-E20), and it has been used for an inactivated oil emulsion vaccine in Korea. After the sequential passages, 01310-E20 showed higher pathogenicity in ECEs and acquired multiple mutations, including a potential N-glycosylation at position 133 (H3 numbering) in HA and an 18-aa deletion in the NA stalk. To evaluate the effect of these mutations on mammalian pathogenicity and resistance to non-specific inhibitors, we generated four PR8-derived recombinant viruses with different combinations of HA and NA from 01310-E2 and 01310-E20 (rH2N2, rH2N20, rH20N2, and rH20N20). According to our results, recombinant viruses containing 01310-E20 HA showed higher growth in MDCK cells and higher virulence in mice than those containing 01310-E2 HA, regardless of NA. The hemagglutination activity of rH20N20 was less inhibited by egg white and mouse lung extract than that of the other recombinant viruses. Thus, the increased pathogenicity of 01310-E20 may be related to both higher replication efficiency and resistance to non-specific inhibitors in mice.
Keywords: avian influenza virus, egg adaptation, H9N2, N-glycosylation, stalk deletion of neuraminidase
3171 Understanding Hydrodynamic in Lake Victoria Basin in a Catchment Scale: A Literature Review
Authors: Seema Paul, John Mango Magero, Prosun Bhattacharya, Zahra Kalantari, Steve W. Lyon
Abstract:
The purpose of this review paper is to develop an understanding of lake hydrodynamics and the potential climate impact at the Lake Victoria (LV) catchment scale. The paper briefly discusses the main problems of lake hydrodynamics and their solutions as they relate to quality assessment and climate effects. An empirical methodology of modeling and mapping has been considered for understanding lake hydrodynamics and for visualizing long-term observational daily, monthly, and yearly mean dataset results using geographic information system (GIS) and COMSOL techniques. Data were obtained for the whole lake and for five different meteorological stations, and several geoprocessing tools with spatial analysis were used to produce results. Linear regression analyses were developed to build climate scenarios and a linear trend in lake rainfall data over a long period. The potential evapotranspiration rate has been described by MODIS and the Thornthwaite method. The effect of rainfall on lake water level was observed by means of partial differential equations (PDEs), and water quality was represented by a few nutrient parameters. The study revealed that monthly and yearly rainfall varies with monthly and yearly maximum and minimum temperatures, that rainfall is high during cool years, and that high temperatures are associated with below-average and average rainfall patterns. Rising temperatures are likely to accelerate evapotranspiration rates, and more evapotranspiration is likely to lead to more rainfall; drought is more strongly correlated with temperature, and cloud cover is more strongly correlated with rainfall. There is a trend in lake rainfall, and long-term rainfall on the lake water surface has affected the lake level. Onshore and offshore nutrient concentrations have been characterized from initial literature data. The study recommends that further studies should consider full lake bathymetry development with flow analysis and water balance, hydro-meteorological processes, solute transport, wind hydrodynamics, pollution and eutrophication, as these are crucial for lake water quality, climate impact assessment, and water sustainability.
Keywords: climograph, climate scenarios, evapotranspiration, linear trend flow, rainfall event on LV, concentration
3170 Power Iteration Clustering Based on Deflation Technique on Large Scale Graphs
Authors: Taysir Soliman
Abstract:
One of the currently popular clustering techniques is Spectral Clustering (SC) because of its advantages over conventional approaches such as hierarchical clustering, k-means, and other techniques. However, one of the disadvantages of SC is that it is time consuming because it requires computing eigenvectors. In the past, a number of attempts have been proposed to overcome this disadvantage, such as the Power Iteration Clustering (PIC) technique, which is one version of SC. Some of PIC's advantages are: 1) its scalability and efficiency, 2) finding one pseudo-eigenvector instead of computing the eigenvectors, and 3) obtaining a linear combination of the eigenvectors in linear time. However, its worst disadvantage is an inter-class collision problem, because it uses only one pseudo-eigenvector, which is not enough. Previous researchers developed Deflation-based Power Iteration Clustering (DPIC) to overcome the inter-class collision problem of PIC while keeping the same efficiency as PIC. In this paper, we developed Parallel DPIC (PDPIC) to improve the time and memory complexity; it runs on the Apache Spark framework using sparse matrices. To test the performance of PDPIC, we compared it to the SC, ESCG, and ESCALG algorithms on four small graph benchmark datasets and nine large graph benchmark datasets, where PDPIC achieved higher accuracy and lower running time than the other compared algorithms.
Keywords: spectral clustering, power iteration clustering, deflation-based power iteration clustering, Apache Spark, large graph
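A compact NumPy sketch of the basic PIC idea, without deflation or Spark parallelism: iterate the row-normalized affinity matrix on a random vector until it stabilizes, then cluster the entries of that one-dimensional embedding with k-means. The affinity measure, stopping rule, and data are simplified assumptions.

```python
# Basic power iteration clustering sketch (no deflation, no Spark parallelism;
# affinity measure, tolerance, and data are illustrative assumptions).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

# Row-normalized affinity matrix W = D^-1 A
dist2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
A = np.exp(-dist2 / dist2.mean())
np.fill_diagonal(A, 0.0)
W = A / A.sum(axis=1, keepdims=True)

# Power iteration: v drifts toward a linear combination of the dominant eigenvectors
v = np.random.default_rng(0).random(len(X))
v /= np.abs(v).sum()
for _ in range(50):
    v_new = W @ v
    v_new /= np.abs(v_new).sum()
    if np.abs(v_new - v).max() < 1e-7:   # simple early-stopping rule
        v = v_new
        break
    v = v_new

# Cluster the entries of the single pseudo-eigenvector
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(v.reshape(-1, 1))
print(np.bincount(labels))
```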
3169 Monte Carlo Risk Analysis of a Carbon Abatement Technology
Authors: Hameed Rukayat Opeyemi, Pericles Pilidis, Pagone Emanuele
Abstract:
Climate change represents one of the single most challenging problems facing the world today. According to the National Oceanic and Atmospheric Administration, atmospheric temperature has risen almost 25% since 1958, Arctic sea ice has shrunk 40% since 1959, and global sea levels have risen more than 5.5 cm since 1990. Power plants are the major culprits of GHG emission to the atmosphere. Several technologies have been proposed to reduce the amount of GHG emitted to the atmosphere from power plants, one of which is the less researched advanced zero-emission power plant. The advanced zero-emission power plant makes use of a mixed conductive membrane (MCM) reactor, also known as an oxygen transfer membrane (OTM), for oxygen transfer. The MCM employs a membrane separation process. The membrane separation process was first introduced in 1899 when Walter Hermann Nernst investigated electric current between metals and solutions. He found that when a dense ceramic is heated, a current of oxygen molecules moves through it. In the bid to curb the amount of GHG emitted to the atmosphere, the membrane separation process was applied to the field of power engineering in the low-carbon cycle known as the advanced zero-emission power plant (AZEP) cycle. The AZEP cycle was originally invented by Norsk Hydro, Norway and ABB Alstom Power (now known as Demag Delaval Industrial Turbomachinery AB), Sweden. The AZEP drew a lot of attention because of its ability to capture ~100% CO2; it also boasts about 30-50% cost reduction compared to other carbon abatement technologies, the penalty in efficiency is not as large as for its counterparts, and it offers almost zero NOx emissions due to very low nitrogen concentrations in the working fluid. The advanced zero-emission power plant differs from a conventional gas turbine in the sense that its combustor is substituted with the mixed conductive membrane (MCM) reactor. The MCM reactor is made up of the combustor, the low-temperature heat exchanger (LTHX, referred to by some authors as the air preheater), the mixed conductive membrane responsible for oxygen transfer, the high-temperature heat exchanger and, in some layouts, the bleed gas heat exchanger. Air is taken in by the compressor and compressed to a temperature of about 723 K and a pressure of 2 MPa. The membrane area needed for oxygen transfer is reduced by increasing the temperature of 90% of the air using the LTHX; the temperature is also increased to facilitate oxygen transfer through the membrane. The air stream enters the LTHX through the transition duct leading to the inlet of the LTHX. The temperature of the air stream is then increased to about 1150 K depending on the design-point specification of the plant and the efficiency of the heat exchanging system. The amount of oxygen transported through the membrane is directly proportional to the temperature of the air going through the membrane. The AZEP cycle was developed using the Fortran software, and economic analysis was conducted using Excel and Matlab, followed by an optimization case study. This paper discusses a techno-economic analysis of four possible layouts of the AZEP cycle: the simple bleed gas heat exchange layout (100% CO2 capture), the bleed gas heat exchanger layout with flue gas turbine (100% CO2 capture), the pre-expansion reheating layout (sequential burning layout) - AZEP 85% (85% CO2 capture), and the pre-expansion reheating layout (sequential burning layout) with flue gas turbine - AZEP 85% (85% CO2 capture).
This paper then discusses a Monte Carlo risk analysis of these four possible layouts of the AZEP cycle.
Keywords: gas turbine, global warming, greenhouse gases, power plants
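To illustrate the Monte Carlo risk analysis referred to in the title, the sketch below samples uncertain capital cost, fuel price, and efficiency values and propagates them to a distribution of a levelized cost of electricity; all distributions and figures are invented placeholders, not the study's data.

```python
# Illustrative Monte Carlo risk analysis of a power-plant cost metric
# (all distributions and numbers are invented placeholders, not the study's data).
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

capex = rng.normal(1500, 150, n)          # $/kW installed (assumed)
fuel_price = rng.triangular(4, 6, 10, n)  # $/GJ (assumed)
efficiency = rng.normal(0.48, 0.02, n)    # plant efficiency (assumed)
capacity_factor = 0.85
crf = 0.09                                # capital recovery factor (assumed)

hours = 8760 * capacity_factor
fuel_cost = fuel_price * 3.6 / efficiency      # $/MWh of electricity
capital_cost = capex * 1000 * crf / hours      # annualized capital, $/MWh
lcoe = fuel_cost + capital_cost

p5, p50, p95 = np.percentile(lcoe, [5, 50, 95])
print(f"LCOE $/MWh  P5={p5:.1f}  P50={p50:.1f}  P95={p95:.1f}")
```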