Search results for: quadratic programming
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1079

509 Portfolio Selection with Constraints on Trading Frequency

Authors: Min Dai, Hong Liu, Shuaijie Qian

Abstract:

We study a portfolio selection problem for an investor who faces constraints on rebalancing frequency, which are common in pension fund investment. We formulate it as a multiple optimal stopping problem and utilize the dynamic programming principle. By numerically solving the corresponding Hamilton-Jacobi-Bellman (HJB) equation, we find a series of free boundaries characterizing the optimal strategy and show that the constraints significantly impact it. Even in the absence of transaction costs, there is a no-trading region that depends on the number of remaining trading chances. We also find that the equivalent wealth loss caused by the constraints is large. In conclusion, our model clarifies the impact of constraints on trading frequency on the optimal strategy.
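
As a toy illustration of the dynamic programming principle at work here (a discrete-time sketch with an invented reward sequence, not the paper's continuous-time HJB solver), the value of holding k remaining trading rights can be computed by backward induction:

```python
import numpy as np

# Toy sketch: backward induction for a discrete-time multiple-stopping problem
# with K trading rights over T periods. reward[t] is a hypothetical payoff
# from exercising (trading) at time t.

def multiple_stopping_value(reward, K):
    T = len(reward)
    # V[t][k] = value at time t with k exercise rights remaining
    V = np.zeros((T + 1, K + 1))
    for t in range(T - 1, -1, -1):
        for k in range(K + 1):
            cont = V[t + 1][k]                          # keep all rights
            if k > 0:
                exercise = reward[t] + V[t + 1][k - 1]  # spend one right
                V[t][k] = max(cont, exercise)
            else:
                V[t][k] = cont
    return V

rng = np.random.default_rng(0)
V = multiple_stopping_value(rng.normal(0.0, 1.0, size=50).clip(min=0.0), K=3)
print(V[0])  # value grows with the number of remaining trading chances
```

The printed values increase with k, mirroring the paper's observation that the optimal strategy, and hence the no-trading region, depends on the number of remaining trading chances.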

Keywords: portfolio selection, rebalancing frequency, optimal strategy, free boundary, optimal stopping

Procedia PDF Downloads 61
508 Solid Particles Transport and Deposition Prediction in a Turbulent Impinging Jet Using the Lattice Boltzmann Method and a Probabilistic Model on GPU

Authors: Ali Abdul Kadhim, Fue Lien

Abstract:

Solid particle distribution on an impingement surface has been simulated utilizing a graphical processing unit (GPU). An in-house computational fluid dynamics (CFD) code has been developed to investigate a 3D turbulent impinging jet using the lattice Boltzmann method (LBM) in conjunction with large eddy simulation (LES) and the multiple relaxation time (MRT) models. This paper proposes an improvement to the LBM-cellular automata (LBM-CA) probabilistic method. In the current model, the fluid flow utilizes the D3Q19 lattice, while the particle model employs the D3Q27 lattice. The particle numbers are defined at the same regular LBM nodes, and the transport of particles from one node to its neighboring nodes is determined in accordance with the particle bulk density and velocity, considering all the external forces. Previous models distribute particles at each time step without considering the local velocity and the number of particles at each node. The present model overcomes these deficiencies and can therefore better capture the dynamic interaction between particles and the surrounding turbulent flow field. Despite the increasing popularity of the LBM-MRT-CA model in simulating complex multiphase fluid flows, this approach is still expensive in terms of the memory size and computational time required to perform 3D simulations. To improve the throughput of each simulation, a single GeForce GTX TITAN X GPU is used in the present work. The CUDA parallel programming platform and the cuRAND library are utilized to form an efficient LBM-CA algorithm. The methodology was first validated against a benchmark test case involving particle deposition on a square cylinder confined in a duct. The flow was unsteady and laminar at Re=200 (Re is the Reynolds number), and simulations were conducted for different Stokes numbers. The present LBM solutions agree well with other results available in the open literature. The GPU code was then used to simulate particle transport and deposition in a turbulent impinging jet at Re=10,000. The simulations were conducted for L/D=2, 4, and 6, where L is the nozzle-to-surface distance and D is the jet diameter. The effect of changing the Stokes number on the particle deposition profile was studied at different L/D ratios. For comparative studies, another in-house serial CPU code was also developed, coupling LBM with the classical Lagrangian particle dispersion model. Agreement between the results obtained with the LBM-CA and LBM-Lagrangian models and the experimental data is generally good. The present GPU approach achieves a speedup ratio of about 350 against the serial code running on a single CPU.
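
The CA redistribution idea can be sketched in a few lines; the following is a schematic 2D (five-link) simplification with an assumed velocity-bias rule, not the authors' D3Q27 scheme or their CUDA kernel:

```python
import numpy as np

# Schematic 2D sketch: at each node, the integer particle count is
# redistributed to the node itself and its four neighbours, with probabilities
# biased by the local fluid velocity (ux, uy). Periodic boundaries for brevity.

def ca_particle_step(n, ux, uy, rng):
    """n: integer particle counts per node; ux, uy: velocity fields on the grid."""
    new_n = np.zeros_like(n)
    H, W = n.shape
    for i in range(H):
        for j in range(W):
            # unnormalised weights: rest, +x, -x, +y, -y (toy bias model)
            w = np.array([1.0,
                          max(0.0, 1.0 + ux[i, j]), max(0.0, 1.0 - ux[i, j]),
                          max(0.0, 1.0 + uy[i, j]), max(0.0, 1.0 - uy[i, j])])
            p = w / w.sum()
            # multinomial draw: how many of the n[i,j] particles take each link
            counts = rng.multinomial(n[i, j], p)
            targets = [(i, j), (i, (j + 1) % W), (i, (j - 1) % W),
                       ((i + 1) % H, j), ((i - 1) % H, j)]
            for c, (ti, tj) in zip(counts, targets):
                new_n[ti, tj] += c
    return new_n
```

On the GPU, one CUDA thread would perform this draw per node, with cuRAND supplying the per-thread random numbers.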

Keywords: CUDA, GPU parallel programming, LES, lattice Boltzmann method, MRT, multi-phase flow, probabilistic model

Procedia PDF Downloads 179
507 A Design System for Complex Profiles of Machine Members Using a Synthetic Curve

Authors: N. Sateesh, C. S. P. Rao, K. Satyanarayana, C. Rajashekar

Abstract:

This paper proposes the development of a CAD/CAM system for complex profiles of various machine members using a synthetic curve, i.e., the B-spline. Conventional methods of designing and manufacturing complex profiles are tedious and time-consuming, and even programming them on a computer numerical control (CNC) machine can be difficult because of the complexity of the profiles. The developed system provides a graphical and numerical B-spline representation of the profile for any given input. In this paper, the system is applied to represent a cam profile with a B-spline, and an attempt is made to improve the follower motion.
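
As a sketch of the underlying representation (using SciPy's BSpline class rather than the authors' own system, with invented control radii), a cubic B-spline cam profile can be defined and evaluated as follows:

```python
import numpy as np
from scipy.interpolate import BSpline
import matplotlib.pyplot as plt

# Hypothetical cam-profile sketch: a cubic B-spline (degree k=3) over the cam
# angle with assumed control radii, in the spirit of the system described above.
k = 3
ctrl = np.array([30, 30, 38, 45, 38, 30, 30], dtype=float)  # radii in mm (assumed)
# clamped knot vector so the curve starts/ends at the first/last control value
t = np.concatenate(([0.0] * (k + 1),
                    np.linspace(0, 2 * np.pi, len(ctrl) - k + 1)[1:-1],
                    [2 * np.pi] * (k + 1)))
spline = BSpline(t, ctrl, k)

theta = np.linspace(0, 2 * np.pi, 360)
r = spline(theta)                      # smooth displacement profile
plt.polar(theta, r)
plt.title("B-spline cam profile (illustrative)")
plt.show()
```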

Keywords: plate-cams, cam profile, b-spline, computer numerical control (CNC), computer aided design and computer aided manufacturing (CAD/CAM), R-D-R-D (rise-dwell-return-dwell)

Procedia PDF Downloads 577
506 Response Surface Methodology Approach to Defining Ultrafiltration of Steepwater from Corn Starch Industry

Authors: Zita I. Šereš, Ljubica P. Dokić, Dragana M. Šoronja Simović, Cecilia Hodur, Zsuzsanna Laszlo, Ivana Nikolić, Nikola Maravić

Abstract:

In this work, the concentration of steep-water from the corn starch industry is monitored using an ultrafiltration membrane. The aim was to examine the conditions of ultrafiltration of steep-water applying a 2.5 nm membrane. The parameters varied during ultrafiltration were the transmembrane pressure and flow rate, while the permeate flux and the dry matter content of the permeate and retentate were the dependent parameters constantly monitored during the process. The ultrafiltration experiments were conducted on samples of steep-water obtained from the starch wet milling plant Jabuka, Pancevo. Ultrafiltration was carried out on a single-channel membrane of 250 mm length, 6.8 mm inner diameter, and 10 mm outer diameter. The membrane is made of α-Al2O3 with a TiO2 layer and was obtained from GEA (Germany). The experiments were carried out at flow rates ranging from 100 to 200 l/h and transmembrane pressures of 1-3 bar. During the steep-water ultrafiltration experiments, the change of permeate flux, the dry matter content of the permeate and retentate, and the absorbance of the permeate and retentate were monitored. The experimental results showed that the flux reaches a maximum of about 40 l/(m²·h). For the responses obtained from the experiments, a second-degree polynomial model was established to evaluate and quantify the influence of the variables. The quadratic equation fits the experimental values, with a coefficient of determination for flux of 0.96. The dry matter content of the retentate increased by about 6%, while the dry matter content of the permeate was reduced by about 35-40%; that is, during steep-water ultrafiltration the permeate retains about 40% less dry matter than the feed.
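
The second-degree polynomial model referred to above has the general form J = b0 + b1·p + b2·q + b3·p² + b4·q² + b5·p·q; a minimal least-squares sketch with placeholder data (not the paper's measurements) looks like this:

```python
import numpy as np

# Illustrative second-degree response-surface fit for permeate flux J as a
# function of transmembrane pressure p (bar) and flow rate q (l/h).
# The data below are invented placeholders, not the paper's measurements.
p = np.array([1.0, 1.0, 2.0, 2.0, 3.0, 3.0, 2.0, 1.5, 2.5])
q = np.array([100., 200., 100., 200., 100., 200., 150., 150., 150.])
J = np.array([18., 24., 27., 34., 31., 40., 30., 26., 33.])

# design matrix for J = b0 + b1*p + b2*q + b3*p^2 + b4*q^2 + b5*p*q
X = np.column_stack([np.ones_like(p), p, q, p**2, q**2, p * q])
b, *_ = np.linalg.lstsq(X, J, rcond=None)

J_hat = X @ b
ss_res = np.sum((J - J_hat) ** 2)
ss_tot = np.sum((J - J.mean()) ** 2)
print("coefficients:", b.round(4))
print("R^2 =", 1 - ss_res / ss_tot)   # the paper reports R^2 = 0.96 for flux
```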

Keywords: ultrafiltration, steep-water, starch industry, ceramic membrane

Procedia PDF Downloads 259
505 A Fuzzy Linear Regression Model Based on Dissemblance Index

Authors: Shih-Pin Chen, Shih-Syuan You

Abstract:

Fuzzy regression models are useful for investigating the relationship between explanatory variables and responses in fuzzy environments. To overcome the deficiencies of previous models and increase the explanatory power of fuzzy data, the graded mean integration (GMI) representation is applied to determine representative crisp regression coefficients. A fuzzy regression model is then constructed based on the modified dissemblance index (MDI), which can precisely measure the actual total error. Comparisons with previous studies on commonly used test examples, based on the proposed MDI and a distance criterion, show that the proposed fuzzy linear regression model has higher explanatory power and forecasting accuracy.
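
For a triangular fuzzy number Ã = (a, b, c), the GMI representation is commonly given by the closed form P(Ã) = (a + 4b + c)/6; a minimal sketch of the defuzzification step (with an illustrative coefficient; the MDI itself is defined in the paper and not reproduced here):

```python
def gmi_triangular(a, b, c):
    """Graded mean integration representation of a triangular fuzzy number
    (a, b, c) with a <= b <= c: the commonly used closed form (a + 4b + c) / 6."""
    return (a + 4 * b + c) / 6

# e.g. a fuzzy regression coefficient (1.2, 1.5, 2.1) maps to a crisp value
print(gmi_triangular(1.2, 1.5, 2.1))  # 1.55
```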

Keywords: dissemblance index, fuzzy linear regression, graded mean integration, mathematical programming

Procedia PDF Downloads 412
504 Parameter Optimization and Thermal Simulation in Laser Joining of Coach Peel Panels of Dissimilar Materials

Authors: Masoud Mohammadpour, Blair Carlson, Radovan Kovacevic

Abstract:

The quality of laser welded-brazed (LWB) joints is strongly dependent on the main process parameters; therefore, the effects of laser power (3.2–4 kW), welding speed (60–80 mm/s), and wire feed rate (70–90 mm/s) on mechanical strength and surface roughness were investigated in this study. A comprehensive optimization by means of response surface methodology (RSM) and a desirability function was used for multi-criteria optimization. The experiments were planned based on a Box–Behnken design, implementing linear and quadratic polynomial equations for predicting the desired output properties. Finally, validation experiments were conducted at the optimized process condition and exhibited good agreement between the predicted and experimental results. AlSi3Mn1 was selected as the filler material for joining aluminum alloy 6022 and hot-dip galvanized steel in a coach peel configuration. The high scanning speed could keep the intermetallic compound (IMC) layer as thin as 5 µm. Thermal simulations of the joining process were conducted with the finite element method (FEM), and the results were validated against experimental data. The Fe/Al interfacial thermal history evidenced that the duration of the critical temperature range (700–900 °C) in this high-scanning-speed process was less than 1 s. This short interaction time leads to the formation of a reaction-controlled IMC layer instead of a diffusion-controlled one.
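
The desirability approach mentioned above maps each response onto [0, 1] and combines the values geometrically; a Derringer-Suich-style sketch with illustrative targets and limits (not the paper's actual responses):

```python
import numpy as np

# Sketch of Derringer-Suich-style desirability for multi-response optimization,
# as commonly used with RSM; targets/limits below are illustrative placeholders.

def d_larger_is_better(y, low, high, w=1.0):
    """Desirability for a response to maximize (e.g., joint strength)."""
    return np.clip((y - low) / (high - low), 0.0, 1.0) ** w

def d_smaller_is_better(y, low, high, w=1.0):
    """Desirability for a response to minimize (e.g., surface roughness)."""
    return np.clip((high - y) / (high - low), 0.0, 1.0) ** w

strength = d_larger_is_better(y=212.0, low=150.0, high=250.0)
roughness = d_smaller_is_better(y=3.1, low=2.0, high=6.0)
overall = (strength * roughness) ** 0.5   # geometric mean of m desirabilities
print(round(overall, 3))
```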

Keywords: laser welding-brazing, finite element, response surface methodology (RSM), multi-response optimization, cross-beam laser

Procedia PDF Downloads 333
503 Measuring Energy Efficiency Performance of MENA Countries

Authors: Azam Mohammadbagheri, Bahram Fathi

Abstract:

DEA has become a very popular method of performance measurement, but it still suffers from some shortcomings. One of these is the issue of multiple optimal weight solutions for efficient DMUs. The cross efficiency evaluation, an extension of DEA, was proposed to avoid this problem. Lam (2010) also proposed a mixed-integer linear programming (MILP) formulation based on linear discriminant analysis and the super-efficiency method to avoid multiple optimal weight solutions. In this study, we modify the MILP model to determine more suitable weight sets and also evaluate the energy efficiency of MENA countries as an application of the proposed model.
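
For reference, the basic input-oriented CCR model underlying cross efficiency solves one small LP per DMU in the multiplier form; a minimal sketch with invented data (the paper's modified MILP adds discriminant-analysis and integrality constraints on top of this):

```python
import numpy as np
from scipy.optimize import linprog

# Minimal input-oriented CCR efficiency sketch (multiplier form); X[j], Y[j]
# are the inputs/outputs of DMU j. Illustrative data, not the MENA dataset.
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])   # inputs (e.g., energy use)
Y = np.array([[1.0], [1.0], [1.5]])                  # outputs (e.g., GDP)

def ccr_efficiency(o):
    n_in, n_out = X.shape[1], Y.shape[1]
    # variables: [u (output weights), v (input weights)]; maximize u.Y[o]
    c = np.concatenate([-Y[o], np.zeros(n_in)])       # linprog minimizes
    A_ub = np.hstack([Y, -X])                         # u.Y_j - v.X_j <= 0
    b_ub = np.zeros(len(X))
    A_eq = np.concatenate([np.zeros(n_out), X[o]]).reshape(1, -1)  # v.X_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n_out + n_in))
    return -res.fun

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```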

Keywords: data envelopment analysis, discriminant analysis, cross efficiency, MILP model

Procedia PDF Downloads 666
502 From Two-Way to Multi-Way: A Comparative Study for Map-Reduce Join Algorithms

Authors: Marwa Hussien Mohamed, Mohamed Helmy Khafagy

Abstract:

MapReduce is a programming model widely used to extract valuable information from enormous volumes of data, and it is designed to support heterogeneous datasets. Apache Hadoop MapReduce is used extensively to uncover hidden patterns in tasks such as data mining and SQL-style analysis. The most important operation for data analysis is the join, but the MapReduce framework does not directly support a join algorithm. This paper explains and compares two-way and multi-way MapReduce join algorithms; we also implement the MR join algorithms and report the performance of each of their phases. Our experimental results show that, among the two-way algorithms, the map-side join and map-merge join take the longest time because of the preprocessing step of sorting the data, while among the multi-way algorithms, the reduce-side cascade join takes the longest.
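
One of the standard two-way algorithms compared in such studies, the reduce-side (repartition) join, can be sketched as a single-process toy: mappers tag each record with its source relation, the shuffle groups records by join key, and reducers emit the cross product of both sides per key.

```python
from collections import defaultdict
from itertools import product

# Toy single-process illustration of a reduce-side (repartition) join.
# Tables are lists of (key, value) rows; names are illustrative.
orders = [("c1", "order-100"), ("c2", "order-101"), ("c1", "order-102")]
customers = [("c1", "Alice"), ("c2", "Bob")]

def mapper(tagged_rows):
    # tag each record with its source relation so the reducer can separate them
    for tag, rows in tagged_rows.items():
        for key, value in rows:
            yield key, (tag, value)

def reducer(key, tagged_values):
    left = [v for t, v in tagged_values if t == "L"]
    right = [v for t, v in tagged_values if t == "R"]
    for l, r in product(left, right):      # emit the join of both sides
        yield key, (l, r)

# shuffle phase: group mapper output by key
groups = defaultdict(list)
for key, tv in mapper({"L": customers, "R": orders}):
    groups[key].append(tv)

for key in groups:
    for row in reducer(key, groups[key]):
        print(row)
```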

Keywords: Hadoop, MapReduce, multi-way join, two-way join, Ubuntu

Procedia PDF Downloads 460
501 Optimum Design of Dual-Purpose Outriggers in Tall Buildings

Authors: Jiwon Park, Jihae Hur, Kukjae Kim, Hansoo Kim

Abstract:

In this study, outriggers, which are horizontal structures connecting a building core to distant columns to increase the lateral stiffness of a tall building, are also used to reduce differential axial shortening. The outriggers therefore serve the dual purposes of reducing the lateral displacement and reducing the differential axial shortening. Since the location of an outrigger greatly affects its effectiveness in terms of both the lateral displacement at the top of the building and the maximum differential axial shortening, the optimum locations of the dual-purpose outriggers can be determined by an optimization method. Because the floors where the outriggers are installed are given as integer numbers, conventional gradient-based optimization methods cannot be used directly. In this study, a piecewise quadratic interpolation method is used to resolve the integrality requirement posed by the optimum locations of the dual-purpose outriggers. The optimal solutions for the dual-purpose outriggers are sought by linear scalarization, a popular method for multi-objective optimization problems. It was found that increasing the number of outriggers reduced both the maximum lateral displacement and the maximum differential axial shortening. It was also noted that the optimum locations for reducing the lateral displacement and for reducing the differential axial shortening were different. Acknowledgment: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science and ICT (NRF-2017R1A2B4010043) and financially supported by the Korea Ministry of Land, Infrastructure and Transport (MOLIT) through the U-City Master and Doctor Course Grant Program.
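
The two ingredients named above, linear scalarization and piecewise quadratic interpolation over integer floor locations, can be sketched as follows (the objective samples and weights are invented placeholders, not structural results):

```python
import numpy as np

# Sketch under assumed toy objectives: f1 = top lateral displacement,
# f2 = max differential shortening, both sampled only at integer outrigger
# floors and interpolated piecewise-quadratically near the discrete minimum.
floors = np.arange(10, 61, 10)                       # candidate outrigger floors
f1 = np.array([0.42, 0.35, 0.31, 0.30, 0.33, 0.38])  # invented samples (m)
f2 = np.array([0.020, 0.016, 0.013, 0.012, 0.014, 0.017])  # invented (m)

def scalarized(w1, w2):
    # linear scalarization of the two objectives
    return w1 * f1 + w2 * f2

def quadratic_min(x, y):
    # fit a quadratic through the three samples around the discrete minimum,
    # minimize it continuously, then round back to an integer floor
    i = int(np.argmin(y))
    i = min(max(i, 1), len(x) - 2)
    a, b, _ = np.polyfit(x[i - 1:i + 2], y[i - 1:i + 2], 2)
    return int(round(-b / (2 * a)))

g = scalarized(w1=1.0, w2=20.0)                      # weights are a modeling choice
print("optimum outrigger floor ~", quadratic_min(floors.astype(float), g))
```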

Keywords: concrete structure, optimization, outrigger, tall building

Procedia PDF Downloads 152
500 Airport Investment Risk Assessment under Uncertainty

Authors: Elena M. Capitanul, Carlos A. Nunes Cosenza, Walid El Moudani, Felix Mora Camino

Abstract:

The construction of a new airport or the extension of an existing one requires massive investments, and public-private partnerships have often been considered to make such projects feasible. One characteristic of these projects is uncertainty with respect to financial and environmental impacts over the medium to long term. Another is their multistage nature. While many airport development projects have been a success, some others have turned into a nightmare for their promoters. This communication puts forward a new approach to airport investment risk assessment. The approach explicitly takes into account the degree of uncertainty in activity-level predictions and proposes milestones for the different stages of the project to minimize risk. Uncertainty is represented through fuzzy dual theory, and risk management is performed using dynamic programming. An illustration of the proposed approach is provided.

Keywords: airports, fuzzy logic, risk, uncertainty

Procedia PDF Downloads 376
499 Leveraging Power BI for Advanced Geotechnical Data Analysis and Visualization in Mining Projects

Authors: Elaheh Talebi, Fariba Yavari, Lucy Philip, Lesley Town

Abstract:

The mining industry generates vast amounts of data, necessitating robust data management systems and advanced analytics tools to improve decision-making in the development of mining production and the maintenance of safety. This paper highlights the advantages of Power BI, a powerful business intelligence tool, over traditional Excel-based approaches for effectively managing and harnessing mining data. Power BI enables professionals to connect and integrate multiple data sources, ensuring real-time access to up-to-date information. Its interactive visualizations and dashboards offer an intuitive interface for exploring and analyzing geotechnical data. Advanced analytics is a collection of data analysis techniques used to improve decision-making; leveraging some of the most complex techniques in data science, it is used for everything from detecting data errors and ensuring data accuracy to directing the development of future project phases. However, while Power BI is a robust tool, it has limitations for some visualizations required by geotechnical engineers. This paper therefore studies the use of Python or R programming within the Power BI dashboard to enable advanced analytics, additional functionality, and customized visualizations. The dashboard provides comprehensive tools for analyzing and visualizing key geotechnical data, including spatial representation on maps, field and lab test results, and subsurface rock and soil characteristics. Advanced visualizations such as borehole logs and stereonets were implemented using Python programming within the Power BI dashboard, enhancing the understanding and communication of geotechnical information. Moreover, the dashboard's flexibility allows for the incorporation of additional data and visualizations based on the project scope and available data, such as pit design, rockfall analyses, rock mass characterization, and drone data. This further enhances the dashboard's usefulness in later project phases, including operation, development, closure, and rehabilitation, and helps minimize the need for multiple software packages within a project. This geotechnical dashboard in Power BI serves as a user-friendly solution for analyzing, visualizing, and communicating both new and historical geotechnical data, aiding informed decision-making and efficient project management throughout the various project stages. Its ability to generate dynamic reports and share them with clients in a collaborative manner further enhances decision-making processes and facilitates effective communication within geotechnical projects in the mining industry.
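
Inside a Power BI Python visual, the fields placed on the visual arrive as a pandas DataFrame named dataset; a minimal sketch of a custom downhole plot (the column names are assumed for illustration):

```python
# Script for a Power BI Python visual: Power BI passes the selected fields in
# as a pandas DataFrame named `dataset`. Column names below (depth, rqd,
# borehole) are assumed for illustration.
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(4, 8))
for name, hole in dataset.groupby("borehole"):
    ax.plot(hole["rqd"], hole["depth"], marker="o", label=name)

ax.invert_yaxis()                      # depth increases downwards, as in a log
ax.set_xlabel("RQD (%)")
ax.set_ylabel("Depth (m)")
ax.legend(title="Borehole")
plt.show()                             # Power BI renders the current figure
```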

Keywords: geotechnical data analysis, power BI, visualization, decision-making, mining industry

Procedia PDF Downloads 61
498 Numerical Calculation and Analysis of Fine Echo Characteristics of Underwater Hemispherical Cylindrical Shell

Authors: Hongjian Jia

Abstract:

A finite-length cylindrical shell with a spherical cap is a typical engineering approximation of actual underwater targets. Research on the omnidirectional acoustic scattering characteristics of this target model can provide a favorable basis for the detection and identification of actual underwater targets. The elastic resonance characteristics of the target result from the combined effect of the target length, shell-thickness ratio, and material. Under different materials and geometric dimensions, the coincidence resonance characteristics of the target differ markedly. Addressing this problem, this paper obtains the omnidirectional acoustic scattering field of an underwater hemispherical cylindrical shell by numerical calculation and studies, in turn, the influence of the target's geometric parameters (length, shell-thickness ratio) and material parameters on its coincidence resonance characteristics. The study found that the formant interval is not a stable value and changes with the incident angle. The formant interval is only weakly affected by the target length and shell-thickness ratio but is significantly affected by the material properties, which makes it an effective feature for classifying and identifying targets of different materials. A quadratic polynomial is used to fit the relationship between the formant interval and the incident angle. The results show that the three fitting coefficients of the stainless steel and aluminum targets are significantly different and can thus be used as effective feature parameters to characterize the target material.
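
The fitting step described above reduces to a quadratic polynomial fit; a minimal sketch with invented samples (the fitted coefficients would then act as the material-discriminating features):

```python
import numpy as np

# Sketch of the fitting step: a quadratic polynomial fitted to
# (incident angle, formant interval) samples; the numbers are invented.
angle = np.linspace(0.0, 90.0, 10)                  # incident angle (degrees)
interval = 3.2 + 0.015 * angle - 1.1e-4 * angle**2  # placeholder measurements

coeffs = np.polyfit(angle, interval, 2)             # [a2, a1, a0]
print("fitted coefficients:", coeffs)

# e.g. classify by comparing coeffs against reference coefficient sets
# obtained for stainless steel and aluminium targets.
```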

Keywords: hemispherical cylindrical shell, fine echo characteristics, geometric and material parameters, formant interval

Procedia PDF Downloads 74
497 Electrical Load Estimation Using Estimated Fuzzy Linear Parameters

Authors: Bader Alkandari, Jamal Y. Madouh, Ahmad M. Alkandari, Anwar A. Alnaqi

Abstract:

A new formulation of the fuzzy linear estimation problem is presented as a linear programming problem. The objective is to minimize the spread of the data points, taking into consideration the type of the membership function of the fuzzy parameters, so as to satisfy the constraints at each measurement point and to ensure that the original membership is included in the estimated membership. Different models are developed for a triangular fuzzy membership. The proposed models are applied to several examples from the area of fuzzy linear regression and, finally, to examples of estimating the electrical load on a busbar. The proposed technique was found to be well suited to electrical load estimation, since the load is characterized by uncertainty and vagueness.
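
A Tanaka-style possibilistic version of this LP, which minimizes the total spread of symmetric triangular coefficients subject to inclusion constraints, gives the flavor of the formulation (placeholder load data; the paper's membership handling differs in its details):

```python
import numpy as np
from scipy.optimize import linprog

# Tanaka-style sketch: fit y ~ A0 + A1*x with symmetric triangular
# coefficients Aj = (a_j, c_j) (center, spread). Minimize total spread
# subject to each observation lying inside the fuzzy band. Data are invented.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])       # e.g. hour-of-day index
y = np.array([12.0, 15.0, 19.0, 20.0, 26.0])  # e.g. busbar load (MW)
X = np.column_stack([np.ones_like(x), x])     # |X| = X here since x >= 0

# variables z = [a0, a1, c0, c1]; minimize sum of spreads c.X over all points
c_obj = np.concatenate([np.zeros(2), X.sum(axis=0)])
# inclusion:  X a - X c <= y   and  -X a - X c <= -y
A_ub = np.vstack([np.hstack([X, -X]), np.hstack([-X, -X])])
b_ub = np.concatenate([y, -y])
bounds = [(None, None), (None, None), (0, None), (0, None)]

res = linprog(c_obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
a0, a1, c0, c1 = res.x
print(f"center: {a0:.2f} + {a1:.2f} x, spreads: ({c0:.2f}, {c1:.2f})")
```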

Keywords: fuzzy regression, load estimation, fuzzy linear parameters, electrical load estimation

Procedia PDF Downloads 514
496 Analyzing the Practicality of Drawing Inferences in Automation of Commonsense Reasoning

Authors: Chandan Hegde, K. Ashwini

Abstract:

Commonsense reasoning is the simulation of the human ability to make decisions in the situations we encounter every day. Several decades have passed since the introduction of this subfield of artificial intelligence, yet it has made only limited progress. Modern computing aids have also remained ineffective in this regard, due to the absence of a strong methodology for developing commonsense reasoning. Among the several reasons accountable for this lack of progress, drawing inferences from a commonsense knowledge base stands out. This review paper provides a detailed analysis of the representation of reasoning uncertainties and of the feasible prospects of programming aids for drawing inferences. The difficulties in deducing and systematizing commonsense reasoning, and the substantial progress made in reasoning that influences this study, are also discussed. Additionally, the paper discusses the possible impact of an effective inference technique on commonsense reasoning.

Keywords: artificial intelligence, commonsense reasoning, knowledge base, uncertainty in reasoning

Procedia PDF Downloads 160
495 Some Pertinent Issues and Considerations on CBSE

Authors: Anil Kumar Tripathi, Ratneshwer Gupta

Abstract:

All software engineering research and industry best practices aim at providing software products with a high degree of quality and functionality, at low cost and in less time. These requirements are addressed by Component Based Software Engineering (CBSE) as well. CBSE, which deals with software construction by assembling components, is a revolutionary extension of software engineering. CBSE must define and describe processes that assure the timely completion of high-quality software systems composed of a variety of pre-built software components. Though these features provide distinct and visible benefits in software design and programming, they also raise some challenging problems. The aim of this work is to summarize the pertinent issues and considerations in CBSE, building an understanding, in the form of concepts and observations, that may lead to new ways of dealing with the problems and challenges in CBSE.

Keywords: software component, component based software engineering, software process, testing, maintenance

Procedia PDF Downloads 377
494 Mathematical Modeling and Algorithms for the Capacitated Facility Location and Allocation Problem with Emission Restriction

Authors: Sagar Hedaoo, Fazle Baki, Ahmed Azab

Abstract:

In supply chain management, network design for scalable manufacturing facilities is an emerging field of research. Facility location-allocation assigns facilities to customers to optimize the overall cost of the supply chain. To further optimize costs, the capacities of these facilities can be changed in accordance with customer demands. A mathematical model is formulated to fully express the problem at hand and to solve small- to mid-range instances. A dedicated constraint has been developed to restrict emissions in line with the Kyoto Protocol. The problem is NP-hard; hence, a simulated annealing metaheuristic has been developed to solve larger instances. A case study on USA-Canada cross-border traffic is used.
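
A generic simulated-annealing skeleton of the kind used for the larger instances looks as follows; the toy assignment problem and cooling parameters are placeholders, not the paper's model:

```python
import math
import random

# Simulated-annealing sketch on a toy assignment: each of 4 customers is
# assigned to one of 2 facilities; cost_matrix entries are placeholders.
cost_matrix = [[4.0, 9.0], [7.0, 3.0], [5.0, 6.0], [8.0, 2.0]]

def cost(assign):
    return sum(cost_matrix[i][f] for i, f in enumerate(assign))

def neighbour(assign):
    a = assign[:]                 # move one random customer to the other facility
    i = random.randrange(len(a))
    a[i] = 1 - a[i]
    return a

current = [0, 0, 0, 0]
best, T = current[:], 10.0
while T > 1e-3:
    cand = neighbour(current)
    delta = cost(cand) - cost(current)
    # accept improvements always, uphill moves with Boltzmann probability
    if delta < 0 or random.random() < math.exp(-delta / T):
        current = cand
    if cost(current) < cost(best):
        best = current[:]
    T *= 0.95                     # geometric cooling schedule
print(best, cost(best))
```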

Keywords: emission, mixed integer linear programming, metaheuristic, simulated annealing

Procedia PDF Downloads 286
493 Intelligent Rescheduling Trains for Air Pollution Management

Authors: Kainat Affrin, P. Reshma, G. Narendra Kumar

Abstract:

Timetable optimization is the need of the day for rescheduling and routing trains in real time. Trains are scheduled in parallel with road transport vehicles to the same destination. Because the number of trains is restricted by the single track, customers usually opt for road transport. Air pollution increases as the density of road vehicles increases, so using an alternative mode of transport such as rail helps reduce air pollution. This paper mainly aims at attracting passengers to rail transport by proper rescheduling of trains using a hybrid of the stop-skip algorithm and an iterative convex programming algorithm. Trains are rescheduled bi-directionally on a single track with dynamic dual time and varying stops. Introducing more trains attracts customers to rail transport, thereby decreasing pollution. The results are simulated using Network Simulator (NS-2).

Keywords: air pollution, AODV, re-scheduling, WSNs

Procedia PDF Downloads 334
492 Object-Oriented Program Comprehension by Identification of Software Components and Their Connexions

Authors: Abdelhak-Djamel Seriai, Selim Kebir, Allaoua Chaoui

Abstract:

During the last decades, object-oriented programming has been massively used to build large-scale systems. However, the evolution and maintenance of such systems become laborious tasks because object-oriented programming fails to offer a precise view of the functional building blocks of the system, a lack caused by the fine granularity of classes and objects. In this paper, we use a post-object-oriented technology, namely software components, to propose an approach based on the identification of the functional building blocks of an object-oriented system by analyzing its source code. These functional blocks are specified as software components, and the result is a multi-layer component-based software architecture.

Keywords: software comprehension, software component, object oriented, software architecture, reverse engineering

Procedia PDF Downloads 388
491 Working Effectively with Muslim Communities in the West

Authors: Lisa Tribuzio

Abstract:

This paper explores the complexity of working with Muslim communities in Australia. It draws upon the notions of belonging, social inclusion, and effective community programming to engage Muslim communities in Western environments, given the current global political climate. Factors taken into consideration for effective engagement include: family engagement; key practices such as Ramadan, fasting, prayer, and food requirements; gender relations; core values around faith and spirituality; attitudes towards self-disclosure in a counseling setting; and the notion of Us and Them in the media and systems and its effect on minority communities. The paper explores recent research in the field from Australian researchers, as well as recommendations from the United Nations on working with Muslim communities. It also explores current practice models applied in Australia for engaging effectively with diverse communities and addressing racism and discrimination in innovative ways.

Keywords: Muslim, cultural diversity, social inclusion, racism

Procedia PDF Downloads 394
490 The Effect of Non-Normality on CB-SEM and PLS-SEM Path Estimates

Authors: Z. Jannoo, B. W. Yap, N. Auchoybur, M. A. Lazim

Abstract:

The two common approaches to Structural Equation Modeling (SEM) are Covariance-Based SEM (CB-SEM) and Partial Least Squares SEM (PLS-SEM). There is much debate on the performance of CB-SEM and PLS-SEM for small sample sizes and when distributions are non-normal. This study evaluates the performance of CB-SEM and PLS-SEM under normality and non-normality conditions via simulation. Monte Carlo simulation in the R programming language was employed to generate data based on a theoretical model with one endogenous and four exogenous variables, each latent variable having three indicators. For normal distributions, CB-SEM estimates were found to be inaccurate for small sample sizes, while PLS-SEM could still produce the path estimates. Meanwhile, for larger sample sizes, CB-SEM estimates have lower variability than PLS-SEM. Under non-normality, CB-SEM path estimates were inaccurate for small sample sizes; however, CB-SEM estimates are more accurate than those of PLS-SEM for sample sizes of 50 and above. The PLS-SEM estimates are not accurate unless the sample size is very large.
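
The data-generating step of such a simulation (done in R in the paper; the sketch below is an assumed Python analogue) draws latent scores, applies the path coefficients, and produces three indicators per latent, with skewed errors for the non-normal condition:

```python
import numpy as np

# Sketch of the data-generating step: 4 exogenous latents drive 1 endogenous
# latent, each latent measured by 3 indicators; path coefficients and loadings
# below are assumptions, and chi-square errors give the non-normal condition.
rng = np.random.default_rng(42)
n, paths = 50, np.array([0.4, 0.3, 0.2, 0.1])   # sample size, path coefficients

exo = rng.normal(size=(n, 4))                      # exogenous latent scores
eta = exo @ paths + rng.normal(scale=0.5, size=n)  # endogenous latent

def indicators(latent, loadings=(0.8, 0.7, 0.6), nonnormal=False):
    err = (rng.chisquare(3, size=(latent.shape[0], 3)) - 3   # skewed errors
           if nonnormal else rng.normal(size=(latent.shape[0], 3)))
    return latent[:, None] * np.array(loadings) + 0.5 * err

data = np.hstack([indicators(exo[:, j], nonnormal=True) for j in range(4)]
                 + [indicators(eta, nonnormal=True)])
print(data.shape)   # (50, 15): 15 indicators, ready for CB-SEM / PLS-SEM fitting
```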

Keywords: CB-SEM, Monte Carlo simulation, normality conditions, non-normality, PLS-SEM

Procedia PDF Downloads 378
489 Investigating Students' Understanding about Mathematical Concept through Concept Map

Authors: Rizky Oktaviana

Abstract:

The main purpose of studying is to improve students' understanding. Teachers usually use written tests to measure students' understanding of learning material, especially mathematical learning material. This common method has a weakness: in mathematics, written tests only show the procedural steps used to solve problems, so teachers are unable to see whether students actually understand mathematical concepts and the relations between them. One of the best tools for observing students' understanding of mathematical concepts is the concept map. The goal of this research is to describe junior high school students' understanding of mathematical concepts through concept maps, according to differences in mathematical ability. There were three steps in this research. The first step was choosing the research subjects by giving a mathematical ability test to students; the subjects were three students with different mathematical abilities: high, intermediate, and low. The second step was giving concept mapping training to the chosen subjects. The last step was giving the subjects a concept mapping task about functions, in which nodes representing concepts of function were provided and had to be used. Based on the data analysis, the results show that the subject with high mathematical ability has a formal understanding, since this subject could see the connections between concepts of function and arranged the concepts into a concept map with a valid hierarchy. The subject with intermediate mathematical ability has a relational understanding, being able to arrange all the given concepts and label the links between them appropriately, though without yet representing the connections specifically. The subject with low mathematical ability has a poor understanding of functions, as seen from a concept map that used only a few of the given concepts, because this subject could not see the connections between them. All subjects showed an instrumental understanding of the relations between the linear function concept, the quadratic function concept, and domain, codomain, and range.

Keywords: concept map, concept mapping, mathematical concepts, understanding

Procedia PDF Downloads 250
488 Cost-Effective Hybrid Cloud Framework for HEIs

Authors: Shah Muhammad Butt, Ahmed Masaud Ansari

Abstract:

In the present financial crisis, Higher Educational Institutes (HEIs) face considerable budget cuts that make it difficult to meet ever-growing IT-based research and learning needs, so institutions are rapidly planning and promoting cloud-based approaches for their academic and research needs. A cost-effective hybrid cloud framework for HEIs can provide educational services for campus or intercampus communication. The hybrid cloud framework comprises private and public cloud approaches. This paper proposes a framework based on open-source cloud software (OpenNebula for virtualization, Eucalyptus for infrastructure, and Aneka as a programming development environment) combined with CSPs' services, which are delivered to the end user via the Internet from public clouds.

Keywords: educational services, hybrid campus cloud, open source, electrical and systems sciences

Procedia PDF Downloads 428
487 Wearable Music: Generation of Costumes from Music and Generative Art and Wearing Them by 3-Way Projectors

Authors: Noriki Amano

Abstract:

The final goal of this study is to create another way for people to enjoy music, through the performance of 'Wearable Music'. Concretely speaking, we generate colorful costumes from music in real time and 'dress' a person in them by projection. For this purpose, we propose three methods in this study: first, a method of giving color to music in a three-dimensional way; second, a method of generating images of costumes from music; and third, a method of wearing the images of music. In particular, this study stands out from related work in that we generate images of unique costumes from music and make them wearable. We use techniques of generative art to generate the images of unique costumes and project them onto fog generated around a person from three directions using projectors. This study shows how music can be enjoyed as something 'wearable'. Furthermore, it opens the prospect of unconventional entertainment based on the fusion of music and costumes.

Keywords: entertainment computing, costumes, music, generative programming

Procedia PDF Downloads 150
486 Solid State Fermentation Process Development for Trichoderma asperellum Using Inert Support in a Fixed Bed Fermenter

Authors: Mauricio Cruz, Andrés Díaz García, Martha Isabel Gómez, Juan Carlos Serrato Bermúdez

Abstract:

The disadvantages of using natural substrates in solid state fermentation (SSF) processes are well recognized and are mainly associated with the gradual decomposition of the substrate, the formation of agglomerates, and a decrease in bed porosity, all of which limit mass and heat transfer. Additionally, in several cases, materials with high agricultural value, such as sour milk, beets, rice, beans, and corn, have been used. Thus, the use of economical inert supports (natural or synthetic) in combination with a nutrient suspension for the production of biocontrol microorganisms is a good alternative in SSF processes, but it requires further study in the fields of modeling and optimization. The aim of this work is therefore to compare the performance of two inert supports, a synthetic one (polyurethane foam) and a natural one (rice husk), identifying the factors with the greatest effect on the productivity of T. asperellum Th204 and on the maximum specific growth rate in a PROPHYTA L05® fixed-bed bioreactor. For this, six factors (C:N ratio, temperature, inoculation rate, bed height, air moisture content, and airflow) were evaluated using a fractional design. The C:N ratio and air flow were identified as significant for productivity (expressed as conidia/dry substrate·h). The polyurethane foam showed a higher maximum specific growth rate (0.1631 h-1) and a productivity of 3.89×10⁷ conidia/dry substrate·h, compared to rice husk (2.83×10⁶) and a natural rice-based substrate (8.87×10⁶) used as a control. Finally, a quadratic model was generated and validated, yielding productivities higher than 3.0×10⁷ conidia/dry substrate·h with the air flow at 0.9 m3/h and the C:N ratio at 18.1.
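
The screening stage described above rests on a two-level fractional factorial design; a sketch of generating a 2^(6-2) design for six factors (the generators and factor order are assumptions, not the authors' exact plan):

```python
import itertools
import numpy as np

# Sketch: a 2^(6-2) two-level fractional factorial design (16 runs) for six
# factors, built from 4 base factors with the common generators E = ABC and
# F = BCD (coded levels -1/+1; the factor order is assumed).
factors = ["C:N", "temperature", "inoculation", "bed height", "moisture", "airflow"]

base = np.array(list(itertools.product([-1, 1], repeat=4)))  # A, B, C, D
E = base[:, 0] * base[:, 1] * base[:, 2]                     # generator E = ABC
F = base[:, 1] * base[:, 2] * base[:, 3]                     # generator F = BCD
design = np.column_stack([base, E, F])

print(dict(zip(factors, design[0])))    # first of the 16 coded runs
print(design.shape)                     # (16, 6)
```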

Keywords: bioprocess, scale up, fractional design, C:N ratio, air flow

Procedia PDF Downloads 484
485 Meta Model for Optimum Design Objective Function of Steel Frames Subjected to Seismic Loads

Authors: Salah R. Al Zaidee, Ali S. Mahdi

Abstract:

Except for simple problems involving statically determinate structures, optimum design problems in structural engineering have implicit objective functions, and structural analysis and design are essential within each search loop. With these implicit functions, the structural engineer is usually forced to write his/her own computer code for analysis, design, and the search for the optimum among many feasible candidates, and cannot take advantage of available software for structural analysis, design, and optimum searching. The meta-model is a regression model used to transform an implicit objective function into an explicit one, which in turn decouples the structural analysis and design processes from the optimum search. With the meta-model, well-known software for structural analysis and design can be used in sequence with optimum searching software. In this paper, the meta-model is used to develop an explicit objective function for plane steel frames subjected to dead, live, and seismic forces. The frame topology is assumed to be predefined based on architectural and functional requirements. Column and beam sections and different connection details are the main design variables in this study. Columns and beams are grouped to reduce the number of design variables and to make the problem similar to that adopted in engineering practice. Data for the implicit objective function were generated based on the analysis and assessment of many design proposals with CSI SAP software. These data were later used in SPSS software to develop a pure quadratic nonlinear regression model for the explicit objective function. Good correlations, with a coefficient R² in the range from 0.88 to 0.99, were noted between the original implicit functions and the corresponding explicit functions generated with the meta-model.
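
A "pure quadratic" model contains an intercept, linear terms, and squared terms but no interaction terms; a least-squares sketch with invented stand-ins for the SAP-derived data:

```python
import numpy as np

# "Pure quadratic" regression sketch (linear + squared terms, no interactions),
# matching the model form described above; X rows and w are invented stand-ins
# for grouped section sizes and the analysis-derived responses.
rng = np.random.default_rng(1)
X = rng.uniform(0.5, 2.0, size=(40, 3))            # 3 grouped design variables
w = 5 + X @ [2.0, 1.5, 0.8] + (X**2) @ [0.6, 0.4, 0.2] + rng.normal(0, 0.1, 40)

D = np.column_stack([np.ones(len(X)), X, X**2])    # [1, x_i, x_i^2] columns
beta, *_ = np.linalg.lstsq(D, w, rcond=None)

w_hat = D @ beta
R2 = 1 - np.sum((w - w_hat)**2) / np.sum((w - w.mean())**2)
print("explicit objective coefficients:", beta.round(3))
print("R^2 =", round(R2, 4))   # the paper reports R^2 between 0.88 and 0.99
```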

Keywords: meta-model, objective function, steel frames, seismic analysis, design

Procedia PDF Downloads 221
484 Capacitated Multiple Allocation P-Hub Median Problem on a Cluster Based Network under Congestion

Authors: Çağrı Özgün Kibiroğlu, Zeynep Turgut

Abstract:

This paper considers a hub location problem where the network service area is partitioned into predetermined zones (represented by given node clusters) and the capacity levels of potential hub nodes are determined a priori as a hub selection criterion, in order to investigate the effect of congestion on the network. The objective is to design the hub network by determining all required hub locations in the node clusters and allocating non-hub nodes to hubs such that the total cost, including the transportation cost, the hub opening cost, and a penalty cost for exceeding the capacity level at hubs, is minimized. A mixed-integer linear programming model is developed by introducing additional constraints into the traditional capacitated multiple allocation hub location model, and it is tested empirically.

Keywords: hub location problem, p-hub median problem, clustering, congestion

Procedia PDF Downloads 462
483 Coordinated Voltage Control in Radial Distribution System with Distributed Generators Using Sensitivity Analysis

Authors: Anubhav Shrivastava Shivarudraswamy, Bhat Lakshya

Abstract:

Distributed generation has become a major area of interest in recent years. Distributed generation can serve a large number of loads on a power line and hence offers better efficiency than conventional methods. However, there are certain drawbacks associated with it, an increase in voltage being the major one. This paper addresses voltage control at the buses of an IEEE 30-bus system by regulating reactive power. To carry out the analysis, suitable locations for placing distributed generators (DGs) are identified through load flow analysis, by observing where the voltage profile dips. MATLAB programming is used to regulate the voltage at all buses to within +/- 5% of the base value even after the introduction of DGs. Three methods for voltage regulation are discussed, and a sensitivity-based analysis is then carried out to determine the priority among them.

Keywords: distributed generators, distributed system, reactive power, voltage control, sensitivity analysis

Procedia PDF Downloads 631
482 The Boundary Element Method in Excel for Teaching Vector Calculus and Simulation

Authors: Stephen Kirkup

Abstract:

This paper discusses the implementation of the boundary element method (BEM) in an Excel spreadsheet and how it can be used in teaching vector calculus and simulation. There are two separate spreadsheets, within which Laplace's equation is solved by the BEM in two dimensions (LIBEM2) and in axisymmetric three dimensions (LBEMA). The main algorithms are implemented in the programming language associated with Excel, Visual Basic for Applications (VBA). The BEM requires only a boundary mesh and hence is a relatively accessible method. The BEM in the open spreadsheet environment is shown to be useful as an aid to teaching and learning. The application of the spreadsheet-based BEM for educational purposes in introductory vector calculus and simulation is explored. The development of assignment work is discussed, and sample results from student work are given. The spreadsheets were found to be useful tools in developing students' understanding of vector calculus and in simulating heat conduction.

Keywords: boundary element method, Laplace’s equation, vector calculus, simulation, education

Procedia PDF Downloads 137
481 End To End Process to Automate Batch Application

Authors: Nagmani Lnu

Abstract:

Quality engineering often refers to testing applications that have either a user interface (UI) or an application programming interface (API), and we often find mature test practices, standards, and automation for UI or API testing. However, another kind of application is present in almost every industry that deals with data in bulk and is often handled by what is called a batch application. This is primarily an offline application that companies develop to process large data sets, often involving multiple business rules. The challenge becomes more prominent when we try to automate batch testing. This paper describes the approaches taken to test a batch application from the financial industry covering the payment settlement process (a critical use case in all kinds of FinTech companies), resulting in 100% test automation in test creation and test execution. This approach can be followed for other batch use cases to achieve higher efficiency in the testing process.

Keywords: batch testing, batch test automation, batch test strategy, payments testing, payments settlement testing

Procedia PDF Downloads 26
480 The Application of a Hybrid Neural Network for Recognition of a Handwritten Kazakh Text

Authors: Almagul Assainova, Dariya Abykenova, Liudmila Goncharenko, Sergey Sybachin, Saule Rakhimova, Abay Aman

Abstract:

The recognition of handwritten Kazakh text is a relevant objective today for the digitization of materials. The study presents a hybrid neural network model for handwriting recognition, which combines a convolutional neural network and a multi-layer perceptron. Each network includes 1024 input neurons and 42 output neurons. The model is implemented in a program written in the Python programming language using the EMNIST database and the NumPy, Keras, and TensorFlow modules. The network was trained on letters specific to the Kazakh alphabet, such as ә, ғ, қ, ң, ө, ұ, ү, h, і. The neural network model and the program created on its basis can be used in electronic document management systems to digitize Kazakh text.
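
A minimal Keras sketch of such a hybrid architecture, consistent with the 1024-input/42-output description (treating the input as a 32x32 image; the layer sizes are assumptions, not the authors' exact configuration):

```python
# Sketch of a hybrid CNN + MLP architecture: 1024 inputs (i.e. a flattened
# 32x32 pixel image) and 42 output classes; internal sizes are assumptions.
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(1024,))                # flattened 32x32 image
img = layers.Reshape((32, 32, 1))(inputs)

# convolutional branch extracts local stroke features
conv = layers.Conv2D(32, 3, activation="relu")(img)
conv = layers.MaxPooling2D()(conv)
conv = layers.Conv2D(64, 3, activation="relu")(conv)
conv = layers.Flatten()(conv)

# multi-layer perceptron branch works on the raw pixels
mlp = layers.Dense(256, activation="relu")(inputs)

merged = layers.concatenate([conv, mlp])
outputs = layers.Dense(42, activation="softmax")(merged)  # 42 character classes

model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```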

Keywords: handwriting recognition system, image recognition, Kazakh font, machine learning, neural networks

Procedia PDF Downloads 230