Search results for: fractional programming
320 Simulation and Experimental Research on Pocketing Operation for Toolpath Optimization in CNC Milling
Authors: Rakesh Prajapati, Purvik Patel, Avadhoot Rajurkar
Abstract:
Nowadays, manufacturing industries augment their production lines with modern machining centers backed by CAM software. Several attempts are being made to cut down the programming time for machining complex geometries. Special programs/software have been developed to generate the digital numerical data and to prepare NC programs by using suitable post-processors for different machines. After the tools and the manufacturing process are selected, toolpaths are applied and the NC program is generated. More and more complex mechanical parts that were earlier cast or assembled/manufactured by other processes are now being machined. The majority of these parts require many pocketing operations and find applications in die and mold making, turbomachinery, aircraft, nuclear, defense, etc. Pocketing operations involve the removal of a large quantity of material from the metal surface. In this work, a warm-cast, clamped food-processing part was modeled using Pro-E and MasterCAM® software, and the pocketing operation was specifically chosen for toolpath optimization. Applying the pocketing toolpath together with multi-tool selection and reduced air time yields the reported software simulation times and experimental machining times.
Keywords: toolpath, part program, optimization, pocket
Procedia PDF Downloads 288
319 A Simulation-Optimization Approach to Control Production, Subcontracting and Maintenance Decisions for a Deteriorating Production System
Authors: Héctor Rivera-Gómez, Eva Selene Hernández-Gress, Oscar Montaño-Arango, Jose Ramon Corona-Armenta
Abstract:
This research studies the joint production, maintenance, and subcontracting control policy for an unreliable, deteriorating manufacturing system. Production activities are controlled by a derivation of the hedging point policy, and because the system is subject to deterioration, its capacity to satisfy product demand decreases progressively. Multiple deterioration effects are considered, reflected mainly in the quality of the parts produced and the reliability of the machine. Subcontracting is available as support to satisfy product demand, and overhaul maintenance can be conducted to reduce the effects of deterioration. The main objective of the research is to determine simultaneously the production, maintenance, and subcontracting rates that minimize the total incurred cost. A stochastic dynamic programming model is developed and solved through a simulation-based approach composed of statistical analysis and optimization with the response surface methodology. The obtained results highlight the strong interactions between production, deterioration, and quality, which justify the development of an integrated model. A numerical example and a sensitivity analysis are presented to validate the results.
Keywords: subcontracting, optimal control, deterioration, simulation, production planning
Procedia PDF Downloads 580
318 Classification of Myoelectric Signals Using Multilayer Perceptron Neural Network with Back-Propagation Algorithm in a Wireless Surface Myoelectric Prosthesis of the Upper-Limb
Authors: Kevin D. Manalo, Jumelyn L. Torres, Noel B. Linsangan
Abstract:
This paper focuses on a wireless myoelectric prosthesis of the upper limb that uses a multilayer perceptron neural network with back-propagation, an algorithm widely used in pattern recognition. The network can be trained on sample signals and then perform its function on its own for new inputs. The paper uses the neural network to classify the electromyography (EMG) signal produced by the muscle at the amputee's skin surface. The gathered data are passed on to the classification stage wirelessly through Zigbee technology, and the signal is classified and the network trained so that it can drive the arm positions of the prosthesis. Using Verilog on a Field Programmable Gate Array (FPGA) with Zigbee, the EMG signals are acquired and used for classification. The classified signal produces the corresponding hand movements (open, pick, hold, and grip) through the Zigbee controller. The data are then processed through the MLP neural network using MATLAB, which is then used for the surface myoelectric prosthesis. A z-test is used to assess the output acquired from the neural network.
Keywords: field programmable gate array, multilayer perceptron neural network, verilog, zigbee
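As an editorial illustration of the classification stage this abstract describes, the sketch below trains a multilayer perceptron with back-propagation on synthetic EMG-style feature vectors using scikit-learn. The feature count, the toy labels for the four hand movements, and the data itself are assumptions; this is not the authors' MATLAB/FPGA implementation.

```python
# Minimal sketch (assumed data): back-propagation training of an MLP that maps
# EMG feature vectors to hand-movement classes.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 8))            # 400 EMG windows, 8 features each (assumed)
y = 2 * (X[:, 0] > 0) + (X[:, 1] > 0)    # toy labels: 0=open, 1=pick, 2=hold, 3=grip

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), solver="sgd",   # gradient back-propagation
                    learning_rate_init=0.01, max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```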
Procedia PDF Downloads 391
317 Building Bridges: DePaul’s HSI Endeavor
Authors: Giana Aguilar-Valencia
Abstract:
This research focuses on DePaul University as a Hispanic-Serving Institution (HSI) and evaluates its present capacity to serve its Latinx students. Despite the HSI designation, Latinx students regularly face challenges in academic performance, retention, and graduation. Following an extensive review of institutional programs, policies, and support systems, this study identifies gaps in the services provided to meet Latinx students' needs and aims to suggest improvements to such programs to help foster an inclusive and nurturing environment. Utilizing qualitative methods, including interviews with students, faculty, and staff members, the research focuses on the lived experiences of Latinx students attending DePaul. Institutional reports and demographic data are also incorporated to assess whether the HSI policies coincide with best practices for assisting Latinx populations. The research concludes with recommendations on the most practical ways to adopt HSI strategies at DePaul, including additional mentorship opportunities, cultural programming, and academic support services. It is anticipated that these findings will contribute to larger discussions on HSIs and their shared role in promoting equity-oriented educational outcomes for Latinx students, while at the same time informing DePaul's efforts to become a more inclusive institution for all its students.
Keywords: Hispanic-Serving Institution, Latinx representation, retention rates, equity in education
Procedia PDF Downloads 6
316 Biopolymer Nanoparticles Loaded with Calcium as a Source of Fertilizer
Authors: Erwin San Juan Martinez, Miguel Angel Aguilar Mendez, Manuel Sandoval Villa, Libia Iris Trejo Tellez
Abstract:
Some nanomaterials may improve plant growth within certain concentration ranges and could be used as nanofertilizers to increase crop yields while decreasing the environmental pollution caused by the uncontrolled use of conventional fertilizers. The objective of the present investigation was therefore to synthesize and characterize calcium-loaded gelatin nanoparticles generated through a spray-drying technique, to be used as nanofertilizers. To obtain these materials, a 2^(7-4) fractional factorial design was used in order to evaluate a large number of factors (Ca2+ concentration, temperature and agitation time of the solution, polymer concentration, drying temperature, and spray percentage) with a possible effect on the size, distribution, and morphology of the nanoparticles. For the formation of the nanoparticles, a Nano Spray Dryer B-90® (Buchi, Flawil, Switzerland) equipped with a 4 µm spray cap was used. Size and morphology of the obtained nanoparticles were evaluated using a scanning electron microscope (JEOL JSM-6390LV; Tokyo, Japan) equipped with an energy-dispersive X-ray (EDS) detector. The total quantification of Ca2+, as well as its release by the nanoparticles, was carried out with inductively coupled plasma atomic emission spectroscopy equipment (ICP-ES 725, Agilent, Mulgrave, Australia). Of the seven factors evaluated, only the fertilizer concentration, spray percentage, and polymer concentration had a statistically significant effect on particle size. SEM micrographs from six of the eight conditions evaluated in this research showed separated particles with a good degree of sphericity, while in the other two the particles showed amorphous morphology and aggregation. In all treatments, most of the particles showed smooth surfaces. The smallest average particle size obtained was 492 nm, while EDS results showed an even distribution of Ca2+ in the polymer matrix. The largest Ca2+ concentration measured by ICP was 10.5%, which agrees with the calculated theoretical value, while the release kinetics showed an upward trend within 24 h. Using the technique employed in this research, it was possible to obtain calcium-loaded nanoparticles of good size and sphericity with controlled-release properties. The characteristics of the nanoparticles resulted from manipulation of the synthesis conditions, which allows control of particle size and shape and provides the means to adapt the properties of the materials to a specific application.
Keywords: calcium, controlled release, gelatin, nano spray dryer, nanofertilizer
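For readers unfamiliar with the experimental design mentioned above, the following sketch generates one standard 2^(7-4) fractional factorial layout (8 runs for 7 two-level factors) with the common generators D=AB, E=AC, F=BC, G=ABC; these generators and the factor labels are illustrative assumptions, not necessarily the ones used in the study.

```python
# Sketch of a 2^(7-4) fractional factorial design: a full 2^3 factorial in A, B, C
# plus four generated columns for the remaining factors.
import itertools
import numpy as np

base = np.array(list(itertools.product([-1, 1], repeat=3)))   # full factorial in A, B, C
A, B, C = base[:, 0], base[:, 1], base[:, 2]
design = np.column_stack([A, B, C, A * B, A * C, B * C, A * B * C])

factors = ["Ca2+ conc.", "solution temp.", "agitation time",
           "polymer conc.", "drying temp.", "spray %", "feed rate"]   # assumed labels
for run in design:
    print(dict(zip(factors, run)))
```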
Procedia PDF Downloads 182
315 Research on the Influence of Robot Teaching on the Creativity of Primary and Secondary School Students under the Background of STEM Education
Authors: Chu Liu
Abstract:
With the development of society and the changes of the times, the requirements for the cultivation of learners have changed. In the 21st century, STEM education has become a boom in educational development in various countries, aiming to improve learners' comprehensive abilities in science, technology, engineering, and mathematics. The rise of robot education provides an effective way for STEM education to cultivate computational thinking, interdisciplinary ability, problem-solving ability, and teamwork. Although robot education has been developing in China for several years, it still lacks a standard curriculum system. Using programming software as a platform, this article analyzes the 'Basic Education Information Technology Curriculum Standards (2012 Edition)', combines it with the actual learning situation of learners, and attempts to design teaching projects, aiming to provide a reference for the teaching ideas and methods of robot education courses. In contemporary society, technological advances increasingly require creativity, and cultivating innovative, well-rounded talent urgently calls for effective education reform that keeps up with social changes. In this context, the influence of robot teaching design on students' creative tendencies is worth verifying.
Keywords: STEM education, robot teaching, primary and secondary school students, tendency of creativity
Procedia PDF Downloads 121
314 Supplemental VisCo-friction Damping for Dynamical Structural Systems
Authors: Sharad Singh, Ajay Kumar Sinha
Abstract:
Coupled dampers, such as viscoelastic-frictional dampers, are a newer technique for supplemental damping. In this paper, innovative visco-frictional damping models are presented and investigated; the paper attempts to couple frictional and fluid viscous dampers into a single supplemental damping unit. The visco-frictional damping models are developed by series and parallel coupling of frictional and fluid viscous dampers using the Maxwell and Kelvin-Voigt models. Time-history analysis has been performed through numerical simulation of an SDOF system with varying fundamental periods, subjected to a set of 12 ground motions. The simulation was performed using the direct time integration method, with MATLAB as the programming tool. The response behavior has been analyzed for varying time periods and added damping. The paper compares the response-reduction behavior of the two modes of coupling and highlights the performance efficiency of the suggested damping models. It also presents a mathematical modeling approach for visco-frictional dampers and suggests the suitable mode of coupling between the two sub-units.
Keywords: hysteretic damping, Kelvin model, Maxwell model, parallel coupling, series coupling, viscous damping
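A minimal sketch of the kind of direct time integration the abstract refers to: a Newmark (average-acceleration) loop for an SDOF system with a supplemental viscous damper and a Coulomb friction element acting in parallel (a Kelvin-type coupling). All numerical values are assumed, the friction force is evaluated explicitly at the previous velocity for simplicity, and this is not the authors' MATLAB model.

```python
# SDOF response under a toy ground motion: m*u'' + (c + c_d)*u' + k*u + F_f*sign(u') = -m*ag
import numpy as np

m, k = 1000.0, 4.0e5                          # mass [kg], stiffness [N/m] (assumed)
c = 2 * 0.02 * np.sqrt(k * m)                 # 2% inherent viscous damping
c_d, F_f = 2 * 0.10 * np.sqrt(k * m), 500.0   # supplemental viscous damper, friction slip force

dt, n = 0.005, 4000
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 1.5 * np.arange(n) * dt)   # toy ground acceleration

u = v = a = 0.0
u_hist = np.zeros(n)
for i in range(n):
    p = -m * ag[i] - F_f * np.sign(v)                       # load + explicit friction force
    # Newmark average-acceleration (gamma=1/2, beta=1/4) effective system
    k_eff = k + 2 * (c + c_d) / dt + 4 * m / dt**2
    p_eff = p + m * (4 * u / dt**2 + 4 * v / dt + a) + (c + c_d) * (2 * u / dt + v)
    u_new = p_eff / k_eff
    v_new = 2 * (u_new - u) / dt - v
    a_new = 4 * (u_new - u) / dt**2 - 4 * v / dt - a
    u, v, a = u_new, v_new, a_new
    u_hist[i] = u

print("peak displacement [m]:", np.abs(u_hist).max())
```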
Procedia PDF Downloads 158
313 Fuzzy Total Factor Productivity by Credibility Theory
Authors: Shivi Agarwal, Trilok Mathur
Abstract:
This paper proposes a method to measure total factor productivity (TFP) change by credibility theory for fuzzy input and output variables. TFP change has been widely studied with crisp input and output variables; however, in some cases the input and output data of decision-making units (DMUs) can only be measured with uncertainty. Such data can be represented as linguistic variables characterized by fuzzy numbers. The Malmquist productivity index (MPI) is widely used to estimate TFP change by calculating the total factor productivity of a DMU for different time periods using data envelopment analysis (DEA). The fuzzy DEA (FDEA) model is solved using credibility theory, and its results are used to measure the TFP change for fuzzy input and output variables. Finally, numerical examples are presented to illustrate the proposed method. The suggested methodology can be utilized for performance evaluation of DMUs and helps assess their level of integration. It can also be applied to rank the DMUs, identify those that are lagging behind, and make recommendations as to how they can improve their performance to bring them on par with other DMUs.
Keywords: chance-constrained programming, credibility theory, data envelopment analysis, fuzzy data, Malmquist productivity index
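As background for the Malmquist/DEA machinery mentioned above, the sketch below solves the crisp CCR multiplier-form linear program for a single DMU with SciPy; the fuzzy, credibility-based variant described in the abstract would replace the crisp constraints with credibility (chance-type) constraints. Data are toy values. The Malmquist index itself is then formed from such efficiency scores evaluated against the frontiers of two time periods.

```python
# CCR (multiplier form) efficiency of one DMU: maximize u.y_d s.t. v.x_d = 1, u.Y_j <= v.X_j.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])   # toy inputs  (DMUs x inputs)
Y = np.array([[1.0], [1.0], [1.5]])                  # toy outputs (DMUs x outputs)
d = 0                                                # DMU under evaluation

m, s = X.shape[1], Y.shape[1]
c = np.concatenate([-Y[d], np.zeros(m)])             # variables: output weights u, input weights v
A_ub = np.hstack([Y, -X])                            # u*Y_j - v*X_j <= 0 for every DMU j
b_ub = np.zeros(X.shape[0])
A_eq = np.concatenate([np.zeros(s), X[d]]).reshape(1, -1)   # v*x_d = 1 (normalization)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
print("CCR efficiency of DMU", d, "=", -res.fun)
```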
Procedia PDF Downloads 367
312 Design of Transmit Beamspace and DOA Estimation in MIMO Radar
Authors: S. Ilakkiya, A. Merline
Abstract:
Multiple-input multiple-output (MIMO) radar systems use modulated waveforms and directive antennas to transmit electromagnetic energy into a specific volume in space to search for targets. This paper deals with the design of the transmit beamspace matrix and DOA estimation for MIMO radar with collocated antennas. The design of the transmit beamspace matrix is based on minimizing the difference between a desired transmit beampattern and the actual one, while enforcing the constraint of uniform power distribution across the transmit array elements. A rotational invariance property is established at the transmit array by imposing a specific structure on the beamspace matrix. Semidefinite programming and a spatial-division based design (SDD) are also developed separately. In MIMO radar systems, DOA estimation is an essential process to determine the direction of incoming signals and thus to steer the beam of the antenna array towards the estimated direction. This estimation involves both non-adaptive and adaptive spectral estimation techniques. The design of the transmit beamspace matrix and the spectral estimation techniques are studied through simulation.
Keywords: adaptive and non-adaptive spectral estimation, direction of arrival estimation, MIMO radar, rotational invariance property, transmit, receive beamforming
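To illustrate the non-adaptive spectral DOA estimation mentioned above, here is a small conventional (Bartlett) beamforming sketch for a uniform linear receive array. It is not the transmit-beamspace design itself, which would require a semidefinite-programming solver; the array geometry, noise level, and source angle are assumed.

```python
# Conventional (Bartlett) DOA spectrum for one far-field source on a uniform linear array.
import numpy as np

M, d = 8, 0.5                               # sensors, spacing in wavelengths
true_doa_deg, snapshots = 20.0, 200

def steering(theta_deg):
    return np.exp(1j * 2 * np.pi * d * np.arange(M) * np.sin(np.radians(theta_deg)))

rng = np.random.default_rng(1)
s = rng.normal(size=snapshots) + 1j * rng.normal(size=snapshots)       # source waveform
noise = 0.1 * (rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots)))
snaps = np.outer(steering(true_doa_deg), s) + noise                    # array snapshots

R = snaps @ snaps.conj().T / snapshots                                 # sample covariance
grid = np.arange(-90, 90.5, 0.5)
spectrum = [np.real(steering(t).conj() @ R @ steering(t)) for t in grid]
print("estimated DOA [deg]:", grid[int(np.argmax(spectrum))])
```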
Procedia PDF Downloads 520
311 An Improved Image Steganography Technique Based on Least Significant Bit Insertion
Authors: Olaiya Folorunsho, Comfort Y. Daramola, Joel N. Ugwu, Lawrence B. Adewole, Olufisayo S. Ekundayo
Abstract:
In today's world, there is a tremendous rise in the usage of the internet, since almost all communication and information sharing is done over the web. Conversely, there is continuous growth in unauthorized access to confidential data, which poses a challenge to information security experts whose major goal is to curtail the menace. One approach to securing the safe delivery of data to its rightful destination without any modification is steganography, the art of hiding information inside other, seemingly innocuous information. This research paper aims at designing a secure algorithm using an image steganographic technique based on the Least Significant Bit (LSB) method, embedding the data into a bitmap (BMP) image in order to enhance security and reliability. In the LSB approach, the basic idea is to replace the LSBs of the pixels of the cover image with the bits of the message to be hidden, without significantly destroying the properties of the cover image. The system was implemented using the C# programming language on the Microsoft .NET framework. The performance of the proposed system was evaluated through a benchmarking test analyzing parameters such as Mean Squared Error (MSE) and Peak Signal to Noise Ratio (PSNR). The results showed that image steganography performed well in securing data hiding and information transmission over networks.
Keywords: steganography, image steganography, least significant bits, bit map image
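The LSB idea described in the abstract can be summarized in a few lines. The sketch below shows only the bit-level embed/extract step on a synthetic 8-bit cover array; it is not the authors' C#/.NET system, which would add image I/O, keys, and the MSE/PSNR benchmarking.

```python
# Replace the least-significant bit of each cover byte with one message bit, then recover it.
import numpy as np

def embed(cover: np.ndarray, message: bytes) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    stego = cover.flatten().copy()
    if bits.size > stego.size:
        raise ValueError("cover image too small for message")
    stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits        # overwrite LSBs only
    return stego.reshape(cover.shape)

def extract(stego: np.ndarray, n_bytes: int) -> bytes:
    bits = (stego.flatten()[:8 * n_bytes] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

cover = np.random.default_rng(2).integers(0, 256, size=(64, 64), dtype=np.uint8)
stego = embed(cover, b"secret")
print(extract(stego, 6))                                          # -> b'secret'
```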
Procedia PDF Downloads 267
310 Integrating Building Information Modeling into Facilities Management Operations
Authors: Mojtaba Valinejadshoubi, Azin Shakibabarough, Ashutosh Bagchi
Abstract:
Facilities such as residential buildings, office buildings, and hospitals house a large density of occupants. Therefore, a low-cost facility management program (FMP) should be used to provide a satisfactory built environment for these occupants. Facility management (FM) has recently been treated as a critical task in building projects and has been effective in reducing the operation and maintenance costs of these facilities. Information integration and visualization capabilities are critical for reducing the complexity and cost of FM. Building information modeling (BIM) can be used as a strong visual modeling tool and database in FM. The main objective of this study is to examine the applicability of BIM in the FM process during a building's operational phase. For this purpose, a seven-storey office building is modeled in Autodesk Revit software. The authors integrated a cloud-based environment using the visual programming tool Dynamo to provide real-time, cloud-based communication between the facility managers and the participants involved in the project. An appropriate and effective integrated data source and visual model such as BIM can reduce a building's operational and maintenance costs by managing the building life cycle properly.
Keywords: building information modeling, facility management, operational phase, building life cycle
Procedia PDF Downloads 155
309 A Study on the Effect of Design Factors of Slim Keyboard’s Tactile Feedback
Authors: Kai-Chieh Lin, Chih-Fu Wu, Hsiang Ling Hsu, Yung-Hsiang Tu, Chia-Chen Wu
Abstract:
With the rapid development of computer technology, the design of computers and keyboards is moving towards slimness, and this change in mobile input devices directly influences users' behavior. Although multi-touch applications allow text entry through a virtual keyboard, the performance, feedback, and comfort of the technology are inferior to those of a traditional keyboard, and while manufacturers have launched mobile touch keyboards and projection keyboards, their performance has not been satisfactory. Therefore, this study investigated the design factors of slim pressure-sensitive keyboards. The factors were evaluated with objective measures (accuracy and speed) and a subjective evaluation (operability, recognition, feedback, and difficulty) depending on the shape (circle, rectangle, and L-shaped), thickness (flat, 3 mm, and 6 mm), and force (35±10 g, 60±10 g, and 85±10 g) of the keyboard. Moreover, MANOVA and Taguchi methods (based on signal-to-noise ratios) were applied to find the optimal level of each design factor. The participants were divided into two groups by their typing speed (30 words per minute). Considering the multitude of variables and levels, the experiments were implemented using a fractional factorial design, and a representative model of the research samples was established for the input-task testing. The findings showed that participants with low typing speed primarily relied on vision to recognize the keys, while those with high typing speed relied on tactile feedback, which was affected by the thickness and force of the keys. In the objective and subjective evaluations, the combination of design factors most likely to result in higher performance and satisfaction was identified (L-shaped, 3 mm, and 60±10 g). The learning curve was analyzed and compared with a traditional standard keyboard to investigate the influence of user experience on keyboard operation; the results indicated that even the optimal combination provided input performance inferior to a standard keyboard. The results can serve as a reference for the development of related products in industry and can be applied comprehensively to touch devices and input interfaces with which people interact.
Keywords: input performance, mobile device, slim keyboard, tactile feedback
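Since the abstract relies on Taguchi signal-to-noise ratios to pick factor levels, the short sketch below shows the standard larger-the-better and smaller-the-better S/N formulas; the trial values are made up, and the pairing of responses with formulas is an assumption rather than the study's actual choice.

```python
# Taguchi S/N ratios over repeated trials of one factor combination.
import numpy as np

def sn_larger_the_better(y):        # e.g., typing accuracy or speed
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

def sn_smaller_the_better(y):       # e.g., error count
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y**2))

accuracy_runs = [0.91, 0.88, 0.93]                  # illustrative repeated trials
print("S/N (larger-the-better):", round(sn_larger_the_better(accuracy_runs), 2))
```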
Procedia PDF Downloads 300
308 Utilization of an Object Oriented Tool to Perform Model-Based Safety Analysis According to Extended Failure System Models
Authors: Royia Soliman, Salma ElAnsary, Akram Amin Abdellatif, Florian Holzapfel
Abstract:
Model-Based Safety Analysis (MBSA) is an approach in which the system and safety engineers share a common system model created using a model-based development process. The model can also be extended with the failure modes of the system components. There are two well-known approaches for adding fault behaviors to system models: the first is to embed the failure behavior directly into the system design, and the second is to develop a fault model separately from the system model and then combine the two independent models for safety analysis. This paper introduces a hybrid MBSA approach that uses informal, abstracted models to investigate failure behaviors, combining concepts such as directed graph traversal, event lists, and Constraint Satisfaction Problems (CSP). The approach is implemented in an object-oriented programming language: components are abstracted to their failure logic and to the relationships between connected components. The implemented approach is tested on various flight control systems, including electrical and multi-domain examples. The tests are analyzed, and a comparison with different approaches is presented.
Keywords: flight control systems, model based safety analysis, safety assessment analysis, system modelling
Procedia PDF Downloads 165
307 Design of Semi-Autonomous Street Cleaning Vehicle
Authors: Khouloud Safa Azoud, Süleyman Baştürk
Abstract:
In the pursuit of cleaner and more sustainable urban environments, advanced technologies play a critical role in evolving sanitation systems. This paper presents two distinct advancements in automated cleaning machines designed to improve urban sanitation. The first advancement is a semi-automatic road surface cleaning machine that integrates human labor with solar energy to enhance environmental sustainability and adaptability, especially in regions with limited access to electricity. By reducing carbon emissions and increasing operational efficiency, this approach offers significant potential for urban sanitation enhancement. The second advancement is a multifunctional semi-automatic street cleaning machine equipped with a camera, Arduino programming, and GPS for an autonomous operation aimed at addressing cost barriers in developing countries. Prioritizing low energy consumption and cost-effectiveness, this machine provides versatile cleaning solutions adaptable to various environmental conditions. By integrating solar energy with autonomous operating systems and careful design, these developments represent substantial progress in sustainable urban sanitation, particularly in developing regions.Keywords: automated cleaning machines, solar energy integration, operational efficiency, urban sanitation systems
Procedia PDF Downloads 38
306 Logistics Optimization: A Literature Review of Techniques for Streamlining Land Transportation in Supply Chain Operations
Authors: Danica Terese Valda, Segundo Villa III, Michiko Yasuda, Jomel Tagaro
Abstract:
This study conducts a thorough literature review of logistics optimization techniques aimed at improving the efficiency of supply chain operations. Logistics optimization encompasses key areas such as transportation management, inventory control, and distribution network design, each of which plays a critical role in streamlining supply chain performance. The review identifies mixed-integer linear programming (MILP) as a dominant method, widely used for its flexibility in handling complex logistics problems. Other methods, such as heuristic algorithms and combinatorial optimization, also prove effective in solving large-scale logistics challenges. Furthermore, real-time data integration and advancements in simulation techniques are transforming decision-making processes within supply chains, leading to more dynamic and responsive operations. The inclusion of sustainability goals, particularly minimizing carbon emissions, has emerged as a growing trend in logistics optimization. This research highlights the need for integrated, holistic approaches that consider the interconnectedness of logistical components. The findings provide valuable insights to guide future research and practical applications, fostering more resilient and efficient supply chains.
Keywords: logistics, techniques, supply chain, land transportation
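As a toy illustration of the optimization models the review surveys, the sketch below solves a small minimum-cost land-transportation linear program with SciPy. A genuine MILP formulation would add integer or binary variables (for example, vehicle-use decisions) and a dedicated MILP solver; all data here are invented.

```python
# Ship goods from 2 depots to 3 customers at minimum transport cost.
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 9.0],
                 [5.0, 4.0, 7.0]])          # cost[i, j]: depot i -> customer j
supply = [60.0, 40.0]
demand = [30.0, 50.0, 20.0]

c = cost.flatten()                           # decision variables x_ij, row-major
A_ub = np.zeros((2, 6))
A_ub[0, :3] = 1                              # depot 0 capacity
A_ub[1, 3:] = 1                              # depot 1 capacity
A_eq = np.zeros((3, 6))
for j in range(3):
    A_eq[j, [j, j + 3]] = 1                  # meet each customer's demand exactly

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand, bounds=(0, None))
print("minimum cost:", res.fun, "\nshipping plan:\n", res.x.reshape(2, 3))
```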
Procedia PDF Downloads 10
305 Monitoring of Hydrological Parameters in the Alexandra Jukskei Catchment in South Africa
Authors: Vhuhwavho Gadisi, Rebecca Alowo, German Nkhonjera
Abstract:
It has been noted that technical tools for managing groundwater resources are not readily accessible, and the lack of such systems hinders the groundwater management processes necessary for decision-making through monitoring and evaluation in the Jukskei River of the Crocodile River (West) Basin in Johannesburg, South Africa. Several challenges have been identified in South Africa's Jukskei Catchment concerning groundwater management, including the following: there are gaps in data records; monitoring staff need training and equipment; monitoring capacities and equipment require formal accreditation; there is no access to regulatory instruments (e.g., meters); taking into account needs and human requirements at the typical densities of various regions of South Africa, several additional groundwater-level monitoring stations need to be constructed in particular segments; the available raw groundwater-level data should be converted into consumable products, for example short reports on sensitive areas (e.g., dolomite compartments, wetlands, aquifers, and sole-source supplies); and, considering the increasing civil unrest, there has been vandalism and theft of groundwater monitoring infrastructure. GIS was employed at the catchment level to plot the relationship between the identified groundwater parameters in the catchment area and an identified borehole. GIS-based maps were designed for groundwater monitoring and pretested on one borehole in the Jukskei Catchment. These data will be used to establish changes in the borehole relative to changes in the catchment area according to the identified parameters.
Keywords: GIS, monitoring, Jukskei, catchment
Procedia PDF Downloads 94
304 Research of Control System for Space Intelligent Robot Based on Vision Servo
Authors: Changchun Liang, Xiaodong Zhang, Xin Liu, Pengfei Sun
Abstract:
Space intelligent robotic systems are expected to play an increasingly important role in the future. On-orbit robotic servicing, whose key is tracking and capturing technology, has become a research hotspot in recent years. In this paper, the authors propose a vision servo control system for target capturing. The space robotic manipulator (SRM) is an intelligent robotic system with large-scale movement, functional agility, and autonomous ability; it can be operated by astronauts in the space station or controlled by a ground operator in remote-operation mode. To realize the autonomous movement and capture mission of the SRM, an autonomous programming strategy based on multi-camera vision fusion is designed, and the selection principle for the visual position and orientation measurement of the object is defined for better precision. A distributed control system hierarchy is designed, and reliability is considered to guarantee the capabilities of the control system. Finally, a ground experiment system is set up based on this robotic control system concept, and autonomous target-capturing experiments are conducted. The experimental results validate the proposed algorithm and demonstrate that the control system can fulfill the functional, real-time, and reliability requirements.
Keywords: control system, on-orbital service, space robot, vision servo
Procedia PDF Downloads 419
303 Syntactic Analyzer for Tamil Language
Authors: Franklin Thambi Jose.S
Abstract:
Computational linguistics is a branch of linguistics that deals with the computational treatment of linguistic levels; it can also be described as a branch of language studies that applies computer techniques to the field of linguistics. Within computational linguistics, natural language processing plays an important role, having emerged with the development of information technology. In computational syntax, a syntactic analyzer breaks a sentence into phrases and clauses and annotates the sentence with syntactic information. Tamil is one of the major Dravidian languages, with a very long written history of more than 2000 years. It is mainly spoken in Tamil Nadu (in India), Sri Lanka, Malaysia, and Singapore, where it has official status; in Malaysia, Tamil-speaking people are considered an ethnic group. For this research, Tamil sentences are classified into four types: 1. main sentences, 2. interrogative sentences, 3. equational sentences, and 4. elliptical sentences. In computational syntax, the first step is to provide the required information regarding the head and the constituents of each sentence. This information is incorporated into the system using programming languages, after which the system can analyze a given sentence using the criteria or mechanisms provided to it. Providing the computer with the criteria and mechanisms needed to identify the basic sentence types of Tamil using a syntactic parser is the major objective of this paper.
Keywords: tamil, syntax, criteria, sentences, parser
Procedia PDF Downloads 517
302 Impact of Wind Energy on Cost and Balancing Reserves
Authors: Anil Khanal, Ali Osareh, Gary Lebby
Abstract:
Wind energy offers significant advantages such as zero fuel cost and no emissions from generation. However, wind energy sources are variable and non-dispatchable. The utility grid is able to accommodate the variability of wind in small proportions alongside the daily load; at high penetration levels, however, the variability can severely impact the utility reserve requirements and the associated cost. In this paper, the impact of wind energy is evaluated in detail in formulating the total utility cost. The objective is to minimize the overall cost of generation while ensuring proper management of the load. The overall cost includes the curtailment cost, reserve cost, and reliability cost, as well as any other penalty imposed by the regulatory authority. Different levels of wind penetration are explored and their cost impacts evaluated. As the penetration level increases significantly, reliability becomes a critical question. Here, we increase the wind penetration while keeping the reliability factor within the acceptable limit provided by NERC. This paper uses an economic dispatch (ED) model to incorporate wind generation into the power grid, and power system costs are analyzed at various wind penetration levels using linear programming. The study shows how increases in wind generation affect power system economics.
Keywords: wind power generation, wind power penetration, cost analysis, economic dispatch (ED) model
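A simplified, single-period version of the economic dispatch problem described above can be posed as a linear program; the sketch below dispatches two thermal units and a wind forecast with explicit curtailment, using illustrative costs and limits rather than the paper's data (reserve and reliability terms are omitted).

```python
# Single-hour economic dispatch with wind curtailment as a linear program.
import numpy as np
from scipy.optimize import linprog

load, wind_forecast = 500.0, 180.0                 # MW
c_thermal = [30.0, 45.0]                           # $/MWh for thermal units 1 and 2
c_curtail = 80.0                                   # penalty on spilled wind, $/MWh
p_min, p_max = [100.0, 50.0], [350.0, 250.0]

# decision variables: [P1, P2, wind_used, wind_curtailed]
c = np.array(c_thermal + [0.0, c_curtail])
A_eq = np.array([[1.0, 1.0, 1.0, 0.0],             # power balance: P1 + P2 + wind_used = load
                 [0.0, 0.0, 1.0, 1.0]])            # wind_used + wind_curtailed = forecast
b_eq = [load, wind_forecast]
bounds = [(p_min[0], p_max[0]), (p_min[1], p_max[1]), (0, None), (0, None)]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print("dispatch [MW]:", np.round(res.x, 1), " cost [$/h]:", round(res.fun, 1))
```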
Procedia PDF Downloads 567
301 Experimental and Numerical Investigation of Heat Transfer in THTL Test Loop Shell and Tube Heat Exchanger
Authors: M. Moody, R. Mahmoodi, A. R. Zolfaghari, A. Aminottojari
Abstract:
In this study, the flow inside the shell side of a shell-and-tube heat exchanger is simulated numerically for laminar and turbulent flows in both steady-state and transient modes. The governing equations of fluid flow are discretized using the finite volume method with a central difference scheme and solved with the SIMPLE algorithm on a staggered grid, implemented in the MATLAB programming language. The heat transfer coefficient is obtained from the computed velocity field using the Dittus-Boelter correlation. For comparison, the heat exchanger is also simulated with the ANSYS CFX software and checked against experimental data measured in the THTL test loop. The numerical results obtained from the study show good agreement with the experimental data and the ANSYS CFX results. In addition, by considering the effect of the baffle spacing and the baffle cut on the heat transfer rate for turbulent flow, it is shown that the heat transfer rate depends directly on these two parameters. In other words, despite strong turbulence, if the baffle spacing and baffle cut are not selected properly, the heat transfer rate of the heat exchanger can decrease.
Keywords: shell-and-tube heat exchanger, flow and heat transfer, laminar and turbulence flow, turbulence model, baffle spacing, baffle cut
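For reference, the Dittus-Boelter correlation the abstract uses to convert the computed velocity field into a heat transfer coefficient is Nu = 0.023 Re^0.8 Pr^n, with n = 0.4 for heating and 0.3 for cooling. The sketch below evaluates it with assumed, water-like property values and is not tied to the THTL geometry.

```python
# Dittus-Boelter heat transfer coefficient for fully turbulent internal flow.
def dittus_boelter_h(velocity, d_h, rho, mu, cp, k, heating=True):
    re = rho * velocity * d_h / mu             # Reynolds number
    pr = cp * mu / k                           # Prandtl number
    if re < 1e4:
        raise ValueError("correlation assumes fully turbulent flow (Re > ~10^4)")
    nu = 0.023 * re**0.8 * pr**(0.4 if heating else 0.3)
    return nu * k / d_h                        # h in W/(m^2 K)

# illustrative water-like properties, 20 mm hydraulic diameter, 1.2 m/s
h = dittus_boelter_h(velocity=1.2, d_h=0.02, rho=998.0, mu=1.0e-3, cp=4182.0, k=0.6)
print(f"h = {h:.0f} W/(m^2 K)")
```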
Procedia PDF Downloads 538
300 A Proposed Model of E-Marketing Service-Oriented Architecture (E-MSOA)
Authors: Hussein Moselhy, Islam Salam
Abstract:
Several challenges and problems hinder the implementation of e-marketing systems, such as the high cost of information systems infrastructure and maintenance, as well as their unavailability within the institution. In addition, no single system supports all programming languages and platforms, and there is a lack of integration between these systems on the one hand and the operating systems and different web browsers on the other. There is also no established customer relationship management system that recognizes customers' desires and takes them into consideration while performing e-marketing functions. The service-oriented architecture has therefore emerged as one of the most important techniques and methodologies for building systems that integrate with various operating systems, platforms, and other technologies, allowing data exchange among different applications. Service-oriented architecture applies distributed computing concepts and has demonstrated success in meeting system requirements through web services; it also provides an appropriate design for using different web services to support the requirements of business processes and software users. In a service-oriented environment, web services are deployed on the web as independent services that can be accessed without knowledge of the nature of the programs and systems within. This paper presents a proposal for a new model that contributes to the application of e-marketing methods and tools, integrating the marketing-mix elements to improve marketing efficiency (E-MSOA), and applies it to the educational city of one of the Egyptian sectors.
Keywords: service-oriented architecture, electronic commerce, virtual retailing, unified modeling language
Procedia PDF Downloads 428
299 Assessment the Correlation of Rice Yield Traits by Simulation and Modelling Methods
Authors: Davood Barari Tari
Abstract:
In order to investigate the correlation of rice traits under different nitrogen management methods using modeling, an experiment was carried out in a paddy field in the Caspian Sea coastal region from 2013 to 2014. The variety used was Shiroudi, a high-yielding variety. Nitrogen was managed in two ways: the amount of nitrogen at four levels plus a control (30, 60, 90, and 120 kg N ha-1, and control), and nitrogen splitting at four levels (T1: 50% basal + 50% at maximum tillering; T2: 33.33% basal + 33.33% at maximum tillering + 33.33% at panicle initiation; T3: 25% basal + 37.5% at maximum tillering + 37.5% at panicle initiation; T4: 25% basal + 25% at maximum tillering + 50% at panicle initiation). The results showed that nitrogen-related traits, total grain number, filled spikelets, and panicle number per m2 had a significant correlation with grain yield. The results of model calibration and validation indicated that the correlation between rice yield and yield components was reproduced accurately, whereas the correlation between panicle length and grain yield was minimal and the physiological indices were simulated with low accuracy. According to the results, investigating the correlation between rice physiological, morphological, and phenological traits and yield by modeling and simulation methods is very useful.
Keywords: rice, physiology, modelling, simulation, yield traits
Procedia PDF Downloads 344
298 A Group Setting of IED in Microgrid Protection Management System
Authors: Jyh-Cherng Gu, Ming-Ta Yang, Chao-Fong Yan, Hsin-Yung Chung, Yung-Ruei Chang, Yih-Der Lee, Chen-Min Chan, Chia-Hao Hsu
Abstract:
A number of distributed generations (DGs) are installed in a microgrid, which may produce diverse paths and directions of power flow or fault current. The overcurrent protection scheme for the traditional radial distribution system therefore no longer meets the needs of microgrid protection. Integrating intelligent electronic devices (IEDs) and a supervisory control and data acquisition (SCADA) system with the IEC 61850 communication protocol, this paper proposes a microgrid protection management system (MPMS) to protect the power system from faults. In the proposed method, the MPMS performs logic programming of each IED to coordinate their tripping sequence, and the GOOSE message defined in IEC 61850 is used as the transmission medium among the IEDs. Moreover, to cope with the difference in fault current between the grid-connected and islanded modes of the microgrid, the proposed MPMS applies the group-setting feature of the IEDs, giving the protection system robust adaptability. Once the microgrid topology varies, the MPMS recalculates the fault current and updates the group settings of the IEDs; if a fault occurs, the IEDs isolate it at once. Finally, the Matlab/Simulink and Elipse Power Studio software are used to simulate and demonstrate the feasibility of the proposed method.
Keywords: IEC 61850, IED, group setting, microgrid
Procedia PDF Downloads 463
297 An Optimal Control Method for Reconstruction of Topography in Dam-Break Flows
Authors: Alia Alghosoun, Nabil El Moçayd, Mohammed Seaid
Abstract:
Modeling dam-break flows over non-flat beds requires an accurate representation of the topography which is the main source of uncertainty in the model. Therefore, developing robust and accurate techniques for reconstructing topography in this class of problems would reduce the uncertainty in the flow system. In many hydraulic applications, experimental techniques have been widely used to measure the bed topography. In practice, experimental work in hydraulics may be very demanding in both time and cost. Meanwhile, computational hydraulics have served as an alternative for laboratory and field experiments. Unlike the forward problem, the inverse problem is used to identify the bed parameters from the given experimental data. In this case, the shallow water equations used for modeling the hydraulics need to be rearranged in a way that the model parameters can be evaluated from measured data. However, this approach is not always possible and it suffers from stability restrictions. In the present work, we propose an adaptive optimal control technique to numerically identify the underlying bed topography from a given set of free-surface observation data. In this approach, a minimization function is defined to iteratively determine the model parameters. The proposed technique can be interpreted as a fractional-stage scheme. In the first stage, the forward problem is solved to determine the measurable parameters from known data. In the second stage, the adaptive control Ensemble Kalman Filter is implemented to combine the optimality of observation data in order to obtain the accurate estimation of the topography. The main features of this method are on one hand, the ability to solve for different complex geometries with no need for any rearrangements in the original model to rewrite it in an explicit form. On the other hand, its achievement of strong stability for simulations of flows in different regimes containing shocks or discontinuities over any geometry. Numerical results are presented for a dam-break flow problem over non-flat bed using different solvers for the shallow water equations. The robustness of the proposed method is investigated using different numbers of loops, sensitivity parameters, initial samples and location of observations. The obtained results demonstrate high reliability and accuracy of the proposed techniques.Keywords: erodible beds, finite element method, finite volume method, nonlinear elasticity, shallow water equations, stresses in soil
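A minimal sketch of the ensemble Kalman filter analysis step that the adaptive-control loop above relies on: each ensemble member carries candidate bed parameters, a forward shallow-water solve (replaced here by a toy linear map) predicts free-surface observations, and the update nudges the ensemble toward the measurements. Dimensions, noise levels, and the forward map are all assumed.

```python
# Stochastic EnKF analysis step for parameter estimation.
import numpy as np

def enkf_update(ensemble, predicted_obs, obs, obs_noise_std, rng):
    """ensemble: (n_param, N); predicted_obs: (n_obs, N); obs: (n_obs,)."""
    N = ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)
    Y = predicted_obs - predicted_obs.mean(axis=1, keepdims=True)
    C_py = A @ Y.T / (N - 1)                                  # parameter-observation covariance
    C_yy = Y @ Y.T / (N - 1) + obs_noise_std**2 * np.eye(obs.size)
    K = C_py @ np.linalg.inv(C_yy)                            # Kalman gain
    perturbed = obs[:, None] + obs_noise_std * rng.normal(size=predicted_obs.shape)
    return ensemble + K @ (perturbed - predicted_obs)

rng = np.random.default_rng(3)
ens = rng.normal(1.0, 0.3, size=(5, 50))                      # 5 bed parameters, 50 members
pred = 2.0 * ens[:3, :] + 0.05 * rng.normal(size=(3, 50))     # toy "forward model" output
obs = np.array([2.2, 1.8, 2.0])                               # measured free-surface data
ens_new = enkf_update(ens, pred, obs, obs_noise_std=0.05, rng=rng)
print("posterior mean parameters:", ens_new.mean(axis=1).round(2))
```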
Procedia PDF Downloads 130
296 The Effect of Artificial Intelligence on Finance, Banking and Insurance
Authors: Sherine Shahat Abdelnour Bastourous
Abstract:
Banking and financial services are rapidly transitioning from monolithic structures focused solely on their own financial products to becoming integrated players in multiple customer journeys and supply chains. Banks themselves are refocusing on being liquidity providers and underwriters in those networks, while the overall idea of 'embeddedness' builds on readily available API (Application Programming Interface) architectures to flexibly deliver services to various requestors, for example online shops that need finance and insurance products to better serve their own customers. With this flexibility come new requirements for enhanced cybersecurity: API architectures are more decentralized and inherently subject to change. Unfortunately, this has not yet been comprehensively addressed in the literature. This paper attempts to fill this gap by examining security approaches and technologies relevant to the API architectures found in embedded finance. After presenting the research methodology applied and introducing the essential bodies of knowledge involved, the paper discusses six dominant technology trends shaping high-level financial services architectures. Embedded finance and the corresponding use of API strategies are then described; building on this, security concerns for APIs in financial and insurance services are elaborated before concluding with some ideas for possible further research.
Keywords: finance, non-interest, sustainability, enlightenment health, out of pocket expenditure, universal healthcare
Procedia PDF Downloads 6
295 Customer Preference in the Textile Market: Fabric-Based Analysis
Authors: Francisca Margarita Ocran
Abstract:
Underwear, and more particularly bras and panties, is defined as intimate clothing that shapes how women present themselves in the public and private spheres. Women's lingerie is therefore a complex garment with a high-involvement profile, motivating consumers to buy it not only for its functional utility but also for the multisensory experience it provides. Customer behavior models are generally based on customer data mining, and each model is designed to answer questions at a specific time, so predicting the customer experience is uncertain and difficult. Consumers' tastes in lingerie thus deserve to be treated as those for an experiential product, where the dimensions of the experience that motivate consumers to buy a lingerie product and to remain faithful to it must be analyzed in detail by manufacturers and retailers in order to engage and retain consumers. This research therefore aims to identify the variables that push consumers to choose their lingerie product, based on an in-depth analysis of the types of fabrics used to make lingerie. The data used in this study come from online purchases. A machine learning approach using the Python programming language and PyCaret gives precisions of 86.34%, 85.98%, and 84.55% for the three algorithms used to predict a buyer's preference when faced with a range of lingerie. Gradient boosting, random forest, and k-nearest neighbors were used in this study; they are very promising for classifying preference in the textile industry.
Keywords: consumer behavior, data mining, lingerie, machine learning, preference
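The abstract reports results obtained with PyCaret; the sketch below reproduces the same idea (comparing gradient boosting, random forest, and k-nearest neighbors by cross-validation) in plain scikit-learn on an invented preference dataset, so the features and accuracy figures are purely illustrative.

```python
# Compare the three classifiers named above by 5-fold cross-validation.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 6))        # e.g., encoded fabric type, elasticity, price... (assumed)
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=300) > 0).astype(int)

models = {"gradient boosting": GradientBoostingClassifier(random_state=0),
          "random forest": RandomForestClassifier(random_state=0),
          "k nearest neighbors": KNeighborsClassifier()}
for name, model in models.items():
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {score:.3f}")
```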
Procedia PDF Downloads 91
294 Second Order Optimality Conditions in Nonsmooth Analysis on Riemannian Manifolds
Authors: Seyedehsomayeh Hosseini
Abstract:
Much attention has been paid over the centuries to understanding and solving the problem of minimizing functions. Compared to linear programming and nonlinear unconstrained optimization problems, nonlinear constrained optimization problems are much more difficult. Since the procedure of finding an optimizer is a search based on local information about the constraints and the objective function, it is very important to develop techniques that use the geometric properties of the constraints and the objective function. Differential geometry provides a powerful tool to characterize and analyze these geometric properties, so there is a clear link between techniques for optimization on manifolds and standard constrained optimization approaches. Furthermore, there are manifolds that are not defined as constrained sets in R^n; an important example is the Grassmann manifold. Hence, to solve optimization problems on these spaces, intrinsic methods are used. In a nondifferentiable problem, the gradient information of the objective function generally cannot be used to determine the direction in which the function is decreasing, so techniques of nonsmooth analysis are needed to deal with such a problem. As a manifold in general does not have a linear structure, the usual techniques often used in nonsmooth analysis on linear spaces cannot be applied, and new techniques need to be developed. This paper presents necessary and sufficient conditions for a strict local minimum of extended real-valued, nonsmooth functions defined on Riemannian manifolds.
Keywords: Riemannian manifolds, nonsmooth optimization, lower semicontinuous functions, subdifferential
Procedia PDF Downloads 361
293 Effects of Artificial Intelligence Technology on Children: Positives and Negatives
Authors: Paula C. Latorre Arroyo, Andrea C. Latorre Arroyo
Abstract:
In the present society, children are exposed to and impacted by technology from very early on in various ways. Artificial intelligence (AI), in particular, directly affects them, be it positively or negatively. The concept of artificial intelligence is commonly defined as the technological programming of computers or robotic mechanisms with humanlike capabilities and characteristics. These technologies are often designed as helpful machines or disguised as handy tools that could ultimately steal private information for illicit purposes. Children, being one of the most vulnerable groups due to their lack of experience and knowledge, do not have the ability to recognize or have the malice to distinguish if an apparatus with artificial intelligence is good or bad for them. For this reason, as a society, there must be a sense of responsibility to regulate and monitor different types of uses for artificial intelligence to protect children from potential risks that might arise. This article aims to investigate the many implications that artificial intelligence has in the lives of children, starting from a home setting, within the classroom, and, ultimately, in online spaces. Irrefutably, there are numerous beneficial aspects to the use of artificial intelligence. However, due to its limitless potential and lack of specific and substantial regulations to prevent the illicit use of this technology, safety and privacy concerns surface, specifically regarding the youth. This written work aims to provide an in-depth analysis of how artificial intelligence can both help children and jeopardize their safety. Concluding with resources and data supporting the aforementioned statement.Keywords: artificial intelligence, children, privacy, rights, safety
Procedia PDF Downloads 67
292 Life Time Improvement of Clamp Structural by Using Fatigue Analysis
Authors: Pisut Boonkaew, Jatuporn Thongsri
Abstract:
In the hard disk drive manufacturing industry, eliminating unnecessary parts and qualifying part quality before assembly is important. A clamp was therefore designed and fabricated as a fixture for holding parts during the testing process. Improving such a part by trial-and-error testing consumes a long time, so simulation was introduced to improve the part and reduce the time taken. The problem is that the present clamp has a low life expectancy because of the critical stress that occurs in it. Hence, simulation was used to study the behavior of stress and compressive force in order to improve the clamp's life expectancy over all candidate designs, of which there are 27 after excluding repeated designs; the candidates were enumerated following the factorial design rules used within the Six Sigma methodology. Six Sigma is a well-structured method for improving the quality level by detecting and reducing the variability of the process, so that defects decrease while process capability increases. This research focuses on reducing stress and fatigue while the compressive force remains within the acceptable range set by the company. In the simulation, ANSYS models the 3D CAD geometry under the same conditions as the experiment, and the force at each displacement from 0.01 to 0.1 mm is recorded. The ANSYS setup was verified by a mesh-convergence study, and the percentage error relative to the experimental result was required not to exceed the acceptable range. The improvement then focuses on the angle, radius, and length values that reduce stress while keeping the force within the acceptable limits. Fatigue analysis is carried out next in the ANSYS simulation program to guarantee that the lifetime is extended, and the setup is checked against the actual clamp to observe the difference in fatigue between both designs. This yields a lifetime improvement of up to 57% compared with the actual clamp used in manufacturing. The study provides a setting precise and trustworthy enough to serve as a reference methodology for future designs. Because it combines and adapts the Six Sigma method, finite element analysis, fatigue analysis, and linear regression analysis to obtain accurate calculations, this project is expected to save up to 60 million dollars annually.
Keywords: clamp, finite element analysis, structural, six sigma, linear regressive analysis, fatigue analysis, probability
Procedia PDF Downloads 235
291 Efficient Numerical Simulation for LDC
Authors: Badr Alkahtani
Abstract:
In this poster, numerical solutions of the two-dimensional and three-dimensional lid-driven cavity are presented by solving the steady Navier-Stokes equations at high Reynolds numbers, where computation becomes difficult. The lid-driven cavity is a problem in which a fluid is contained in a square or cubic enclosure whose upper wall is moving. In two dimensions, the streamfunction-vorticity formulation is used to solve the problem in a square domain. The problem is discretized in the x and y directions with a spectral collocation method and coded in the MATLAB programming environment. Solutions at high Reynolds numbers are obtained up to Re = 20000 on a fine grid of 131 x 131. In this presentation, the numerical solutions for the three-dimensional lid-driven cavity problem are also obtained by solving the velocity-vorticity formulation of the Navier-Stokes equations (the first time this formulation has been simulated with these special boundary conditions) for various Reynolds numbers. A spectral collocation method is employed to discretize the y and z directions, and a finite difference method is used to discretize the x direction; numerical solutions are obtained for Reynolds numbers up to 200. The work presented here demonstrates the efficiency of the methods used to simulate this physical problem, with accurate simulations of the lid-driven cavity obtained at the high Reynolds numbers mentioned above. The result for the two-dimensional problem differs considerably from previously published results.
Keywords: lid driven cavity, navier-stokes, simulation, Reynolds number
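One building block behind the spectral collocation solver described above is the Chebyshev differentiation matrix on Gauss-Lobatto points; the sketch below follows the classic construction (as in Trefethen's cheb routine) and checks it on a smooth test function. The full streamfunction-vorticity cavity solver is not reproduced here.

```python
# Chebyshev differentiation matrix D on n+1 Gauss-Lobatto nodes, with a spectral-accuracy check.
import numpy as np

def cheb(n):
    if n == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(n + 1) / n)                         # Gauss-Lobatto nodes
    c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
    X = np.tile(x, (n + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))                  # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                                      # diagonal via negative row sums
    return D, x

D, x = cheb(16)
u = np.sin(np.pi * x)
print("max error of spectral derivative:",
      np.abs(D @ u - np.pi * np.cos(np.pi * x)).max())
```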
Procedia PDF Downloads 716