Search results for: AIMMS mathematical software
4893 Normalized P-Laplacian: From Stochastic Game to Image Processing
Authors: Abderrahim Elmoataz
Abstract:
More and more contemporary applications involve data in the form of functions defined on irregular and topologically complicated domains (images, meshes, point clouds, networks, etc.). Such data are not organized as familiar digital signals and images sampled on regular lattices. However, they can be conveniently represented as graphs where each vertex represents measured data and each edge represents a relationship (connectivity or certain affinities or interactions) between two vertices. Processing and analyzing these types of data is a major challenge for both the image and machine learning communities. Hence, it is very important to transfer to graphs and networks many of the mathematical tools which were initially developed on usual Euclidean spaces and proven to be efficient for many inverse problems and applications dealing with usual image and signal domains. Historically, the main tools for the study of graphs or networks come from combinatorics and graph theory. In recent years there has been an increasing interest in the investigation of one of the major mathematical tools for signal and image analysis, namely Partial Differential Equations (PDEs) and variational methods on graphs. The normalized p-Laplacian operator has been recently introduced to model a stochastic game called the tug-of-war game with noise. Part of the interest in this class of operators arises from the fact that it includes, as particular cases, the infinity Laplacian, the mean curvature operator and the traditional Laplacian operators, which have been extensively used to model and solve problems in image processing. The purpose of this paper is to introduce and study a new class of normalized p-Laplacians on graphs. The introduction is based on the extension of p-harmonious functions, introduced as discrete approximations for both infinity Laplacian and p-Laplacian equations. Finally, we propose to use these operators as a framework for solving many inverse problems in image processing.
Keywords: normalized p-laplacian, image processing, stochastic game, inverse problems
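As background (not stated in the abstract itself), the normalized or game p-Laplacian that this line of work builds on is usually written, for smooth u with non-vanishing gradient and 2 ≤ p < ∞, as a convex combination of the ordinary Laplacian and the normalized infinity Laplacian; a hedged sketch of the standard continuous form:

$$ \Delta_p^{N} u \;=\; \frac{1}{p}\,|\nabla u|^{2-p}\,\operatorname{div}\!\left(|\nabla u|^{p-2}\nabla u\right) \;=\; \frac{1}{p}\,\Delta u \;+\; \frac{p-2}{p}\,\Delta_\infty^{N} u, \qquad \Delta_\infty^{N} u \;=\; \frac{1}{|\nabla u|^{2}}\,\big\langle D^{2}u\,\nabla u,\ \nabla u \big\rangle. $$

The graph operators introduced in the paper are discrete analogues of this decomposition; the exact graph definitions are given in the paper itself.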
Procedia PDF Downloads 513
4892 Visual Simulation for the Relationship of Urban Fabric
Authors: Ting-Yu Lin, Han-Liang Lin
Abstract:
This article is about the visualization of urban form with CityEngine. A city is composed of different domains, and each domain has its own fabric because of its arrangement. For example, a neighborhood unit contains fabrics such as schools, street networks, and residential and commercial spaces. Therefore, studying urban morphology can help us understand the urban form in the planning process. Streets, plots, and buildings serve as urban fabrics, and they configure urban form. Traditionally, urban morphology usually discussed a single parameter, building type, ignoring other parameters such as streets and plots. However, urban space is three-dimensional, instead of two-dimensional. People perceive urban space through visualization. Therefore, using visualization can fill the gap between two dimensions and three dimensions. Hence, the study of urban morphology will strengthen the understanding of the whole appearance of a city. CityEngine is software which can edit, analyze and monitor data and visualize the result for GIS, a common tool to analyze data and display maps for urban planning and urban design. CityEngine can parameterize the data of streets, plots and building types and visualize the result in a three-dimensional way. The research will reproduce the real urban form through visualization, showing whether the urban form can be parameterized and whether the parameterized result matches the real urban form, and then visualizing the result in three dimensions to analyze the rules of urban form. There will be three stages in the research. It will start with a field survey of Tainan East District in Taiwan to conclude the relationships between the urban fabrics of street networks, plots and building types. Second, to visualize the relationships, it will turn them into codes which CityEngine can read. Last, CityEngine will automatically display the result through visualization.
Keywords: Cityengine, urban fabric, urban morphology, visual simulation
Procedia PDF Downloads 299
4891 The Analysis of Personalized Low-Dose Computed Tomography Protocol Based on Cumulative Effective Radiation Dose and Cumulative Organ Dose for Patients with Breast Cancer with Regular Chest Computed Tomography Follow up
Authors: Okhee Woo
Abstract:
Purpose: The aim of this study is to evaluate the 2-year cumulative effective radiation dose and cumulative organ dose from regular follow-up computed tomography (CT) scans in patients with breast cancer and to establish a personalized low-dose CT protocol. Methods and Materials: A retrospective study was performed on patients with breast cancer who were diagnosed and managed consistently on the basis of the routine breast cancer follow-up protocol between 2012-01 and 2016-06. Based on ICRP (International Commission on Radiological Protection) Publication 103, the cumulative effective radiation doses of each patient over the 2-year follow-up were analyzed using commercial radiation management software (Radimetrics, Bayer Healthcare). The personalized effective doses to each organ were analyzed in detail by the Monte Carlo simulation provided by the software. Results: A total of 3822 CT scans in 490 patients was evaluated (age: 52.32±10.69). The mean scan number for each patient was 7.8±4.54. Each patient was exposed to 95.54±63.24 mSv of radiation over 2 years. The cumulative CT radiation dose was significantly higher in patients with lymph node metastasis (p = 0.00). HER-2 positive patients were exposed to more radiation than estrogen or progesterone receptor positive patients (p = 0.00). There was no difference in the cumulative effective radiation dose among different age groups. Conclusion: Knowing how much radiation a patient has been exposed to is the starting point of managing radiation exposure for patients with long-term CT follow-up. A precise and personalized protocol, as well as iterative reconstruction, may reduce the hazard from unnecessary radiation exposure.
Keywords: computed tomography, breast cancer, effective radiation dose, cumulative organ dose
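For context (the abstract cites ICRP 103 but does not reproduce the definition), the effective dose reported by dose-tracking tools of this kind is a tissue-weighted sum of organ equivalent doses; a sketch of the standard ICRP 103 formulation, where w_T are the tissue weighting factors, w_R the radiation weighting factors, and D_{T,R} the mean absorbed dose in tissue T from radiation R:

$$ E \;=\; \sum_{T} w_T\, H_T \;=\; \sum_{T} w_T \sum_{R} w_R\, D_{T,R}, \qquad \sum_{T} w_T \;=\; 1. $$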
Procedia PDF Downloads 200
4890 Municipal Solid Waste Management Using Life Cycle Assessment Approach: Case Study of Maku City, Iran
Authors: L. Heidari, M. Jalili Ghazizade
Abstract:
This paper aims to determine the best environmental and economic scenario for Municipal Solid Waste (MSW) management in the city of Maku by using the Life Cycle Assessment (LCA) approach. The functional elements of this study are collection, transportation, and disposal of MSW in Maku city. Waste composition and density, as two key parameters of MSW, have been determined by field sampling, and then the other important specifications of MSW, like chemical formula, thermal energy and water content, were calculated. These data, besides other information related to collection and disposal facilities, are used as a reliable source of data to assess the environmental impacts of different waste management options, including landfills, composting, recycling and energy recovery. The environmental impact of the MSW management options has been investigated in 15 different scenarios with the Integrated Waste Management (IWM) software. The photochemical smog, greenhouse gases, acid gases, toxic emissions, and energy consumption of each scenario are measured. Then, the environmental indices of each scenario are specified by weighting these parameters. Economic costs of the scenarios have also been compared with each other based on the literature. As a final result, since organic materials make up more than 80% of the waste, composting can be a suitable method. Although the major part of the remaining 20% of the waste can be recycled, due to the high cost of the necessary equipment, the landfill option has been suggested. Therefore, the scenario with 80% composting and 20% landfilling is selected as the superior environmental and economic scenario. This study shows that, to select a scenario with practical applications, the environmental and economic aspects of different scenarios must be considered simultaneously.
Keywords: IWM software, life cycle assessment, Maku, municipal solid waste management
Procedia PDF Downloads 241
4889 Review and Analysis of Parkinson's Tremor Genesis Using Mathematical Model
Authors: Pawan Kumar Gupta, Sumana Ghosh
Abstract:
Parkinson's Disease (PD) is a long-term neurodegenerative movement disorder of the central nervous system with vast symptoms related to the motor system. The common symptoms of PD are tremor, rigidity, bradykinesia/akinesia, and postural instability, but the clinical picture includes other motor and non-motor issues. The motor symptoms of the disease are a consequence of the death of neurons in a region of the midbrain known as the substantia nigra pars compacta, leading to a decreased level of a neurotransmitter known as dopamine. The cause of this neuron death is not clearly known but involves the formation of Lewy bodies, an abnormal aggregation or clumping of the protein alpha-synuclein in the neurons. Unfortunately, there is no cure for PD, and the management of this disease is challenging. Therefore, it is critical for a patient to be diagnosed at early stages. A limited choice of drugs is available to improve the symptoms, but those become less and less effective over time. Apart from that, with rapid growth in the field of science and technology, other methods such as multi-area brain stimulation are used to treat patients. In order to develop advanced techniques and to support drug development for treating PD patients, an accurate mathematical model is needed to explain the underlying relationship of dopamine secretion in the brain with the hand tremors. There has been a lot of effort in the past few decades on modeling PD tremors and treatment effects from a computational point of view. These models can effectively save time as well as the cost of drug development for the pharmaceutical industry and be helpful for selecting appropriate treatment mechanisms among all possible options. In this review paper, an effort is made to investigate studies on PD modeling and analysis and to highlight some of the key advances in the field over the past decades, with a discussion of the current challenges.
Keywords: Parkinson's disease, deep brain stimulation, tremor, modeling
Procedia PDF Downloads 141
4888 Effect of 3-Dimensional Knitted Spacer Fabrics Characteristics on Its Thermal and Compression Properties
Authors: Veerakumar Arumugam, Rajesh Mishra, Jiri Militky, Jana Salacova
Abstract:
The thermo-physiological comfort and compression properties of knitted spacer fabrics have been evaluated by varying the different spacer fabric parameters. Air permeability and water vapor transmission of the fabrics were measured using the Textest FX-3300 air permeability tester and PERMETEST. The thermal behavior of the fabrics was then obtained with a thermal conductivity analyzer, and overall moisture management capacity was evaluated with a moisture management tester. The spacer fabrics' compression properties were also tested using the Kawabata Evaluation System (KES-FB3). In the KES testing, the compression resilience, work of compression, linearity of compression and other parameters were calculated from the pressure-thickness curves. Analysis of Variance (ANOVA) was performed using the statistical software QC Expert Trilobite and Darwin in order to compare the influence of different fabric parameters on the thermo-physiological and compression behavior of the samples. This study established that the raw materials, type of spacer yarn, density, thickness and tightness of the surface layer have a significant influence on both thermal conductivity and work of compression in spacer fabrics. The parameter which mainly influences the water vapor permeability of these fabrics is the properties of the raw material, i.e. the wetting and wicking properties of the fibers. The Pearson correlation between the moisture capacity of the fabrics and water vapour permeability was also determined with the same software. These findings are important requirements for the further design of clothing for extreme environmental conditions.
Keywords: 3D spacer fabrics, thermal conductivity, moisture management, work of compression (WC), resilience of compression (RC)
Procedia PDF Downloads 546
4887 A Novel Rapid Well Control Technique Modelled in Computational Fluid Dynamics Software
Authors: Michael Williams
Abstract:
The ability to control a flowing well is of the utmost importance. During the kill phase, heavy weight kill mud is circulated around the well. While this increases bottom hole pressure, near-wellbore formation damage is also increased. The addition of high density spherical objects has the potential to minimise this near-wellbore damage, increase bottom hole pressure and reduce the operational time needed to kill the well. This operational time saving comes from the rapid deployment of high density spherical objects instead of building high density drilling fluid. The research aims to model the well kill process using Computational Fluid Dynamics software. A model has been created as a proof of concept to analyse the flow of micron-sized spherical objects in the drilling fluid. Initial results show that this new methodology of spherical objects in drilling fluid agrees with traditional streamlines seen in non-particle flow. Additional models have been created to demonstrate that areas of higher flow rate around the bit can lead to an increased probability of washout of formations but do not affect the flow of micron-sized spherical objects. Interestingly, areas that experience dimensional changes, such as tool joints and various BHA components, do not appear at this initial stage to experience increased velocity or create areas of turbulent flow, which could contribute to further borehole stability. In conclusion, the initial models of this novel well control methodology have not demonstrated any adverse flow patterns, which suggests that this model may be viable under field conditions.
Keywords: well control, fluid mechanics, safety, environment
Procedia PDF Downloads 174
4886 Micro-Meso 3D FE Damage Modelling of Woven Carbon Fibre Reinforced Plastic Composite under Quasi-Static Bending
Authors: Aamir Mubashar, Ibrahim Fiaz
Abstract:
This research presents a three-dimensional finite element modelling strategy to simulate damage in a quasi-static three-point bending analysis of a woven twill 2/2 type carbon fibre reinforced plastic (CFRP) composite on a micro-meso level using the cohesive zone modelling technique. A meso-scale finite element model comprised of a number of plies was developed in the commercial finite element code Abaqus/Explicit. The interfaces between the plies were explicitly modelled using cohesive zone elements to allow for debonding by crack initiation and propagation. The load-deflection response of the CFRP within the quasi-static range was obtained and compared with the data existing in the literature. This provided validation of the model at the global scale. The outputs resulting from the global model were then used to develop a simulation model capturing the micro-meso scale material features. The sub-model consisted of a refined-mesh representative volume element (RVE) modelled in the TexGen software, which was later embedded with cohesive elements in the finite element software environment. The results obtained from the developed strategy were successful in predicting the overall load-deflection response and the damage in the global and sub-model at the flexure limit of the specimen. A detailed analysis of the effects of the micro-scale features was carried out.
Keywords: woven composites, multi-scale modelling, cohesive zone, finite element model
Procedia PDF Downloads 141
4885 Intelligent Control of Bioprocesses: A Software Application
Authors: Mihai Caramihai, Dan Vasilescu
Abstract:
The main research objective of the experimental bioprocess analyzed in this paper was to obtain large biomass quantities. The bioprocess is performed in a 100 L Bioengineering bioreactor with 42 L of cultivation medium made of peptone, meat extract and sodium chloride. The reactor was equipped with pH, temperature, dissolved oxygen, and agitation controllers. The operating parameters were 37 °C, 1.2 atm, 250 rpm and an air flow rate of 15 L/min. The main objective of this paper is to present a case study to demonstrate that intelligent control, describing the complexity of the biological process in a qualitative and subjective manner as perceived by the human operator, is an efficient control strategy for this kind of bioprocess. In order to simulate the bioprocess evolution, an intelligent control structure based on fuzzy logic has been designed. The specific objective is to present a fuzzy control approach based on human expert rules versus a modeling approach of the cell growth based on bioprocess experimental data. Kinetic modeling may represent overall biosystem behavior for only a small number of bioprocesses, while a fuzzy control system (FCS) can manipulate incomplete and uncertain information about the process, assuring high control performance, and provides an alternative solution to non-linear control as it is closer to the real world. Due to the high degree of non-linearity and time variance of bioprocesses, the need for a control mechanism arises. BIOSIM, an originally developed software package, implements such a control structure. The simulation study has shown that the fuzzy technique is quite appropriate for this non-linear, time-varying system compared with the classical control method based on an a priori model.
Keywords: intelligent, control, fuzzy model, bioprocess optimization
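The abstract does not list the actual rule base of BIOSIM, so the following is only a minimal sketch of the kind of Mamdani-style fuzzy logic it describes: triangular membership functions over a measured variable (here, an assumed dissolved-oxygen error) and expert rules that adjust the agitation rate. All names, ranges and rule consequents are illustrative assumptions, not the authors' implementation.

```python
# Minimal Mamdani-style fuzzy controller sketch (illustrative only; not the BIOSIM rule base).

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_agitation(do_error):
    """Map dissolved-oxygen error (setpoint - measured, in %) to an agitation change (rpm).
    Assumed rules: error negative -> decrease rpm, near zero -> hold, positive -> increase rpm."""
    # Fuzzify the input.
    neg  = tri(do_error, -20.0, -10.0, 0.0)
    zero = tri(do_error, -5.0, 0.0, 5.0)
    pos  = tri(do_error, 0.0, 10.0, 20.0)
    # Each rule fires a crisp consequent (singleton outputs, in rpm).
    rules = [(neg, -30.0), (zero, 0.0), (pos, +30.0)]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0  # weighted-average defuzzification

if __name__ == "__main__":
    for e in (-12.0, -2.0, 0.0, 3.0, 15.0):
        print(f"DO error {e:+.1f}% -> agitation change {fuzzy_agitation(e):+.1f} rpm")
```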
Procedia PDF Downloads 328
4884 Effect of Wettability Alteration on Production Performance in Unconventional Tight Oil Reservoirs
Authors: Rashid S. Mohammad, Shicheng Zhang, Xinzhe Zhao
Abstract:
In tight oil reservoirs, wettability alteration has generally been considered as an effective way to remove fracturing fluid retention on the surface of the fracture and consequently improve oil production. However, there is a lack of a reliable productivity prediction model to show the relationship between wettability and oil production in tight oil wells. In this paper, a new oil productivity prediction model of immiscible oil-water flow and miscible CO₂-oil flow accounting for wettability is developed. This mathematical model is established by considering two different length scales: the nanoporous network and the propped fractures. CO₂ diffusion in the nanoporous network and high-velocity non-Darcy flow in the propped fractures are considered by taking into account the effect of wettability alteration on capillary pressure and relative permeability. A laboratory experiment is also conducted here to validate this model. Laboratory experiments have been designed to compare the water saturation profiles for different contact angles, revealing the fluid retention in rock pores that affects capillary force and relative permeability. Four kinds of brines with different concentrations are selected here to create different contact angles. In water-wet porous media, as the system becomes more oil-wet, water saturation decreases. As a result, oil relative permeability increases. On the other hand, capillary pressure, which is the resistance to oil flow, increases as well. The oil production change due to wettability alteration is the result of the combined changes of oil relative permeability and capillary pressure. The results indicate that wettability is a key factor for fracturing fluid retention removal and oil enhancement in tight reservoirs. By incorporating laboratory tests into a mathematical model, this work shows that the relationship between wettability and oil production is not a simple linear pattern but a parabolic one. Additionally, it can be used for a better understanding of the optimization design of fracturing fluids.
Keywords: wettability, relative permeability, fluid retention, oil production, unconventional and tight reservoirs
Procedia PDF Downloads 236
4883 Data-Driven Simulation Tools for DER and Battery Rich Power Grids
Authors: Ali Moradiamani, Samaneh Sadat Sajjadi, Mahdi Jalili
Abstract:
Power system analysis has been a major research topic in the generation and distribution sections, in both industry and academia, for a long time. Several load flow and fault analysis scenarios are normally performed to study the performance of different parts of the grid in the context of, for example, voltage and frequency control. Software tools, such as PSCAD, PSSE, and PowerFactory DIgSILENT, have been developed to perform these analyses accurately. The distribution grid had been the passive part of the grid and had been known as the grid of consumers. However, a significant paradigm shift has happened with the emergence of Distributed Energy Resources (DERs) at the distribution level. It means that the concept of power system analysis needs to be extended to the distribution grid, especially considering self-sufficient technologies such as microgrids. Compared to the generation and transmission levels, the distribution level includes significantly more generation/consumption nodes thanks to rooftop PV solar generation and battery energy storage systems. In addition, diverse consumption profiles are expected from household residents, resulting in a diverse set of scenarios. The emergence of electric vehicles will make the environment even more complicated considering their charging (and possibly discharging) requirements. These complexities, as well as the large size of distribution grids, create challenges for the available power system analysis software. In this paper, we study the requirements of simulation tools for the distribution grid and how data-driven algorithms are required to increase the accuracy of the simulation results.
Keywords: smart grids, distributed energy resources, electric vehicles, battery storage systems, simulation tools
Procedia PDF Downloads 106
4882 A Survey on Compression Methods for Table Constraints
Authors: N. Gharbi
Abstract:
Constraint Satisfaction Problems are mathematical problems that are often used to model many real-world problems, for which we check whether there exists a solution satisfying all the constraints. Table constraints are important for modeling parts of many problems since they list all combinations of allowed or forbidden values. However, they have practical limitations because they are sometimes too large to be represented in a direct way. In this paper, we present a survey of the different categories of approaches proposed to compress table constraints in order to reduce both space and time complexities.
Keywords: constraint programming, compression, data mining, table constraints
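The survey itself does not fix a single compression scheme, so the snippet below is only an illustrative sketch of one classic idea it covers: merging ordinary tuples that agree on all but one variable into "compressed tuples" whose entries are sets of values (each compressed tuple denotes the Cartesian product of its sets). The relation and helper names are assumptions made for the example.

```python
# Illustrative compression of a positive table constraint into compressed tuples
# (tuples whose entries are sets of values), merging tuples that differ in one position.
from itertools import combinations

def compress(table):
    """table: list of equal-length tuples of values. Returns compressed tuples of frozensets."""
    # Start with every ordinary tuple as a compressed tuple of singleton sets.
    comp = [tuple(frozenset([v]) for v in t) for t in table]
    changed = True
    while changed:
        changed = False
        for a, b in combinations(range(len(comp)), 2):
            ta, tb = comp[a], comp[b]
            diff = [i for i in range(len(ta)) if ta[i] != tb[i]]
            if len(diff) == 1:           # identical except at one position -> merge value sets there
                i = diff[0]
                merged = list(ta)
                merged[i] = ta[i] | tb[i]
                comp[a] = tuple(merged)
                del comp[b]
                changed = True
                break
    return comp

if __name__ == "__main__":
    # Allowed combinations for (x, y, z); 4 ordinary tuples compress into 2 compressed tuples.
    table = [(1, 2, 3), (1, 2, 4), (1, 2, 5), (2, 2, 3)]
    for ct in compress(table):
        print([sorted(s) for s in ct])
```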
Procedia PDF Downloads 326
4881 Data Transformations in Data Envelopment Analysis
Authors: Mansour Mohammadpour
Abstract:
Data transformation refers to the modification of any point in a data set by a mathematical function. When applying transformations, the measurement scale of the data is modified. Data transformations are commonly employed to turn data into the appropriate form, which can serve various functions in the quantitative analysis of the data. This study addresses the investigation of the use of data transformations in Data Envelopment Analysis (DEA). Although data transformations are important options for analysis, they do fundamentally alter the nature of the variable, making the interpretation of the results somewhat more complex.Keywords: data transformation, data envelopment analysis, undesirable data, negative data
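As a small illustration of the kind of transformation the abstract refers to (the specific transformations studied are not listed here), a common fix for negative data before a DEA run is a translation by a constant large enough to make every value positive; such a shift changes the measurement scale and must be paired with a translation-invariant DEA model. The numbers below are made-up example data.

```python
# Illustrative translation transformation for negative DEA data (example values are made up).
import numpy as np

outputs = np.array([[ 5.0, -2.0],
                    [ 3.0,  1.5],
                    [-1.0,  4.0]])   # rows = DMUs, columns = output measures

# Translate each column so its minimum becomes strictly positive.
shift = np.where(outputs.min(axis=0) <= 0, -outputs.min(axis=0) + 1.0, 0.0)
translated = outputs + shift

print("column shifts:", shift)   # here [2. 3.]
print(translated)                # all entries now > 0, ready for a translation-invariant DEA model
```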
Procedia PDF Downloads 27
4880 Tracing Digital Traces of Phatic Communion in #Mooc
Authors: Judith Enriquez-Gibson
Abstract:
This paper meddles with the notion of phatic communion introduced 90 years ago by Malinowski, who was a Polish-born British anthropologist. It explores the phatic in Twitter within the contents of tweets related to moocs (massive online open courses) as a topic or trend. It is not about moocs though. It is about practices that could easily be hidden or neglected if we let big or massive topics take the lead or if we simply follow the computational or secret codes behind Twitter itself and third party software analytics. It draws from media and cultural studies. Though at first it appears data-driven as I submitted data collection and analytics into the hands of a third party software, Twitonomy, the aim is to follow how phatic communion might be practised in a social media site, such as Twitter. Lurking becomes its research method to analyse mooc-related tweets. A total of 3,000 tweets were collected on 11 October 2013 (UK timezone). The emphasis of lurking is to engage with Twitter as a system of connectivity. One interesting finding is that a click is in fact a phatic practice. A click breaks the silence. A click in one of the mooc website is actually a tweet. A tweet was posted on behalf of a user who simply chose to click without formulating the text and perhaps without knowing that it contains #mooc. Surely, this mechanism is not about reciprocity. To break the silence, users did not use words. They just clicked the ‘tweet button’ on a mooc website. A click performs and maintains connectivity – and Twitter as the medium in attendance in our everyday, available when needed to be of service. In conclusion, the phatic culture of breaking silence in Twitter does not have to submit to the power of code and analytics. It is a matter of human code.Keywords: click, Twitter, phatic communion, social media data, mooc
Procedia PDF Downloads 414
4879 Mathematical Modelling of Bacterial Growth in Products of Animal Origin in Storage and Transport: Effects of Temperature, Use of Bacteriocins and pH Level
Authors: Benjamin Castillo, Luis Pastenes, Fernando Cordova
Abstract:
Pathogen growth in animal source foods is a common problem in the food industry, causing monetary losses due to the spoiling of products or food intoxication outbreaks in the community. In this sense, the quality of the product is reflected by the population of deteriorating agents present in it, which are mainly bacteria. The factors most likely associated with freshness in animal source foods are temperature and processing, storage, and transport times. However, the level of deterioration of products depends, in turn, on the characteristics of the bacterial population causing the decomposition or spoiling, such as pH level and toxins. Knowing the growth dynamics of the agents that are involved in product contamination allows monitoring for more efficient processing. This means better quality and reasonable costs, along with a better estimation of the necessary time and temperature intervals for transport and storage in order to preserve product quality. The objective of this project is to design a secondary model that allows measuring the impact of temperature on bacterial growth, together with the competition through pH adaptation and the release of bacteriocins, in order to describe this phenomenon and, thus, estimate food product half-life with the least possible risk of deterioration or spoiling. In order to achieve this objective, the authors propose the analysis of a three-dimensional system of ordinary differential equations which includes: logistic bacterial growth extended by the inhibitory action of bacteriocins, including the effect of the medium pH; change in the medium pH levels through an adaptation of the Luedeking-Piret kinetic model; and bacteriocin concentration modeled similarly to pH levels. These three dimensions are influenced by the temperature at all times. This differential system is then expanded, taking into consideration variable temperature and the concentration of pulsed bacteriocins, which represent characteristics inherent to the modeling, such as transport and storage, as well as the incorporation of substances that inhibit bacterial growth. The main results show that temperature changes in an early stage of transport increased the bacterial population significantly more than if they had occurred during the final stage. On the other hand, the incorporation of bacteriocins, as in other investigations, proved to be efficient in the short and medium term since, although the population of bacteria decreased, once the bacteriocins were depleted or degraded over time, the bacteria eventually returned to their regular growth rate. The efficacy of the bacteriocins at low temperatures decreased slightly, which is consistent with the fact that their natural degradation rate also decreased. In summary, the implementation of the mathematical model allowed the simulation of a set of possible bacteria present in animal-based products, along with their properties, in various transport and storage situations, which led us to state that, for inhibiting bacterial growth, the optimum is a combination of constant low temperatures and the initial use of bacteriocins.
Keywords: bacterial growth, bacteriocins, mathematical modelling, temperature
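The abstract describes the structure of the model (logistic growth with bacteriocin inhibition, a Luedeking-Piret-type pH equation, and a bacteriocin equation, all temperature-dependent) but not its exact equations or coefficients, so the sketch below is only one plausible reading of that structure; every rate expression and parameter value is an assumption made for illustration, not the authors' calibrated model.

```python
# Hedged sketch of a 3-D ODE system in the spirit of the abstract:
# B = bacterial population, P = bacteriocin concentration, H = medium pH.
# All functional forms and parameter values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def mu(T):                                         # assumed temperature dependence of the growth rate
    return 0.05 * np.exp(0.1 * (T - 4.0))

def rhs(t, y, T):
    B, P, H = y
    growth = mu(T) * B * (1.0 - B / 1e9)           # logistic growth toward an assumed carrying capacity
    inhibition = 0.002 * P * B                     # kill term from bacteriocins
    dB = growth - inhibition
    dP = -0.01 * P                                 # natural bacteriocin degradation
    dH = -(0.001 * growth + 0.0001 * B) / 1e6      # Luedeking-Piret-type acidification of the medium
    return [dB, dP, dH]

if __name__ == "__main__":
    T_storage = 8.0                                # storage temperature in deg C (assumed constant here)
    y0 = [1e3, 50.0, 6.5]                          # initial B (CFU), P (arbitrary units), pH
    sol = solve_ivp(rhs, (0.0, 72.0), y0, args=(T_storage,), dense_output=True)
    for t in (0, 24, 48, 72):
        B, P, H = sol.sol(t)
        print(f"t = {t:2d} h: B = {B:10.1f}, bacteriocin = {P:6.2f}, pH = {H:5.3f}")
```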
Procedia PDF Downloads 139
4878 Geometric Design to Improve the Temperature
Authors: H. Ghodbane, A. A. Taleb, O. Kraa
Abstract:
This paper presents the geometric design of an induction heating system. The objective of this design is to improve the temperature distribution in the load. The study of such a device requires the use of models or modeling representations: physical, mathematical, and numerical. This modeling is the basis of the understanding, design, and optimization of these systems. The optimization technique consists of finding the values of the variables that maximize or minimize the objective function.
Keywords: optimization, modeling, geometric design system, temperature increase
Procedia PDF Downloads 533
4877 Fast Switching Mechanism for Multicasting Failure in OpenFlow Networks
Authors: Alaa Allakany, Koji Okamura
Abstract:
Multicast technology is an efficient and scalable technology for data distribution that optimizes network resources. However, in the IP network, the responsibility for the management of multicast groups is distributed among network routers, which causes some limitations such as delays in processing group events, high bandwidth consumption and redundant tree calculation. Software Defined Networking (SDN), represented by OpenFlow, is presented as a solution for many of these problems; in SDN the control plane and data plane are separated by shifting the control and management to a remote centralized controller, and the routers are used as forwarders only. In this paper we propose a fast switching mechanism for solving the problem of link failure in the multicast tree, based on the Tabu Search heuristic algorithm and on modifying the functions of the OpenFlow switch to switch quickly to the backup sub-tree rather than sending to the controller. In this work we implement a multicasting OpenFlow controller; this centralized controller is a core part of our multicasting approach and is responsible for (1) constructing the multicast tree and (2) handling the multicast group events and multicast state maintenance. Finally, OpenFlow switch functions are modified for fast switching to backup paths. Forwarders forward the multicast packets based on multicast routing entries which were generated by the centralized controller. Tabu Search is used as the heuristic algorithm for constructing a near-optimum multicast tree and for keeping the tree near optimum in case any members join or leave the multicast group (group events).
Keywords: multicast tree, software define networks, tabu search, OpenFlow
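The abstract names Tabu Search but does not give its neighborhood or cost function, so the following is only a generic sketch of how a tabu search over candidate multicast trees might look: each group member is assigned one of several pre-computed candidate paths from the source, the cost is the number of distinct edges used (favoring shared branches), and recently reversed assignments are kept tabu. The topology, path sets, and parameters are invented for the example and are not taken from the paper.

```python
# Generic tabu-search sketch for multicast tree construction (illustrative only).
# Each member picks one candidate path from the source; cost = number of distinct edges used.
import random

def tree_cost(assignment, candidate_paths):
    edges = set()
    for member, k in assignment.items():
        edges.update(candidate_paths[member][k])
    return len(edges)

def tabu_search(candidate_paths, iterations=200, tabu_tenure=5, seed=0):
    rng = random.Random(seed)
    members = list(candidate_paths)
    assignment = {m: 0 for m in members}               # start with each member's first candidate path
    best, best_cost = dict(assignment), tree_cost(assignment, candidate_paths)
    tabu = {}                                          # (member, path index) -> iteration until which it is tabu
    for it in range(iterations):
        moves = [(m, k) for m in members
                 for k in range(len(candidate_paths[m])) if k != assignment[m]]
        rng.shuffle(moves)
        chosen, chosen_cost = None, None
        for m, k in moves:
            trial = dict(assignment); trial[m] = k
            c = tree_cost(trial, candidate_paths)
            if tabu.get((m, k), -1) >= it and c >= best_cost:
                continue                               # tabu move and no aspiration improvement
            if chosen is None or c < chosen_cost:
                chosen, chosen_cost = (m, k), c
        if chosen is None:
            break
        m, k = chosen
        tabu[(m, assignment[m])] = it + tabu_tenure    # forbid moving back for a few iterations
        assignment[m] = k
        if chosen_cost < best_cost:
            best, best_cost = dict(assignment), chosen_cost
    return best, best_cost

if __name__ == "__main__":
    # Candidate paths (edge lists) from source s to two members, on a small made-up topology.
    paths = {
        "m1": [[("s", "a"), ("a", "m1")], [("s", "b"), ("b", "m1")]],
        "m2": [[("s", "b"), ("b", "m2")], [("s", "a"), ("a", "m2")]],
    }
    best, cost = tabu_search(paths)
    print("chosen paths:", best, "shared-edge cost:", cost)
```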
Procedia PDF Downloads 264
4876 Inverse Matrix in the Theory of Dynamical Systems
Authors: Renata Masarova, Bohuslava Juhasova, Martin Juhas, Zuzana Sutova
Abstract:
In dynamical system theory, a mathematical model is often used to describe system properties. In order to find the transfer matrix of a dynamic system we need to calculate an inverse matrix. The paper contains a fusion of the classical theory and the procedures used in the theory of automated control for calculating the inverse matrix. The final part of the paper models the given problem in Matlab.
Keywords: dynamic system, transfer matrix, inverse matrix, modeling
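For reference, the inverse matrix in question is the resolvent (sI - A)^{-1} of the state matrix; the standard transfer-matrix relation for a linear state-space model (general background, not a result specific to this paper) is:

$$ \dot{x} = Ax + Bu,\quad y = Cx + Du \;\;\Longrightarrow\;\; G(s) = C\,(sI - A)^{-1}B + D = C\,\frac{\operatorname{adj}(sI - A)}{\det(sI - A)}\,B + D. $$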
Procedia PDF Downloads 518
4875 A Theoretical and Experimental Evaluation of a Solar-Powered Off-Grid Air Conditioning System for Residential Buildings
Authors: Adam Y. Sulaiman, Gerard I.Obasi, Roma Chang, Hussein Sayed Moghaieb, Ming J. Huang, Neil J. Hewitt
Abstract:
Residential air-conditioning units are essential for quality indoor comfort in hot-climate countries. Nevertheless, because of their non-renewable energy sources and the contribution of ecologically unfriendly working fluids, these units are a major source of CO2 emissions in these countries. The utilisation of sustainable technologies is nowadays essential to reduce the adverse effects of CO2 emissions by replacing conventional technologies. This paper investigates the feasibility of running an off-grid solar-powered air-conditioning bed unit using three low-GWP refrigerants (R32, R290, and R600a) to supersede conventional refrigerants. A prototype air-conditioning unit was built and designed to distribute cold air to a canopy connected to it. The system is powered by two 400 W photovoltaic panels, with battery storage supplying power to the unit at night-time. Engineering Equation Solver (EES) software is used to mathematically model the vapor compression cycle (VCC) and predict the unit's energetic and exergetic performance. The TRNSYS software was used to simulate the electricity storage performance of the batteries, whereas IES-VE was used to determine the amount of solar energy required to power the unit. The article provides an analytical design guideline as well as a comprehensible process description. Combining a renewable energy source with a VCC-based air-conditioning unit provides an excellent solution to the real problems of high energy consumption in warm-climate countries.
Keywords: air-conditioning, refrigerants, PV panel, energy storages, VCC, exergy
Procedia PDF Downloads 178
4874 Real-Time Working Environment Risk Analysis with Smart Textiles
Authors: Jose A. Diaz-Olivares, Nafise Mahdavian, Farhad Abtahi, Kaj Lindecrantz, Abdelakram Hafid, Fernando Seoane
Abstract:
Despite new recommendations and guidelines for the evaluation of occupational risk assessments and their prevention, work-related musculoskeletal disorders are still one of the biggest causes of work activity disruption, productivity loss, sick leave and chronic work disability. It affects millions of workers throughout Europe, with a large-scale economic and social burden. These specific efforts have failed to produce significant results yet, probably due to the limited availability and high costs of occupational risk assessment at work, especially when the methods are complex, consume excessive resources or depend on self-evaluations and observations of poor accuracy. To overcome these limitations, a pervasive system of risk assessment tools in real time has been developed, which has the characteristics of a systematic approach, with good precision, usability and resource efficiency, essential to facilitate the prevention of musculoskeletal disorders in the long term. The system allows the combination of different wearable sensors, placed on different limbs, to be used for data collection and evaluation by a software solution, according to the needs and requirements in each individual working environment. This is done in a non-disruptive manner for both the occupational health expert and the workers. The creation of this solution allows us to attend different research activities that require, as an essential starting point, the recording of data with ergonomic value of very diverse origin, especially in real work environments. The software platform is here presented with a complimentary smart clothing system for data acquisition, comprised of a T-shirt containing inertial measurement units (IMU), a vest sensorized with textile electronics, a wireless electrocardiogram (ECG) and thoracic electrical bio-impedance (TEB) recorder and a glove sensorized with variable resistors, dependent on the angular position of the wrist. The collected data is processed in real-time through a mobile application software solution, implemented in commercially available Android-based smartphones and tablet platforms. Based on the collection of this information and its analysis, real-time risk assessment and feedback about postural improvement is possible, adapted to different contexts. The result is a tool which provides added value to ergonomists and occupational health agents, as in situ analysis of postural behavior can assist in a quantitative manner in the evaluation of work techniques and the occupational environment.Keywords: ergonomics, mobile technologies, risk assessment, smart textiles
Procedia PDF Downloads 119
4873 Determination of the Water Needs of Some Crops Irrigated with Treated Water from the Sidi Khouiled Wastewater Treatment Plant in Ouargla, Algeria
Authors: Dalila Oulhaci, Mehdi Benlarbi, Mohammed Zahaf
Abstract:
The irrigation method is fundamental for maintaining a wet bulb around the roots of the crop. This is the case with localized irrigation, where soil moisture can be maintained permanently around the root system between the two water content extremes. Also, one of the oldest methods, used since Roman times throughout North Africa and the Near East, is based on the frequent pouring of water into porous pottery vases buried in the ground. In this context, these two techniques have been combined by replacing the pottery vase with plastic bottles filled with sand that discharge water through their perforated walls into the surrounding soil. The first objective of this work is the theoretical determination, using the CLIMWAT and CROPWAT software, of the irrigation doses of some crops (palm, wheat, and onion), and their experimental determination by measuring the moisture of the soil before and after watering. The second objective is to determine the purifying power of the sand filter in the bottle. Based on the CROPWAT software results, the date palm needs 18.5 mm in the third decade of December, 57.2 mm in January, and 73.7 mm in February, whereas the doses received, determined experimentally by means of soil moisture before and after irrigation, are 19.5 mm, 79.66 mm and 95.66 mm respectively. The onion needs 14.3 mm in the third decade of December, 59.1 mm in January, and 80 mm in February, whereas the experimental doses received are 15.07 mm, 64.54 mm and 86.8 mm respectively. The total requirements for the vegetative period are estimated at 1642.6 mm for date palms, 277.4 mm for wheat, and 193.5 mm for onions. The removal rate of the majority of pollutants by the bottle is 80%. This work covers, on the one hand, the context of water conservation, sustainable development, and protection of the environment, and on the other, the agricultural field.
Keywords: irrigation, sand, filter, humidity, bottle
Procedia PDF Downloads 68
4872 Investigating the Feasibility of Promoting Safety in Civil Projects by BIM System Using Fuzzy Logic
Authors: Mohammad Reza Zamanian
Abstract:
The construction industry has always been recognized as one of the most dangerous industries, and the statistics of accidents and injuries resulting from it show that the safety category needs more attention and the arrival of up-to-date technologies in this field. Building Information Modeling (BIM) is one of the relatively new and applicable technologies in Iran, and the necessity of using it is increasingly evident. The main purposes of this research are to evaluate the feasibility of using this technology in the safety sector of construction projects and to evaluate the effectiveness and operationality of its various applications in this sector. These applications were collected and categorized after reviewing past studies and research; then a questionnaire based on Delphi method criteria was presented to 30 experts who were thoroughly familiar with modeling software and safety guidelines. After receiving the answers and exporting them to SPSS software, the validity and reliability of the questionnaire were assessed to evaluate the measuring tool. Fuzzy logic is a good way to analyze data because of its flexibility in dealing with ambiguity and uncertainty issues, and the implementation of the Delphi method in a fuzzy environment overcomes the uncertainties in decision making. Therefore, this method was used for data analysis, and the results indicate the usefulness and effectiveness of BIM in projects and the improvement of safety status at different stages of construction. Finally, the applications and the sections discussed were ranked in order of priority for efficiency and effectiveness. Safety planning is considered the most influential of the four BIM safety sectors discussed, and planning for the installation of protective fences and barriers to prevent falls, together with site layout planning with a safety approach based on a 3D model, are the most important of the 18 applications for improving the safety of construction projects.
Keywords: building information modeling, safety of construction projects, Delphi method, fuzzy logic
Procedia PDF Downloads 169
4871 An Integrated Mathematical Approach to Measure the Capacity of MMTS
Authors: Bayan Bevrani, Robert L. Burdett, Prasad K. D. V. Yarlagadda
Abstract:
This article focuses upon multi-modal transportation systems (MMTS) and the issues surrounding the determination of system capacity. For that purpose a multi-objective framework is advocated that integrates all the different modes and many different competing capacity objectives. This framework is analytical in nature and facilitates a variety of capacity querying and capacity expansion planning.Keywords: analytical model, capacity analysis, capacity query, multi-modal transportation system (MMTS)
Procedia PDF Downloads 363
4870 Implementing 3D Printing for 3D Digital Modeling in the Classroom
Authors: Saritdikhun Somasa
Abstract:
3D printing fabrication has empowered many artists in many fields. Artists who work in stop motion, 3D modeling, toy design, product design, sculpture, and fine arts become one-stop shop operations–where they can design, prototype, and distribute their designs for commercial or fine art purposes. The author has developed a digital sculpting course that fosters digital software, peripheral hardware, and 3D printing with traditional sculpting concept techniques to address the complexities of this multifaceted process, allowing the students to produce complex 3d-printed work. The author will detail the preparation and planning for pre- to post-process 3D printing elements, including software, materials, space, equipment, tools, and schedule consideration for small to medium figurine design statues in a semester-long class. In addition, the author provides insight into teaching challenges in the non-studio space that requires students to work intensively on post-printed models to assemble parts, finish, and refine the 3D printed surface. Even though this paper focuses on the 3D printing processes and techniques for small to medium design statue projects for the Digital Media program, the author hopes the paper will benefit other fields of study such as craft practices, product design, and fine-arts programs. Other schools that might implement 3D printing and fabrication in their programs will find helpful information in this paper, such as a teaching plan, choices of equipment and materials, adaptation for non-studio spaces, and putting together a complete and well-resolved project for students.Keywords: 3D digital modeling, 3D digital sculpting, 3D modeling, 3D printing, 3D digital fabrication
Procedia PDF Downloads 106
4869 Numerical Simulation of a Combined Impact of Cooling and Ventilation on the Indoor Environmental Quality
Authors: Matjaz Prek
Abstract:
The impact of three different combinations of cooling and ventilation systems on the indoor environmental quality (IEQ) has been studied. A comparison of chilled ceiling cooling in combination with displacement ventilation, cooling with a fan coil unit, and cooling with flat wall displacement outlets was performed. All three combinations were evaluated from the standpoint of whole-body and local thermal comfort criteria as well as from the standpoint of ventilation effectiveness. The comparison was made on the basis of numerical simulation with DesignBuilder and Fluent. Numerical simulations were carried out in two steps. Firstly, the DesignBuilder software environment was used to model the building's thermal performance and to evaluate the interaction between the environment and the building. Heat gains of the building and of the individual space, as well as the heat loss on the boundary surfaces in the room, were calculated. In the second step, the Fluent software environment was used to simulate the response of the indoor environment, evaluating the interaction between the building and its occupants, using the simulation results obtained in the first step. Among the systems examined, the ceiling cooling system in combination with displacement ventilation was found to be the most suitable, as it offers a high level of thermal comfort with adequate ventilation efficiency. Fan coil cooling proved inadequate from the standpoint of thermal comfort, whereas flat wall displacement outlets were inadequate from the standpoint of ventilation effectiveness. The study showed the need to evaluate the indoor environment not solely from the energy use point of view, but from the point of view of indoor environmental quality as well.
Keywords: cooling, ventilation, thermal comfort, ventilation effectiveness, indoor environmental quality, IEQ, computational fluid dynamics
Procedia PDF Downloads 188
4868 Creation of an Integrated Development Environment to Assist and Optimize the Learning of the Languages C and C++
Authors: Francimar Alves, Marcos Castro, Marllus Lustosa
Abstract:
In the context of the teaching of computer programming, the choice of tool is very important in the initiation and continuity of learning a programming language. The tools in the literature do not always provide usability and pedagogical dynamism clearly and accurately for effective learning. This hypothesis implies a fall in productivity and difficulty in learning a particular programming language by students. The integrated development environments (IDEs) Dev-C++ and Code::Blocks are widely used in introductory courses of undergraduate Computer Science programs for learning the C and C++ languages. However, after several years of discontinued maintenance of the Dev-C++ source code, its continued use in the teaching and learning process has led to difficulties, mainly due to the lack of updates by the official developers, which resulted in a sequence of problems when using it in educational settings. Many of the users, dissatisfied with the Dev-C++ IDE, migrated to the Code::Blocks platform, targeting a more dynamic learning process for the C and C++ languages. Nevertheless, there is still the need to create a tool that can provide the resources of most IDEs in the software development literature while being more interactive, simple, accurate and efficient. This motivation led to the creation of the Falcon C++ tool, an IDE with features that turn it into an educational platform, focused primarily on increasing student learning outcomes in the early programming and algorithms disciplines that use the C and C++ languages. As a working methodology, field research was used to validate the proposed tool. The test results and interviews with entry-level and intermediate students at a postsecondary institution gave the basis for the composition of this work, demonstrating a positive impact of the use of the tool in teaching programming and showing that the use of the Falcon C++ software is beneficial in the process of teaching the C and C++ programming languages.
Keywords: ide, education, learning, development, language
Procedia PDF Downloads 446
4867 Applying the CA Systems in Education Process
Authors: A. Javorova, M. Matusova, K. Velisek
Abstract:
The article summarizes experience with teaching methodologies for laboratory technical subjects using a number of software products. The main aim is to modernize the teaching process in accordance with the requirements of today, based on information technology. The increase in study attractiveness and effectiveness is due to the introduction of CA technologies into the learning process. This paper discusses the areas where individual CA systems are used. Environments using CA systems are briefly presented in each chapter.
Keywords: education, CA systems, simulation, technology
Procedia PDF Downloads 398
4866 Seismic Assessment of Passive Control Steel Structure with Modified Parameter of Oil Damper
Authors: Ahmad Naqi
Abstract:
Today, passively controlled buildings are becoming extensively popular due to their excellent lateral load resistance. Typically, these buildings are enhanced with a damping device that has high market demand. Some manufacturers falsify the damping device parameters during production to meet this demand. Therefore, this paper evaluates the seismic performance of buildings equipped with damping devices whose parameters were intentionally modified to simulate falsified devices. For this purpose, three benchmark buildings of 4, 10, and 20 stories were selected from the JSSI (Japan Society of Seismic Isolation) manual. The buildings are special moment-resisting steel frames with oil dampers in the longitudinal direction only. For each benchmark building, two types of structural elements were designed to resist the lateral load, with and without damping devices (hereafter known as the Trimmed and Conventional Buildings). The target buildings were modeled using STERA-3D, finite-element-based software coded for study purposes. Using the software, one can develop either a three-dimensional model (3DM) or a lumped mass model (LMM). Firstly, the seismic performance of the 3DM and LMM models was evaluated and found to coincide excellently for the target buildings. The simplified LMM was then used in this study to produce 66 cases for both of the buildings. The device parameters were modified by ±40% and ±20% to represent many possible conditions of falsification. It is verified that the buildings designed to sustain the lateral load with the support of damping devices (Trimmed Buildings) are much more under threat as a result of device falsification than those buildings strengthened by damping devices (Conventional Buildings).
Keywords: passive control system, oil damper, seismic assessment, lumped mass model
Procedia PDF Downloads 116
4865 Multiscale Hub: An Open-Source Framework for Practical Atomistic-To-Continuum Coupling
Authors: Masoud Safdari, Jacob Fish
Abstract:
Despite the vast amount of existing theoretical knowledge, the implementation of a universal multiscale modeling, analysis, and simulation software framework remains challenging. Existing multiscale software and solutions are often domain-specific, closed-source, and mandate a high level of experience and skill in both multiscale analysis and programming. Furthermore, tools currently existing for Atomistic-to-Continuum (AtC) multiscaling are developed with assumptions such as user access to high-performance computing facilities. These issues, plus many other challenges, have reduced the adoption of multiscale methods in academia and especially in industry. In the current work, we introduce Multiscale Hub (MsHub), an effort towards making AtC more accessible through cloud services. As a joint effort between academia and industry, MsHub provides a universal web-enabled framework for practical multiscaling. Developed on top of the universally acclaimed scientific programming language Python, the package currently provides an open-source, comprehensive, easy-to-use framework for AtC coupling. MsHub offers an easy-to-use interface to prominent molecular dynamics and multiphysics continuum mechanics packages such as LAMMPS and MFEM (a free, lightweight, scalable C++ library for finite element methods). In this work, we first report on the design philosophy of MsHub and the challenges and issues faced regarding its implementation. MsHub takes advantage of a comprehensive set of tools and algorithms developed for AtC that can be used for a variety of governing physics. We then briefly report on key AtC algorithms implemented in MsHub. Finally, we conclude with a few examples illustrating the capabilities of the package and its future directions.
Keywords: atomistic, continuum, coupling, multiscale
Procedia PDF Downloads 178
4864 Effect of Threshold Configuration on Accuracy in Upper Airway Analysis Using Cone Beam Computed Tomography
Authors: Saba Fahham, Supak Ngamsom, Suchaya Damrongsri
Abstract:
Objective: The objective is to determine the optimal threshold of Romexis software for the airway volume and minimum cross-section area (MCA) analysis using Image J as a gold standard. Materials and Methods: A total of ten cone-beam computed tomography (CBCT) images were collected. The airway volume and MCA of each patient were analyzed using the automatic airway segmentation function in the CBCT DICOM viewer (Romexis). Airway volume and MCA measurements were conducted on each CBCT sagittal view with fifteen different threshold values from the Romexis software, Ranging from 300 to 1000. Duplicate DICOM files, in axial view, were imported into Image J for concurrent airway volume and MCA analysis as the gold standard. The airway volume and MCA measured from Romexis and Image J were compared using a t-test with Bonferroni correction, and statistical significance was set at p<0.003. Results: Concerning airway volume, thresholds of 600 to 850 as well as 1000, exhibited results that were not significantly distinct from those obtained through Image J. Regarding MCA, employing thresholds from 400 to 850 within Romexis Viewer showed no variance from Image J. Notably, within the threshold range of 600 to 850, there were no statistically significant differences observed in both airway volume and MCA analyses, in comparison to Image J. Conclusion: This study demonstrated that the utilization of Planmeca Romexis Viewer 6.4.3.3 within threshold range of 600 to 850 yields airway volume and MCA measurements that exhibit no statistically significant variance in comparison to measurements obtained through Image J. This outcome holds implications for diagnosing upper airway obstructions and post-orthodontic surgical monitoring.Keywords: airway analysis, airway segmentation, cone beam computed tomography, threshold
Procedia PDF Downloads 47