Search results for: AIMMS mathematical software
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6390

5100 Comparative Study of Dose Calculation Accuracy in Bone Marrow Using Monte Carlo Method

Authors: Marzieh Jafarzadeh, Fatemeh Rezaee

Abstract:

Introduction: Ionizing radiation affects genomic integrity and cell viability and increases the risk of cancer and malignancy; the behaviour of X-rays and the calculation of the absorbed dose are therefore of interest. One of the applicable tools for calculating and evaluating the absorbed dose in human tissues is Monte Carlo simulation, which offers a straightforward way to simulate particle transport and integrate dose and is relatively easy to use. The Monte Carlo BEAMnrc code, one of the most common diagnostic X-ray simulation codes, was used in this study. Method: A number of CT scan images of previously imaged patients were extracted from the database of one of the hospitals under study. BEAMnrc software was used for the simulation. The head of the unit was simulated at an energy of 0.09 MeV with 500 million particle histories, and the output data obtained from the simulation were used for phantom construction with the CTCREATE software. The percentage depth dose (PDD) was calculated using STATDOSE and then compared with international standard values. Results and Discussion: The ratio of surface dose to depth dose (D/Ds) at the measured energy was estimated to be about 4% to 8% for bone and 3% to 7% for bone marrow. Conclusion: MC simulation is an efficient and accurate method for simulating bone marrow and calculating the absorbed dose.
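
As a rough illustration of the quantity reported above, the percentage depth dose can be obtained by normalising a simulated central-axis dose profile to its maximum; the sketch below is a minimal example with a hypothetical depth-dose array, not the study's simulation output.

```python
import numpy as np

# Hypothetical central-axis dose scored by a Monte Carlo run (arbitrary units),
# one value per depth step in centimetres.
depths = np.linspace(0.0, 10.0, 21)                       # cm
dose = np.exp(-0.25 * depths) * (1 - np.exp(-3 * (depths + 0.1)))

# Percentage depth dose: each depth dose normalised to the maximum dose.
pdd = 100.0 * dose / dose.max()

# Surface-to-maximum dose ratio of the kind quoted in the abstract.
surface_ratio = dose[0] / dose.max()
print(f"PDD at 5 cm depth: {np.interp(5.0, depths, pdd):.1f} %")
print(f"Surface/max dose ratio: {surface_ratio:.2%}")
```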

Keywords: Monte Carlo, absorption dose, BEAMnrc, bone marrow

Procedia PDF Downloads 214
5099 Effects of the Fractional Order on Nanoparticles in Blood Flow through the Stenosed Artery

Authors: Mohammed Abdulhameed, Sagir M. Abdullahi

Abstract:

In this paper, based on the applications of nanoparticles, blood flow carrying nanoparticles through a stenosed artery is studied. The blood is acted upon by periodic body acceleration, an oscillating pressure gradient and an external magnetic field. The mathematical formulation is based on the Caputo-Fabrizio fractional derivative without singular kernel. The model of ordinary blood, corresponding to time-derivatives of integer order, is obtained as a limiting case. Analytical solutions of the blood velocity and temperature distribution are obtained by means of the Hankel and Laplace transforms. Effects of the order of the Caputo-Fabrizio time-fractional derivative and of three different nanoparticles, i.e. Fe3O4, TiO4 and Cu, are studied. The results highlight that models with fractional derivatives bring significant differences compared to the ordinary model. It is observed that the addition of Fe3O4 nanoparticles reduces the flow resistance (impedance) and the temperature distribution through bell-shaped stenosed arteries compared to TiO4 and Cu nanoparticles. On entering the stenosed area, the blood temperature increases slightly, then rises considerably and reaches its maximum value in the stenosis throat. The shear stress varies from a constant value in the region without stenosis and is higher in the layers located far from the longitudinal axis of the artery. This can be important for some clinical applications in therapeutic procedures.
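
For reference, the non-singular-kernel operator named above is commonly defined, for an order 0 < \alpha < 1, as follows (a standard form from the fractional-calculus literature, not an equation reproduced from this paper):

{}^{CF}D_t^{\alpha} f(t) = \frac{M(\alpha)}{1-\alpha} \int_0^{t} f'(\tau)\, \exp\!\left(-\frac{\alpha\,(t-\tau)}{1-\alpha}\right) \mathrm{d}\tau,

where M(\alpha) is a normalisation function with M(0) = M(1) = 1. The ordinary (integer-order) blood model mentioned in the abstract is recovered in the limit \alpha \to 1.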

Keywords: nanoparticles, blood flow, stenosed artery, mathematical models

Procedia PDF Downloads 267
5098 Optimization of Wind Off-Grid System for Remote Area: Egyptian Application

Authors: Marwa M. Ibrahim

Abstract:

The objective of this research is to study the technical and economic performance of a wind/diesel/battery (W/D/B) off-grid system supplying a small remote gathering of four families, using the HOMER software package. The second objective is to study the effect of the wind energy system on the cost of generated electricity, considering the cost of reduced CO₂ emissions as an external benefit of wind turbines, which emit no pollutants during the operational phase. The system consists of a small wind turbine, battery storage, and a diesel generator. The electrical energy is to cater to basic needs, for which the daily load pattern is estimated with an 8 kW peak. Net Present Cost (NPC) and Cost of Energy (COE) are used as economic criteria, while the measure of performance is the percentage of power shortage. Technical and economic parameters are defined to estimate the feasibility of the system under study. Optimum system configurations are estimated for the selected site in Egypt. Using HOMER software, the simulation results show that W/D/B systems are economical for the assumed community site, as the price of generated electricity is about 0.285 $/kWh without taking external benefits into consideration and 0.221 $/kWh when CO₂ emissions are taken into consideration. W/D/B systems are more economical than a diesel-only system, for which the COE is 0.432 $/kWh.
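
For context, the two economic criteria named above are linked through the capital recovery factor; the standard relations used by levelised-cost tools of this kind (stated here for reference, not quoted from the paper) are

COE = \frac{C_{ann,tot}}{E_{served}}, \qquad C_{ann,tot} = CRF(i,N)\cdot NPC, \qquad CRF(i,N) = \frac{i\,(1+i)^{N}}{(1+i)^{N}-1},

where C_{ann,tot} is the total annualised cost, E_{served} is the electrical load served per year, i is the annual real interest rate, and N is the project lifetime in years.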

Keywords: renewable energy, hybrid energy system, on-off grid system, simulation, optimization and environmental impacts

Procedia PDF Downloads 106
5097 Evaluation of Numerical Modeling of Jet Grouting Design Using in situ Loading Test

Authors: Reza Ziaie Moayed, Ehsan Azini

Abstract:

Jet grouting (JG) is one of the methods of improving and increasing the strength and bearing capacity of soil, in which high-pressure water or grout is injected through nozzles into the soil. During this process, part of the soil and grout particles comes out of the drill borehole, while the other part is mixed with the grout in place; as a result, a mass of modified soil is created. The purpose of this method is to change the soil into a mixture of soil and cement, commonly known as "soil-cement". In this paper, first, the principles of high-pressure injection and then the effective parameters in the JG method are described. Then, the tests on samples taken from the columns, obtained by excavation around the soil-cement columns, as well as the static loading test on the created column, are discussed. In the other part of this paper, the soil behaviour models used for numerical modeling in PLAXIS software are described. The purpose of this paper is to evaluate the results of numerical modeling against in-situ static loading tests. The results indicate an acceptable agreement between the results of the tests mentioned and the modeling results. Modeling with this software can therefore be used as an appropriate option for assessing the technical feasibility of soil improvement using JG.

Keywords: jet grouting column, soil improvement, numerical modeling, in-situ loading test

Procedia PDF Downloads 145
5096 Prediction of Compressive Strength of Concrete from Early Age Test Result Using Design of Experiments (RSM)

Authors: Salem Alsanusi, Loubna Bentaher

Abstract:

Response Surface Methods (RSM) provide statistically validated predictive models that can then be manipulated to find optimal process configurations. Variation transmitted to responses from poorly controlled process factors can be accounted for by the mathematical technique of propagation of error (POE), which facilitates 'finding the flats' on the surfaces generated by RSM. The dual response approach to RSM captures the standard deviation of the output as well as the average, and it accounts for unknown sources of variation. Dual response plus propagation of error (POE) provides a more useful model of overall response variation. In our case, we implemented this technique in predicting the 28-day compressive strength of concrete, since waiting 28 days is time-consuming while the result is important for the quality control process. This paper investigates the potential of using design of experiments (DOE-RSM) to predict the compressive strength of concrete at 28 days. Data used for this study were obtained from experimental schemes at the Civil Engineering Department, University of Benghazi. A total of 114 sets of data were used. The ACI mix design method was utilized for the mix design. No admixtures were used; only the main concrete mix constituents, such as cement, coarse aggregate, fine aggregate and water, were utilized in all mixes. Different mix proportions of the ingredients and different water-cement ratios were used. The proposed mathematical models are capable of predicting the required concrete compressive strength from early-age results.
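
As an illustration of the kind of quadratic (second-order) model RSM fits, the sketch below regresses a hypothetical 28-day strength on two mix factors with ordinary least squares; the factor choice, data values, and library calls are illustrative assumptions, not the study's actual model or measurements.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Hypothetical predictors: water/cement ratio and 7-day strength (MPa).
X = np.array([[0.45, 22.1], [0.50, 19.4], [0.40, 26.0], [0.55, 16.8],
              [0.48, 20.5], [0.42, 24.3], [0.52, 18.1], [0.46, 21.7]])
y = np.array([34.2, 30.1, 39.5, 26.4, 31.8, 37.0, 28.3, 33.5])  # 28-day strength, MPa

# Second-order response surface: linear, interaction and squared terms.
rsm = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                    LinearRegression())
rsm.fit(X, y)

# Predict 28-day strength of a new mix from its early-age result.
print(rsm.predict([[0.47, 21.0]]))
```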

Keywords: mix proportioning, response surface methodology, compressive strength, optimal design

Procedia PDF Downloads 269
5095 Designing a Model for Preparing Reports on the Automatic Earned Value Management Progress by the Integration of Primavera P6, SQL Database, and Power BI: A Case Study of a Six-Storey Concrete Building in Mashhad, Iran

Authors: Hamed Zolfaghari, Mojtaba Kord

Abstract:

Project planners and controllers are frequently faced with the challenge of inadequate software for the preparation of automatic project progress reports based on actual project information updates. They usually build dashboards in Microsoft Excel, which is local and cannot be used online. Another shortcoming is that such dashboards are not linked to planning software such as Microsoft Project, which lacks the database required for data storage. This study aimed to propose a model for the preparation of automatic online project progress reports based on actual project information updates through the integration of Primavera P6, an SQL database, and Power BI for a construction project. The designed model is applicable to project planners and controllers by enabling them to prepare project reports automatically and immediately after updating the project schedule with actual information. To develop the model, the data were entered into P6, and the information was stored in the SQL database. The proposed model could prepare a wide range of reports, such as earned value management, HR, financial, physical, and risk reports, automatically in the Power BI application. Furthermore, the reports could be published and shared online.
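
For readers unfamiliar with the earned value management figures such a dashboard reports, the sketch below computes the basic EVM indices from hypothetical activity records; the column names and values are placeholders rather than the project's data, and in the proposed model the same calculation would be driven by the P6 export stored in SQL and visualised in Power BI.

```python
import pandas as pd

# Hypothetical activity records of the kind exported from Primavera P6 into SQL.
activities = pd.DataFrame({
    "activity":    ["Excavation", "Foundation", "Columns L1"],
    "budget_cost": [20_000, 55_000, 30_000],   # BAC per activity
    "planned_pct": [1.00, 0.80, 0.30],         # planned % complete at the data date
    "actual_pct":  [1.00, 0.65, 0.20],         # actual % complete
    "actual_cost": [21_500, 40_000, 8_000],    # AC from cost records
})

pv = (activities.budget_cost * activities.planned_pct).sum()  # Planned Value
ev = (activities.budget_cost * activities.actual_pct).sum()   # Earned Value
ac = activities.actual_cost.sum()                             # Actual Cost

spi, cpi = ev / pv, ev / ac
print(f"PV={pv:,.0f}  EV={ev:,.0f}  AC={ac:,.0f}  SPI={spi:.2f}  CPI={cpi:.2f}")
```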

Keywords: primavera P6, SQL, Power BI, EVM, integration management

Procedia PDF Downloads 110
5094 Documenting the 15th Century Prints with RTI

Authors: Peter Fornaro, Lothar Schmitt

Abstract:

The Digital Humanities Lab and the Institute of Art History at the University of Basel are collaborating in the SNSF research project ‘Digital Materiality’. Its goal is to develop and enhance existing methods for the digital reproduction of cultural heritage objects in order to support art historical research. One part of the project focuses on the visualization of a small eye-catching group of early prints that are noteworthy for their subtle reliefs and glossy surfaces. Additionally, this group of objects – known as ‘paste prints’ – is characterized by its fragile state of preservation. Because of the brittle substances that were used for their production, most paste prints are heavily damaged and thus very hard to examine. These specific material properties make a photographic reproduction extremely difficult. To obtain better results we are working with Reflectance Transformation Imaging (RTI), a computational photographic method that is already used in archaeological and cultural heritage research. This technique allows documenting how three-dimensional surfaces respond to changing lighting situations. Our first results show that RTI can capture the material properties of paste prints and their current state of preservation more accurately than conventional photographs, although there are limitations with glossy surfaces because the mathematical models that are included in RTI are kept simple in order to keep the software robust and easy to use. To improve the method, we are currently developing tools for a more detailed analysis and simulation of the reflectance behavior. An enhanced analytical model for the representation and visualization of gloss will increase the significance of digital representations of cultural heritage objects. For collaborative efforts, we are working on a web-based viewer application for RTI images based on WebGL in order to make acquired data accessible to a broader international research community. At the ICDH Conference, we would like to present unpublished results of our work and discuss the implications of our concept for art history, computational photography and heritage science.
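
The simple mathematical model referred to above is, in most RTI implementations, the polynomial texture map: the luminance of each pixel is fitted as a biquadratic function of the light direction. The sketch below fits and relights a single pixel from hypothetical capture data; it is a minimal illustration of the principle, not the project's own fitting code.

```python
import numpy as np

# Light directions (lu, lv) of each capture and the luminance observed at one
# pixel under each light -- hypothetical values standing in for a real RTI stack.
lu = np.array([ 0.0,  0.5, -0.5,  0.3, -0.3,  0.7])
lv = np.array([ 0.6,  0.1,  0.1, -0.5, -0.5,  0.0])
L  = np.array([0.82, 0.74, 0.55, 0.60, 0.40, 0.71])

# Biquadratic PTM basis: L ≈ a0*lu^2 + a1*lv^2 + a2*lu*lv + a3*lu + a4*lv + a5
A = np.column_stack([lu**2, lv**2, lu * lv, lu, lv, np.ones_like(lu)])
coeffs, *_ = np.linalg.lstsq(A, L, rcond=None)

# Relight the pixel from a new light direction.
new_lu, new_lv = 0.2, 0.4
basis = np.array([new_lu**2, new_lv**2, new_lu * new_lv, new_lu, new_lv, 1.0])
print("relit luminance:", basis @ coeffs)
```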

Keywords: art history, computational photography, paste prints, reflectance transformation imaging

Procedia PDF Downloads 277
5093 A Simple Device for Characterizing High Power Electron Beams for Welding

Authors: Aman Kaur, Colin Ribton, Wamadeva Balachandaran

Abstract:

Electron beam welding, due to its inherent advantages, is being extensively used for material processing where high precision is required. Especially in the aerospace and nuclear industries, quality requirements are high and the cost of materials and processes is very high, which makes it very important to ensure that the beam quality is maintained and checked prior to carrying out the welds. Although the processes in these industries are highly controlled, even minor changes in the operating parameters of the electron gun can cause variations in the beam quality large enough to result in poor welding. To measure the beam quality, a simple device has been designed that can be used at high powers. The device consists of two slits, in the x and y axes, which collect a small portion of the beam current when the beam is deflected over the slits. The signals received from the device are processed in data acquisition hardware and the dedicated software developed for the device. The device has been used in controlled laboratory environments to analyse the signals and their relationship to weld quality by varying the focus current. The results showed matching trends in the weld dimensions and the beam characteristics. Further experimental work is being carried out to determine the ability of the device and signal processing software to detect subtle changes in the beam quality and to relate these to the physical weld quality indicators.

Keywords: electron beam welding, beam quality, high power, weld quality indicators

Procedia PDF Downloads 324
5092 A Multi-Objective Gate Assignment Model Based on Airport Terminal Configuration

Authors: Seyedmirsajad Mokhtarimousavi, Danial Talebi, Hamidreza Asgari

Abstract:

Assigning aircraft activities to appropriate gates is one of the most challenging issues in airport authorities' multiple criteria decision making. The potential financial loss due to imbalances of demand and supply in congested airports, higher occupation rates of gates, and existing restrictions on expanding facilities provide further evidence of the need for optimal supply allocation. Passengers' walking distance, towing movements, extra fuel consumption (as a result of waiting longer to taxi when taxi conflicts occur in the apron area), etc., are the major traditional components involved in GAP models. In particular, the total cost associated with the gate assignment problem depends strongly on the airport terminal layout. The study herein presents a well-elaborated literature review on the topic, focusing on major concerns and applicable variables and objectives, and proposes a three-objective mathematical model for the gate assignment problem. The model has been tested under different concourse layouts in order to check its performance in different scenarios. Results revealed that the terminal layout pattern is a significant parameter in airport gate assignment and that the proposed model is capable of dealing with key constraints and objectives, which supports its practical usability for future decision-making tools. Potential solution techniques are also suggested in this study for future work.
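
To make the structure of such a model concrete, a generic single-objective gate assignment formulation (walking distance only, not the authors' three-objective model) can be written over binary variables x_{ij} that equal 1 when flight i is assigned to gate j:

\min \sum_{i}\sum_{j} p_i\, w_j\, x_{ij} \quad \text{s.t.} \quad \sum_{j} x_{ij} = 1 \;\; \forall i, \qquad x_{ij} + x_{kj} \le 1 \text{ for flights } i,k \text{ with overlapping ground times}, \qquad x_{ij}\in\{0,1\},

where p_i is the number of passengers on flight i and w_j is the walking distance associated with gate j; additional objectives such as towing movements and apron fuel consumption enter as further weighted terms or separate objective functions.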

Keywords: airport management, terminal layout, gate assignment problem, mathematical modeling

Procedia PDF Downloads 231
5091 Investigation of the Impact of Family Status and Blood Group on Individuals’ Addiction

Authors: Masoud Abbasalipour

Abstract:

In this study, the impact of family status, involving factors such as parents' literacy level and family size, and of individuals' blood group on susceptibility to addiction was investigated. Statistical tests were employed to scrutinize the relationships among these factors. The statistical population of the study consisted of 338 samples divided into two groups, individuals with addiction and those without addiction, in the city of Amol. The addicted group was selected from individuals visiting the substance abuse treatment center in Amol, and the non-addicted group was randomly selected from individuals in urban and rural areas. The chi-square test was used to examine the presence or absence of relationships among the variables, and Cramér's V test was employed to determine the strength of the relationships between them. Excel software facilitated the initial entry of data, and SPSS software was utilized for the desired statistical tests. The research results indicated a significant relationship between parents' education level and individuals' addiction: the education level of the parents of addicted individuals was significantly lower than that of non-addicted individuals. However, the number of family members and blood group did not significantly affect individuals' susceptibility to addiction.
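
As a sketch of the two tests named above, the snippet below computes a chi-square statistic and Cramér's V for a hypothetical education-by-addiction contingency table; the counts are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: parents' education level (rows) vs.
# addiction status (columns: addicted, not addicted).
table = np.array([[60, 30],    # illiterate / primary
                  [70, 65],    # secondary
                  [25, 88]])   # higher education

chi2, p, dof, expected = chi2_contingency(table)

# Cramér's V measures the strength of the association (0 = none, 1 = perfect).
n = table.sum()
v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, Cramér's V = {v:.2f}")
```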

Keywords: addiction, blood group, parents' literacy level, family status

Procedia PDF Downloads 70
5090 Evaluation of the Effect of Milk Recording Intervals on the Accuracy of an Empirical Model Fitted to Dairy Sheep Lactations

Authors: L. Guevara, Glória L. S., Corea E. E, A. Ramírez-Zamora M., Salinas-Martinez J. A., Angeles-Hernandez J. C.

Abstract:

Mathematical models are useful for identifying the characteristics of sheep lactation curves in order to develop and implement improved strategies. However, the accuracy of these models is influenced by factors such as the recording regime, mainly the intervals between test-day records (TDR). The current study aimed to evaluate the effect of different TDR intervals on the goodness of fit of the Wood model (WM) applied to dairy sheep lactations. A total of 4,494 weekly TDRs from 156 lactations of dairy crossbred sheep were analyzed. Three new databases were generated from the original weekly TDR data (7D), comprising intervals of 14 (14D), 21 (21D), and 28 (28D) days. The parameters of the WM were estimated using the "minpack.lm" package in the R software. The shape of the lactation curve (typical or atypical) was defined based on the WM parameters. The goodness of fit was evaluated using the mean square of prediction error (MSPE), the root of MSPE (RMSPE), Akaike's Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the coefficient of correlation (r) between the actual and estimated total milk yield (TMY). The WM showed an adequate estimate of TMY regardless of the TDR interval (P=0.21) and the shape of the lactation curve (P=0.42). However, we found higher values of r for typical curves compared to atypical curves (0.90 vs. 0.74), with the highest values for the 28D interval (r=0.95). In the same way, we observed an overestimated peak yield (0.92 vs. 6.6 l) and an underestimated time of peak yield (21.5 vs. 1.46) in atypical curves. The best values of RMSPE were observed for the 28D interval in both lactation curve shapes. The significantly lowest values of AIC (P=0.001) and BIC (P=0.001) were shown by the 7D interval for typical and atypical curves. These results represent a first approach to defining an adequate recording interval for dairy sheep in Latin America and showed a better fit of the Wood model using the 7D interval. However, it is possible to obtain good estimates of TMY using a 28D interval, which reduces the sampling frequency and would save additional costs for dairy sheep producers.
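
The Wood model itself is the incomplete gamma curve y(t) = a·t^b·e^(−ct). The study fitted it with the R package minpack.lm; the sketch below shows an equivalent nonlinear least-squares fit in Python on invented weekly test-day records, with the time of peak yield (b/c) and peak yield derived from the fitted parameters.

```python
import numpy as np
from scipy.optimize import curve_fit

# Wood's incomplete gamma function: y(t) = a * t^b * exp(-c*t),
# t = days in milk, y = test-day milk yield.
def wood(t, a, b, c):
    return a * t**b * np.exp(-c * t)

# Hypothetical weekly test-day records (days, litres) for one ewe.
t = np.array([7, 14, 21, 28, 35, 42, 49, 56, 63, 70, 77, 84])
y = np.array([0.9, 1.2, 1.3, 1.35, 1.3, 1.2, 1.1, 1.0, 0.9, 0.8, 0.7, 0.6])

(a, b, c), _ = curve_fit(wood, t, y, p0=[0.5, 0.3, 0.01])

peak_time  = b / c                                   # time of peak yield
peak_yield = wood(peak_time, a, b, c)                # yield at the peak
total      = wood(np.arange(1, 120), a, b, c).sum()  # crude total-yield estimate
print(a, b, c, peak_time, peak_yield, total)
```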

Keywords: gamma incomplete, ewes, shape curves, modeling

Procedia PDF Downloads 79
5089 System Dietadhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data

Authors: Michelangelo Sofo, Giuseppe Labianca

Abstract:

In recent years, the scientific community's interest in the exploratory analysis of biomedical data has increased exponentially. In the field of nutritional biology, the curative process based on the analysis of clinical data is a very delicate operation, because there are multiple solutions for the management of food-related pathologies (examples include intolerances and allergies, management of cholesterol metabolism, diabetic pathologies, arterial hypertension, and conditions ranging up to obesity and breathing and sleep problems). In this regard, a system was created in this research work that is capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and has been tailored to the real working needs of an expert in human nutrition using human-centred design (ISO 9241-210); it is therefore in step with continuous scientific progress in the field and evolves through the experience of managed clinical cases (a machine learning process). DietAdhoc® is a decision support system for nutrition specialists treating patients of both sexes (from 18 years of age), developed with an agile methodology. Its task consists in drawing up the biomedical and clinical profile of the specific patient by applying two algorithmic optimization approaches to the nutritional data and a symbolic solution, obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces. Furthermore, the system has multiple software modules based on time series and visual analytics techniques that allow evaluation of the complete picture of the situation and of the evolution of the diet assigned for specific pathologies.

Keywords: medical decision support, physiological data extraction, data driven diagnosis, human centered AI, symbiotic AI paradigm

Procedia PDF Downloads 26
5088 THRAP2 Gene Identified as a Candidate Susceptibility Gene of Thyroid Autoimmune Diseases Pedigree in Tunisian Population

Authors: Ghazi Chabchoub, Mouna Feki, Mohamed Abid, Hammadi Ayadi

Abstract:

Autoimmune thyroid diseases (AITDs), including Graves' disease (GD) and Hashimoto's thyroiditis (HT), are inherited as complex traits. Genetic factors associated with AITDs have been tentatively identified by candidate gene and genome scanning approaches. We analysed three intragenic microsatellite markers in the thyroid hormone receptor associated protein 2 gene (THRAP2), mapped near the D12S79 marker, which has a potential role in immune function and inflammation [THRAP2-1 (TG)n, THRAP2-2 (AC)n and THRAP2-3 (AC)n]. Our study population comprised 12 patients affected with AITDs belonging to a multiplex Tunisian family with a high prevalence of AITDs. Fluorescent genotyping was carried out on ABI 3100 sequencers (Applied Biosystems, USA) with the use of GENESCAN for semi-automated fragment sizing and GENOTYPER peak-calling software. Statistical analysis was performed using the non-parametric LOD score (NPL) in the Merlin software. Merlin outputs non-parametric NPLall (Z) and LOD scores and their corresponding asymptotic P values. The analysis of the three intragenic markers in the THRAP2 gene revealed strong evidence for linkage (NPL=3.68, P=0.00012). Our results suggest a possible role of the THRAP2 gene in AITD susceptibility in this family.

Keywords: autoimmunity, autoimmune disease, genetic, linkage analysis

Procedia PDF Downloads 127
5087 Review of Currently Adopted Intelligent Programming Tutors

Authors: Rita Garcia

Abstract:

Intelligent Programming Tutors (IPTs) are supplemental educational devices that assist in teaching software development. These systems provide customized learning, allowing the user to select the presentation pace and pedagogical strategy and to recall previous and additional teaching materials reinforcing the learning objectives. In addition, IPTs automatically record an individual's progress, providing feedback to the instructor and student. These systems have an advantage over conventional tutoring systems because Intelligent Programming Tutors are not limited to one teaching strategy and can adjust when they detect the user struggling with a concept. The Intelligent Programming Tutor is a category of Intelligent Tutoring Systems (ITS). ITS are available for many fields in education, supporting different learning objectives and integrating into other learning tools, improving the student's learning experience. This study provides a comparison of the IPTs currently adopted by the educational community, focusing on their different teaching methodologies and programming languages. The study also considers the ability to integrate the IPT into other educational technologies, such as massive open online courses (MOOCs). The intention of this evaluation is to determine the one system that would best serve in a larger ongoing research project and to provide findings for other institutions looking to adopt an Intelligent Programming Tutor.

Keywords: computer education tools, integrated software development assistance, intelligent programming tutors, tutoring systems

Procedia PDF Downloads 319
5086 Investigating the Effective Physical Factors in the Development of Coastal Ecotourism in Southern Islands of Iran: A Case Study of Hendurabi Island, Iran

Authors: Zahra Khodaee

Abstract:

Background and Objective: The southern islands of Iran with attractive tourism potential, Kish and Qeshm and recently Hendurabi, are becoming more and more popular and the object of increased attention from investors. The Iranian coral reef islands, with the exception of Kish and Qeshm, have not undergone sufficient development. The southern islands of Iran face two issues: climate change and the growing desire of tourists to visit. The lack of proper planning, inefficient management, and lack of adequate knowledge of the ecosystems of offshore regions have severely damaged this part of the world's natural heritage. This study was conducted to consider the relationships among tourism, development, and the ecosystem, because ecotourism on coral islands needs further attention. Method: Through qualitative research, this paper used library studies, field studies, and surveys to examine the physical (objective and subjective) factors of ecotourism development on Hendurabi Island. The results were presented using SPSS software and a descriptive-analytical method. The survey was conducted with the participation of 150 tourists on Kish Island, who were chosen at random and who expressed their desire to travel to Hendurabi Island. Information was gathered using SPSS software and a statistical t-test. The questionnaire was developed using AMOS software to ensure that the questions asked were sufficiently relevant. Findings: The results of this study showed that the physical factors affecting the development of ecotourism fall into two categories, objective and subjective (IFI = 0.911 and CFI = 0.907 for the target community). Discussion and Conclusion: The results were satisfactory in that they showed that eco-tourists attach importance to sea views, quiet secluded areas, tranquillity and security, the quality of the area being visited, and easy access to services; these were the top criteria for those visiting the area, provided that environmental requirements are respected. Development management of these regions should therefore maintain appropriate utilization along with sustainability and ecological responsibility.

Keywords: ecotourism, coral reef island, development management, Hendurabi Island

Procedia PDF Downloads 143
5085 Aging Evaluation of Ammonium Perchlorate/Hydroxyl Terminated Polybutadiene-Based Solid Rocket Engine by Reactive Molecular Dynamics Simulation and Thermal Analysis

Authors: R. F. B. Gonçalves, E. N. Iwama, J. A. F. F. Rocco, K. Iha

Abstract:

Propellants based on Hydroxyl Terminated Polybutadiene/Ammonium Perchlorate (HTPB/AP) are the most commonly used in the rocket engines of the Brazilian Armed Forces. This work aimed at the possibility of extending their useful life (currently 10 years) by performing kinetic-chemical analyses of the energetic material via Differential Scanning Calorimetry (DSC) and by computer simulation of the aging process using the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) software. Thermal analysis via DSC was performed in triplicate at three heating rates (5 ºC, 10 ºC, and 15 ºC) on a rocket motor with 11 years of shelf-life, using the Arrhenius equation to obtain its activation energy with the Ozawa and Kissinger kinetic methods, allowing comparison with data from the manufacturing period (standard motor). In addition, the kinetic parameters of the internal pressure of the combustion chamber in eight rocket engines with 11 years of shelf-life were also acquired, for comparison with the engine start-up data.
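
As a sketch of how the activation energy falls out of the Kissinger method mentioned above, the snippet below fits the Kissinger plot ln(β/Tp²) versus 1/Tp for invented heating rates and exothermic peak temperatures; the values are placeholders, not the study's DSC results.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol·K)

# Hypothetical DSC peak temperatures (K) of the exotherm at each heating rate.
beta = np.array([5.0, 10.0, 15.0])     # heating rates
Tp   = np.array([603.0, 611.0, 616.0]) # peak temperatures, K

# Kissinger plot: ln(beta / Tp^2) versus 1/Tp has slope -Ea/R.
x = 1.0 / Tp
y = np.log(beta / Tp**2)
slope, intercept = np.polyfit(x, y, 1)

Ea = -slope * R  # activation energy, J/mol
print(f"Ea ≈ {Ea / 1000:.1f} kJ/mol")
```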

Keywords: shelf-life, thermal analysis, Ozawa method, Kissinger method, LAMMPS software, thrust

Procedia PDF Downloads 128
5084 The Effect of Mathematical Modeling of Damping on the Seismic Energy Demands

Authors: Selamawit Dires, Solomon Tesfamariam, Thomas Tannert

Abstract:

Modern earthquake engineering and design encompass a performance-based design philosophy. The main objective in performance-based design is to achieve a system performing precisely to meet the design objectives, so as to reduce unintended seismic risks and associated losses. Energy-based earthquake-resistant design is one of the design methodologies that can be implemented in performance-based earthquake engineering. In energy-based design, the seismic demand is usually described as the ratio of the hysteretic to input energy. Once the hysteretic energy is known as a percentage of the input energy, it is distributed among the energy-dissipating components of a structure. The hysteretic to input energy ratio is highly dependent on the inherent damping of a structural system. In numerical analysis, damping can be modeled as stiffness-proportional, mass-proportional, or a linear combination of stiffness and mass. In this study, the effect of the mathematical modeling of damping on the estimation of seismic energy demands is investigated by considering elastic-perfectly-plastic single-degree-of-freedom systems representing short- to long-period structures. Furthermore, the seismicity of Vancouver, Canada, is used in the nonlinear time history analysis. According to the preliminary results, the input energy demand is not sensitive to the type of damping model deployed. Hence, consistent results are achieved regardless of the damping model utilized in the numerical analyses. On the other hand, the hysteretic to input energy ratios vary significantly for the different damping models.
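
The three damping idealisations named above are special cases of classical Rayleigh damping; in the usual notation (given here for reference, not quoted from the paper) the damping matrix is

C = a_0 M + a_1 K, \qquad a_0 = \frac{2\zeta\,\omega_i\,\omega_j}{\omega_i+\omega_j}, \qquad a_1 = \frac{2\zeta}{\omega_i+\omega_j},

where \zeta is the target damping ratio anchored at the circular frequencies \omega_i and \omega_j. Setting a_1 = 0 gives the mass-proportional model and a_0 = 0 the stiffness-proportional model; for a single-degree-of-freedom system these reduce to a_0 = 2\zeta\omega and a_1 = 2\zeta/\omega, respectively.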

Keywords: damping, energy-based seismic design, hysteretic energy, input energy

Procedia PDF Downloads 170
5083 Developing Mathematical Relationships to Evaluate the Amount of Added Ease to the Basic Pattern of Weft Knitting Fabrics and Its Fitting to the Upper Part of Egyptian Women's Bodies

Authors: Hebatullah Ali Abdel-Aleem Abdel-Hamid, Camellia Mousa Mohamed Elzean

Abstract:

Knitted garments have recently become a key component in the wardrobe of the Egyptian woman. Many Egyptian women rely on garments made of knitted fabrics for their outer appearance because of the specific properties of these fabrics, including flexibility. Through observation and application, it was noticed that the knitwear blocks used for knitted fabrics do not quite fit the figures of Egyptian women. Moreover, pattern makers are usually confused and unable to choose suitable blocks for different knitted fabrics while taking their physical and mechanical properties into consideration. This study seeks to develop mathematical relationships for evaluating the amount of added or subtracted ease applied to Aldrich's basic fitting blocks for some weft-knitted fabrics and its fit on the upper part of Egyptian women's bodies. To achieve this goal, 12 samples were used to evaluate the fit of Aldrich's Basic Fitting Block on the upper part of Egyptian women's bodies. The samples were evaluated before and after alterations through wear trials on standard mannequins of sizes 48 and 56 and were judged by experienced assessors using a fit evaluation scale. The data obtained were statistically analyzed to identify the efficiency of the adjustments. Aldrich's Basic Fitting Block was selected because the method is known internationally and is easy to use.

Keywords: Aldrich basic fitting block, clothing industry, knitted fabrics, pattern construction

Procedia PDF Downloads 268
5082 A First Step towards Automatic Evolutionary for Gas Lifts Allocation Optimization

Authors: Younis Elhaddad, Alfonso Ortega

Abstract:

Oil production by means of gas lift is a standard technique in the oil production industry. Optimizing the total amount of oil produced in terms of the amount of gas injected is a key question in this domain. Different methods have been tested to propose a general methodology; many of them apply well-known numerical methods, and some of them have taken into account the power of evolutionary approaches. Our goal is to provide the experts of the domain with a powerful automatic searching engine into which they can introduce their knowledge in a format close to the one used in their domain, and from which they get solutions comprehensible in the same terms as well. Our previous proposals introduced into the genetic engine the most expressive formal models to represent the solutions to the problem. These algorithms have proven to be as effective as other genetic systems but more flexible and comfortable for the researcher, although they usually require huge search spaces to justify their use due to the computational resources involved in the formal models. The first step in evaluating the viability of applying our approaches to this realm is to fully understand the domain and to select an instance of the problem (gas lift optimization) in which applying genetic approaches seems promising. After analyzing the state of the art of this topic, we decided to choose a previous work from the literature that faces the problem by means of numerical methods. This contribution includes enough details to be reproduced and complete data to be carefully analyzed. We designed a classical, simple genetic algorithm just to try to reproduce the same results and to understand the problem in depth. We could easily incorporate the well mathematical model and the well data used by the authors, and easily translate their mathematical model, to be numerically optimized, into a proper fitness function. We analyzed the 100 curves they use in their experiment and observed similar results; in addition, our system automatically inferred an optimum total amount of injected gas for the field compatible with the sum of the optimum gas injected into each well reported by them. We have identified several constraints that could be interesting to incorporate into the optimization process but that could be difficult to express numerically. It could also be interesting to automatically propose other mathematical models to fit both individual well curves and the behaviour of the complete field. All these facts and conclusions justify continuing to explore the viability of applying the more sophisticated approaches previously proposed by our research group.
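
A minimal sketch of the kind of genetic algorithm and fitness function described above is given below; the well-performance coefficients, gas limit, and GA settings are invented for illustration and do not correspond to the wells or data analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gas-lift performance curves: oil rate of well k as a quadratic
# function of injected gas q (illustrative coefficients, not field data).
a = np.array([40.0, 55.0, 30.0])   # base oil rate
b = np.array([ 8.0,  6.5,  9.0])   # initial gain per unit of injected gas
c = np.array([ 0.9,  0.6,  1.1])   # diminishing-returns term

TOTAL_GAS = 10.0                   # total gas available for injection

def oil(q):
    """Field oil production for a gas allocation vector q (one entry per well)."""
    return np.sum(a + b * q - c * q**2, axis=-1)

def fitness(pop):
    # Penalise allocations that exceed the available gas.
    excess = np.clip(pop.sum(axis=1) - TOTAL_GAS, 0.0, None)
    return oil(pop) - 1e3 * excess

# Simple generational GA: keep the best half, blend crossover, Gaussian mutation.
pop = rng.uniform(0.0, TOTAL_GAS / 3, size=(60, 3))
for _ in range(200):
    f = fitness(pop)
    parents = pop[np.argsort(f)[-30:]]
    children = 0.5 * (parents[rng.integers(0, 30, 60)] + parents[rng.integers(0, 30, 60)])
    children += rng.normal(0.0, 0.2, children.shape)   # mutation
    pop = np.clip(children, 0.0, TOTAL_GAS)

best = pop[np.argmax(fitness(pop))]
print("best allocation:", best.round(2), "field oil rate:", oil(best).round(1))
```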

Keywords: evolutionary automatic programming, gas lift, genetic algorithms, oil production

Procedia PDF Downloads 163
5081 Fabrication of Antimicrobial Dental Model Using Digital Light Processing (DLP) Integrated with 3D-Bioprinting Technology

Authors: Rana Mohamed, Ahmed E. Gomaa, Gehan Safwat, Ayman Diab

Abstract:

Background: Bio-fabrication is a multidisciplinary research field that combines several principles, fabrication techniques, and protocols from different fields. The open-source software movement supports the use of open-source licenses for some or all software as part of the broader notion of open collaboration. Additive manufacturing (AM) is the concept behind 3D printing: a manufacturing method in which objects are built layer by layer from computer-aided designs (CAD). There are several types of AM systems in use, and they can be categorized by the type of process used. One of these AM technologies is digital light processing (DLP), a 3D printing technology used to rapidly cure a photopolymer resin to create hard scaffolds. DLP uses a projected light source to cure (harden or crosslink) an entire layer at once. Current applications of DLP are focused on dental and medical applications. Further developments in this field have led to the revolutionary field of 3D bioprinting. The open-source movement was started to spread the concept of open-source software and to provide software or hardware that is cheaper, more reliable, and of better quality. Objective: Modification of a desktop 3D printer into a 3D bioprinter and the integration of DLP technology and bio-fabrication to produce an antibacterial dental model. Method: A desktop 3D printer was modified into a 3D bioprinter. Gelatin hydrogel and sodium alginate hydrogel were prepared at different concentrations. Rhizomes of Zingiber officinale, flower buds of Syzygium aromaticum, and bulbs of Allium sativum were extracted, and the extracts were prepared at different levels (powder, aqueous extracts, total oils, and essential oils) for antibacterial testing. The agar well diffusion method with E. coli was used to perform the sensitivity test for the antibacterial activity of the extracts obtained from Zingiber officinale, Syzygium aromaticum, and Allium sativum. Lastly, DLP printing was performed to produce several dental models with the natural extracts combined with hydrogel to represent and simulate the hard and soft tissues. Result: The desktop 3D printer was modified into a 3D bioprinter using the open-source Marlin firmware and modified custom-made 3D printed parts. Sodium alginate hydrogel and gelatin hydrogel were prepared at 5% (w/v), 10% (w/v), and 15% (w/v). Resin integration with the natural extracts of Zingiber officinale rhizome, Syzygium aromaticum flower buds, and Allium sativum bulbs was done at 1-3% for each extract. Finally, the antimicrobial dental model, which exhibited antimicrobial activity, was printed and then merged with the sodium alginate hydrogel. Conclusion: The open-source movement succeeded in modifying and producing a low-cost desktop 3D bioprinter, showing the potential for further enhancement in this scope. Additionally, the integration of DLP technology with bioprinting is a promising step toward the use of the antimicrobial activity of natural products.

Keywords: 3D printing, 3D bio-printing, DLP, hydrogel, antibacterial activity, zingiber officinale, syzygium aromaticum, allium sativum, panax ginseng, dental applications

Procedia PDF Downloads 97
5080 Automatic and High Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and propagate the behavior of a system, mathematical models are formulated. Parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, and hence imprecise, and moreover too slow to be computed efficiently. Therefore, these models might not be applicable to the numerical optimization of real systems, since these techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, and hence the system must be adapted manually. Therefore, an approach is described that generates models which overcome the before-mentioned limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general since it generates models for every system, detached from the scientific background. Additionally, this approach can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables and not only of single variables. This enables a far more precise and better representation of causal correlations. The basis and the explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. Herewith, system changes due to aging, wear, or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs, or a mixture of them, subject to additional constraints. The proposed method has successfully been tested in several complex applications and with strong industrial requirements. The generated models were able to simulate the given systems with an error in precision of less than one percent. Moreover, the automatic identification of the correlations was able to discover previously unknown relationships. To summarize, the above-mentioned approach is able to efficiently compute highly precise and real-time-adaptive data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), several complex systems can now be represented by a high-precision model to be optimized according to the user's wishes. The proposed methods will be illustrated with different examples.

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 410
5079 Numerical Modelling and Soil-structure Interaction Analysis of Rigid Ballast-less and Flexible Ballast-based High-speed Rail Track-embankments Using Software

Authors: Tokirhusen Iqbalbhai Shaikh, M. V. Shah

Abstract:

With an increase in travel demand and a reduction in travel time, high-speed rail (HSR) has been introduced in India. Simplified 3-D finite element modelling is necessary to predict the stability and deformation characteristics of railway embankments and soil structure interaction behaviour under high-speed design requirements for Indian soil conditions. The objective of this study is to analyse the rigid ballast-less and flexible ballast-based high speed rail track embankments for various critical conditions subjected to them, viz. static condition, moving train condition, sudden brake application, and derailment case, using software. The input parameters for the analysis are soil type, thickness of the relevant strata, unit weight, Young’s modulus, Poisson’s ratio, undrained cohesion, friction angle, dilatancy angle, modulus of subgrade reaction, design speed, and other anticipated, relevant data. Eurocode 1, IRS-004(D), IS 1343, IRS specifications, California high-speed rail technical specifications, and the NHSRCL feasibility report will be followed in this study.

Keywords: soil structure interaction, high speed rail, numerical modelling, PLAXIS3D

Procedia PDF Downloads 111
5078 Traditional Drawing, BIM and Erudite Design Process

Authors: Maryam Kalkatechi

Abstract:

Nowadays, parametric design, scientific analysis, and digital fabrication are dominant. Many architectural practices are increasingly seeking to incorporate advanced digital software and fabrication in their projects. An erudite design process that combines digital and practical aspects in a strong frame within the method resulted from the dissertation research. The digital aspects are the progressive advancements in algorithm design and simulation software. These aspects have assisted firms to develop more holistic concepts at the early stage and to maintain collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to implement construction and architecture knowledge within the algorithm to make design processes successful. The erudite design process also involves the ongoing improvements in applying the new method of 3D printing in construction. This is achieved through the 'data-sketches'. The term 'data-sketch' was developed by the author in the recently completed dissertation; it accommodates the decisions of the architect on the algorithm. This paper introduces the erudite design process and its components and summarizes the application of this process in the development of the '3D printed construction unit'. This paper contributes to bridging academia and practice with advanced technology by presenting a design process that transfers dominance from the tool to the learned architect and encourages innovation in design processes.

Keywords: erudite, data-sketch, algorithm design in architecture, design process

Procedia PDF Downloads 278
5077 Accurate Position Electromagnetic Sensor Using Data Acquisition System

Authors: Z. Ezzouine, A. Nakheli

Abstract:

This paper presents a high position electromagnetic sensor system (HPESS) that is applicable to moving object detection. The authors have developed a high-performance position sensor prototype dedicated to students' laboratories. The challenge was to obtain a highly accurate, real-time sensor that is able to calculate position, length or displacement. An electromagnetic solution based on a two-coil induction principle was adopted. The HPESS converts mechanical motion to electric energy with direct contact. The output signal can then be fed to an electronic circuit. The voltage output change from the sensor is captured by a data acquisition system using LabVIEW software, and the displacement of the moving object is determined. The measured data are transmitted to a PC in real time via a DAQ (NI USB-6281). This paper also describes the data acquisition analysis and the conditioning card developed specially for sensor signal monitoring. The data are then recorded and viewed using a user interface written in National Instruments LabVIEW software. On-line displays of the time and voltage of the sensor signal provide a user-friendly data acquisition interface. The sensor provides an uncomplicated, accurate, reliable, inexpensive transducer for highly sophisticated control systems.

Keywords: electromagnetic sensor, accurately, data acquisition, position measurement

Procedia PDF Downloads 287
5076 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation

Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk

Abstract:

The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition and security are among the possible fields of application. In all these fields, the amount of collected data is increasing quickly, but with this increase, computation speed becomes the critical factor. Data reduction is one of the solutions to this problem. Removing redundancy in rough sets can be achieved with the reduct. Many algorithms for generating the reduct have been developed, but most of them are only software implementations and therefore have many limitations. A microprocessor uses a fixed word length and consumes a lot of time for both fetching and processing of instructions and data; consequently, software-based implementations are relatively slow. Hardware systems do not have these limitations and can process the data faster than software. A reduct is a subset of the condition attributes that provides the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes; none of its elements can be removed without affecting the classification power of all condition attributes. Moreover, every reduct contains all the attributes of the core. In this paper, the hardware implementation of a two-stage greedy algorithm to find one reduct is presented. The decision table is used as the input. The output of the algorithm is the superreduct, which is the reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table. The algorithm described above has two disadvantages: i) it generates the superreduct instead of the reduct, and ii) the additional first stage may be unnecessary if the core is empty. But for systems focused on fast computation of the reduct, the first disadvantage is not the key problem. The core calculation can be achieved with a combinational logic block, and thus adds comparatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit for controlling the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC, and the execution times of the reduct calculation in hardware and in software were compared. The results show an increase in the speed of data processing.
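
For illustration, a software analogue of the two stages described above can be sketched in a few lines; the toy decision table below is invented, and the attribute frequency is counted over the discernibility matrix rather than over the raw table, so this is a simplified sketch of the idea rather than the authors' FPGA design.

```python
from collections import Counter

# Toy decision table: each row is (condition attribute values, decision).
# Attributes are indexed 0..2; this data is illustrative only.
rows = [((0, 1, 0), 0),
        ((1, 1, 0), 1),
        ((0, 0, 1), 0),
        ((1, 0, 1), 1),
        ((0, 1, 1), 0)]
n_attr = 3

def discernibility(rows):
    """Sets of attributes distinguishing each pair of objects with different decisions."""
    cells = []
    for i in range(len(rows)):
        for j in range(i + 1, len(rows)):
            (xi, di), (xj, dj) = rows[i], rows[j]
            if di != dj:
                cells.append({a for a in range(n_attr) if xi[a] != xj[a]})
    return cells

cells = discernibility(rows)

# Stage 1: the core = attributes that appear as singleton cells.
core = {next(iter(c)) for c in cells if len(c) == 1}

# Stage 2: greedily enrich the core with the most frequent remaining attributes
# until every discernibility cell is covered (this yields a superreduct).
superreduct = set(core)
freq = Counter(a for c in cells for a in c)
while any(not (c & superreduct) for c in cells):
    candidates = [a for a in range(n_attr) if a not in superreduct]
    superreduct.add(max(candidates, key=lambda a: freq[a]))

print("core:", core, "superreduct:", superreduct)
```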

Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set

Procedia PDF Downloads 220
5075 Hansen Solubility Parameters, Quality by Design Tool for Developing Green Nanoemulsion to Eliminate Sulfamethoxazole from Contaminated Water

Authors: Afzal Hussain, Mohammad A. Altamimi, Syed Sarim Imam, Mudassar Shahid, Osamah Abdulrahman Alnemer

Abstract:

Exhaustive use of sulfamethoxazole (SUX) has become a global threat to human health due to water contamination from diverse sources. This study addressed the combined application of Hansen solubility parameters (HSPiP software) and the Quality by Design tool for developing various green nanoemulsions. The HSPiP program assisted in screening suitable excipients based on Hansen solubility parameters and experimental solubility data. Various green nanoemulsions were prepared and characterized for globular size, size distribution, zeta potential, and removal efficiency. Design-Expert (DoE) software further helped to identify the critical factors with a direct impact on percent removal efficiency, size, and viscosity. Morphology was investigated under transmission electron microscopy (TEM). Finally, the treated water was analysed to confirm the absence of the tested drug, employing the ICP-OES (inductively coupled plasma optical emission spectrometry) technique and HPLC (high-performance liquid chromatography). Results showed that HSPiP predicted a biocompatible lipid, a safe surfactant (lecithin), and propylene glycol (PG). The experimental solubility of the drug in the predicted excipients was quite convincing and vindicated the prediction. Various green nanoemulsions were fabricated and evaluated in vitro. Globular size (100-300 nm), PDI (0.1-0.5), zeta potential (~25 mV), and removal efficiency (%RE = 70-98%) were found to be in an acceptable range for deciding the input factors and levels in DoE. The experimental design tool assisted in identifying the most critical variables controlling %RE and the optimized content of the nanoemulsion under the set constraints. Dispersion time was varied from 5-30 min. Finally, the ICP-OES and HPLC techniques corroborated the absence of SUX in the treated water. Thus, the strategy is simple, economical, selective, and efficient.
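
The screening step mentioned above rests on the Hansen distance between the drug (1) and a candidate excipient (2); the standard expressions used for such screening (given for reference, not quoted from the paper) are

R_a^2 = 4\,(\delta_{D,1}-\delta_{D,2})^2 + (\delta_{P,1}-\delta_{P,2})^2 + (\delta_{H,1}-\delta_{H,2})^2, \qquad RED = \frac{R_a}{R_0},

where \delta_D, \delta_P and \delta_H are the dispersion, polar and hydrogen-bonding solubility parameters, R_0 is the interaction radius of the solute, and RED < 1 indicates a likely good solvent or miscible excipient.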

Keywords: quality by design, sulfamethoxazole, green nanoemulsion, water treatment, ICP-OES, Hansen program (HSPiP software)

Procedia PDF Downloads 85
5074 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University

Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat

Abstract:

Scientific researchers face challenges in the analysis of very large data sets, which are increasing at a noticeable rate in today's and tomorrow's technologies. Hadoop and Spark are software frameworks developed to address this, and the Hadoop framework is suitable for many different hardware platforms. In this research, a scientific Linux cluster for big data analysis (SLBD) is presented. SLBD runs open-source software with large computational capacity on a high-performance cluster infrastructure. SLBD is composed of one cluster containing identical, commodity-grade computers interconnected via a small LAN. SLBD consists of a fast switch and Gigabit-Ethernet cards which connect the four nodes. Cloudera Manager is used to configure and manage an Apache Hadoop stack. Hadoop is a framework that allows storing and processing big data across the cluster using the MapReduce algorithm. The MapReduce algorithm divides the task into smaller tasks which are assigned to the network nodes; it then collects the results and forms the final result dataset. The SLBD clustering system allows fast and efficient processing of the large amounts of data resulting from different applications. SLBD also provides high performance, high throughput, high availability, expandability and cluster scalability.
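
To illustrate the map and reduce phases described above on a single machine, the sketch below counts words in two toy documents; in a real Hadoop job the same mapper and reducer logic would be distributed by the framework across the cluster nodes, and the documents shown here are invented.

```python
from itertools import groupby
from operator import itemgetter

documents = ["big data on a small cluster",
             "hadoop splits big data into small tasks"]

# Map phase: emit (key, value) pairs from each input record.
def mapper(doc):
    for word in doc.split():
        yield word, 1

# Shuffle: group intermediate pairs by key (done by the framework in Hadoop).
pairs = sorted((kv for doc in documents for kv in mapper(doc)), key=itemgetter(0))

# Reduce phase: combine all values for a key into the final result.
def reducer(word, counts):
    return word, sum(counts)

result = dict(reducer(w, (c for _, c in grp)) for w, grp in groupby(pairs, key=itemgetter(0)))
print(result)
```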

Keywords: big data platforms, cloudera manager, Hadoop, MapReduce

Procedia PDF Downloads 361
5073 Relevance of Copyright and Trademark in the Gaming Industry

Authors: Deeksha Karunakar

Abstract:

The gaming industry is one of the biggest industries in the world. Video games are interactive works of authorship that require the execution of a computer programme on specialized hardware but that also incorporate a wide variety of other artistic media, such as music, scripts, stories, video, paintings, and characters, in which the player takes an active role. Therefore, video games are not made as singular, simple works but rather as collections of elements that, if they reach a certain level of originality and creativity, can each be copyrighted on their own. A video game is made up of a wide variety of parts, all of which combine to form the overall sensation that we, the players, have while playing. The entirety of the components is implemented in the form of software code, which is then translated into the game's user interface. Even though copyright protection is already in place for the coding of software, the works produced by means of that code can also be protected by copyright. This includes the game's storyline or narrative, its characters, and even elements of the code on their own. Each sector requires an appropriate legal framework, and the gaming industry is no exception; this underlines the importance of intellectual property laws in each sector. This paper will explore the beginnings of video games, the various aspects of game copyrights, and the approach of the courts, including examples from a few different cases. Although the creative arts have always been known to draw inspiration from and build upon the works of others, it has not always been simple to evaluate whether a game has been cloned. The video game business is experiencing growth it has never seen before. The majority of today's video games are both pieces of software and works of audio-visual art. Even though the existing legal framework does not have a clause specifically addressing video games, it is clear that there are a great many alternative means by which this protection can be granted. This paper will demonstrate the importance of copyright and trademark laws in the gaming industry and its regulation with the help of relevant case law, using a doctrinal methodology to support its findings. The aim of the paper is to raise awareness of the applicability of intellectual property laws in the gaming industry and to show how the justice system is evolving to adapt to such new industries. Furthermore, it will provide in-depth knowledge of their relationship with each other.

Keywords: copyright, DMCA, gaming industry, trademark, WIPO

Procedia PDF Downloads 71
5072 In Silico Screening, Identification and Validation of Cryptosporidium hominis Hypothetical Protein and Virtual Screening of Inhibitors as Therapeutics

Authors: Arpit Kumar Shrivastava, Subrat Kumar, Rajani Kanta Mohapatra, Priyadarshi Soumyaranjan Sahu

Abstract:

Computational approaches to predict the structure, function and other biological characteristics of proteins are becoming more common in comparison to traditional methods in drug discovery. Cryptosporidiosis is a major zoonotic diarrheal disease, particularly in children, which is caused primarily by Cryptosporidium hominis and Cryptosporidium parvum. Currently, there are no vaccines for cryptosporidiosis, and the recommended drugs are not effective. With the availability of the complete genome sequence of C. hominis, new targets have been recognized for the development of better and more effective drugs and/or vaccines. We identified a unique hypothetical epitopic protein in the C. hominis genome through BLASTP analysis. A 3D model of the hypothetical protein was generated using the I-TASSER server through a threading methodology. The quality of the model was validated through a Ramachandran plot using the PROCHECK server. The functional annotation of the hypothetical protein through the DALI server revealed structural similarity with human transportin 3. Phylogenetic analysis also showed that the C. hominis hypothetical protein (CUV04613) is closely related to the human transportin 3 protein. The 3D protein model was further subjected to a virtual screening study with inhibitors from the ZINC database using DOCK Blaster software. The docking study reported N-(3-chlorobenzyl)ethane-1,2-diamine as the best inhibitor in terms of docking score. Docking analysis elucidated that Leu 525, Ile 526, Glu 528 and Glu 529 are critical residues for ligand-receptor interactions. A molecular dynamics simulation was performed to assess the reliability of the binding pose of the inhibitor-protein complex using GROMACS software over 10 ns. Trajectories were analyzed at 2.5 ns intervals, in which H-bonds with LEU-525 and GLY-530 were significantly present. Furthermore, antigenic determinants of the protein were determined with the help of DNA Star software. Our findings show great potential for providing insights into the development of new drugs or vaccines for the control and prevention of cryptosporidiosis in humans and animals.

Keywords: cryptosporidium hominis, hypothetical protein, molecular docking, molecular dynamics simulation

Procedia PDF Downloads 365
5071 The Relationship between Knowledge Management Processes and Strategic Thinking at the Organization Level

Authors: Bahman Ghaderi, Hedayat Hosseini, Parviz Kafche

Abstract:

The role of knowledge management processes in achieving the strategic goals of organizations is crucial. To this end, the relationship between knowledge management processes and the different aspects of strategic thinking (followed by long-term organizational planning) should be understood. This research examines the relationship between each of the five knowledge management processes (creation, storage, transfer, audit, and deployment) and each dimension of strategic thinking (vision, creativity, systematic thinking, communication and analysis) in one of the major sectors of the food industry in Iran. In this research, knowledge management and its dimensions (knowledge acquisition, knowledge storage, knowledge transfer, knowledge auditing, and finally knowledge utilization) are considered independent variables, and strategic thinking and its dimensions (creativity, systematic thinking, vision, strategic analysis, and strategic communication) are considered the dependent variable. The statistical population of this study consisted of 245 managers and employees of the Minoo Food Industrial Group in Tehran. In this study, a simple random sampling method was used, and data were collected by a questionnaire designed by the research team. Data were analyzed using SPSS 21 software; LISREL software was also used for calculating and drawing models and graphs. Among the factors investigated in the present study, knowledge storage, with a coefficient of 0.78, had the greatest effect, and knowledge transfer, with 0.62, had the least effect on knowledge management and thus on strategic thinking.

Keywords: knowledge management, strategic thinking, knowledge management processes, food industry

Procedia PDF Downloads 173