Search results for: AIMMS mathematical software
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6237

4977 Relevance of Copyright and Trademark in the Gaming Industry

Authors: Deeksha Karunakar

Abstract:

The gaming industry is one of the biggest industries in the world. Video games are interactive works of authorship that require the execution of a computer programme on specialized hardware, but they also incorporate a wide variety of other artistic media, such as music, scripts, stories, video, paintings, and characters, in which the player takes an active role. Video games are therefore not made as singular, simple works but rather as collections of elements, each of which can be copyrighted on its own if it reaches a certain level of originality and creativity. A video game is made up of a wide variety of parts, all of which combine to form the overall sensation that we, the players, have while playing. The entirety of the components is implemented in the form of software code, which is then translated into the game's user interface. While copyright protection is already in place for the software code, the work produced by that code can also be protected by copyright, including the game's storyline or narrative, its characters, and even elements of the code on their own. Every sector requires a suitable legal framework, and the gaming industry is no exception; this illustrates the importance of intellectual property laws in each sector. This paper explores the beginnings of video games, the various aspects of game copyrights, and the approach of the courts, including examples from a few different cases. Although the creative arts have always been known to draw inspiration from and build upon the works of others, it has not always been simple to evaluate whether a game has been cloned. The video game business is experiencing growth it has never seen before. The majority of today's video games are both pieces of software and works of audio-visual art. Even though the existing legal framework does not have a clause specifically addressing video games, it is clear that there are a great many alternative means by which this protection can be granted. Using doctrinal methodology and relevant case law, this paper demonstrates the importance of copyright and trademark laws in the gaming industry and of their regulation. The aim of the paper is to raise awareness of the applicability of intellectual property laws in the gaming industry and of how the justice system is evolving to adapt to such new industries; furthermore, it provides in-depth knowledge of their relationship with each other.

Keywords: copyright, DMCA, gaming industry, trademark, WIPO

Procedia PDF Downloads 52
4976 Modelling Phytoremediation Rates of Aquatic Macrophytes in Aquaculture Effluent

Authors: E. A. Kiridi, A. O. Ogunlela

Abstract:

Pollutants from aquacultural practices constitute environmental problems, and phytoremediation could offer a cheaper, environmentally sustainable alternative, since equipment for the advanced treatment of fish tank effluent is expensive to import, install, operate and maintain, especially in developing countries. The main objective of this research was, therefore, to develop a mathematical model for phytoremediation by aquatic plants in aquaculture wastewater. Other objectives were to use the model to evaluate the effect of retention time on phytoremediation rates, and to measure the nutrient level of the aquaculture effluent and the phytoremediation rates of three aquatic macrophytes, namely water hyacinth (Eichhornia crassipes), water lettuce (Pistia stratiotes) and morning glory (Ipomoea asarifolia). A completely randomized experimental design was used in the study. Approximately 100 g of each macrophyte was introduced into the hydroponic units, and phytoremediation indices were monitored at 8 different intervals from the first to the 28th day. The water quality parameters measured were pH and electrical conductivity (EC), along with the concentrations of ammonium-nitrogen (NH₄⁺-N), nitrite-nitrogen (NO₂⁻-N), nitrate-nitrogen (NO₃⁻-N) and phosphate-phosphorus (PO₄³⁻-P), and the biomass value. The biomass produced by water hyacinth was 438.2 g, 600.7 g, 688.2 g and 725.7 g at four 7-day intervals; the corresponding values for water lettuce were 361.2 g, 498.7 g, 561.2 g and 623.7 g, and for morning glory 417.0 g, 567.0 g, 642.0 g and 679.5 g. The coefficient of determination was greater than 80% for EC, total dissolved solids (TDS), NO₂⁻-N and NO₃⁻-N, and greater than 70% for NH₄⁺-N, using any of the macrophytes, and the predicted values were within the 95% confidence interval of the measured values. The model is therefore valuable in the design and operation of phytoremediation systems for aquaculture effluent.
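
As an illustration of how such a growth-based phytoremediation model can be fitted, the sketch below fits a logistic curve to the water hyacinth biomass values quoted above. The logistic form, the day-0 mass of 100 g, and the use of Python/SciPy are assumptions for illustration only, since the abstract does not state the paper's actual model equations.

```python
import numpy as np
from scipy.optimize import curve_fit

# Water-hyacinth biomass (g) from the abstract; day 0 is the ~100 g of
# plant material introduced into each hydroponic unit (an assumption).
t = np.array([0.0, 7.0, 14.0, 21.0, 28.0])
m = np.array([100.0, 438.2, 600.7, 688.2, 725.7])

def logistic(t, K, r, m0):
    """Logistic growth: m(t) = K / (1 + (K/m0 - 1) * exp(-r t))."""
    return K / (1.0 + (K / m0 - 1.0) * np.exp(-r * t))

popt, _ = curve_fit(logistic, t, m, p0=(750.0, 0.2, 100.0))
K, r, m0 = popt
pred = logistic(t, *popt)
ss_res = np.sum((m - pred) ** 2)          # residual sum of squares
ss_tot = np.sum((m - m.mean()) ** 2)      # total sum of squares
print(f"K={K:.1f} g, r={r:.3f} /day, R^2={1 - ss_res / ss_tot:.3f}")
```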

Keywords: aquaculture effluent, macrophytes, mathematical model, phytoremediation

Procedia PDF Downloads 205
4975 In Silico Screening, Identification and Validation of Cryptosporidium hominis Hypothetical Protein and Virtual Screening of Inhibitors as Therapeutics

Authors: Arpit Kumar Shrivastava, Subrat Kumar, Rajani Kanta Mohapatra, Priyadarshi Soumyaranjan Sahu

Abstract:

Computational approaches to predict the structure, function and other biological characteristics of proteins are becoming more common, in comparison to traditional methods, in drug discovery. Cryptosporidiosis, caused primarily by Cryptosporidium hominis and Cryptosporidium parvum, is a major zoonotic diarrheal disease, particularly in children. Currently, there are no vaccines for cryptosporidiosis, and the recommended drugs are not effective. With the availability of the complete genome sequence of C. hominis, new targets have been recognized for the development of effective and better drugs and/or vaccines. We identified a unique hypothetical epitopic protein in the C. hominis genome through BLASTP analysis. A 3D model of the hypothetical protein was generated using the I-TASSER server through a threading methodology, and the quality of the model was validated through a Ramachandran plot by the PROCHECK server. Functional annotation of the hypothetical protein through the DALI server revealed structural similarity with human transportin 3, and phylogenetic analysis also showed that the C. hominis hypothetical protein (CUV04613) was most closely related to the human transportin 3 protein. The 3D protein model was further subjected to a virtual screening study with inhibitors from the ZINC database using the DOCK Blaster software. The docking study reported N-(3-chlorobenzyl) ethane-1,2-diamine as the best inhibitor in terms of docking score, and the docking analysis elucidated that Leu 525, Ile 526, Glu 528, and Glu 529 are critical residues for ligand-receptor interactions. A molecular dynamics simulation was performed with the GROMACS software over 10 ns to assess the reliability of the binding pose of the inhibitor-protein complex. Trajectories were analyzed at 2.5 ns intervals, in which H-bonds with Leu 525 and Gly 530 were significantly present. Furthermore, antigenic determinants of the protein were determined with the help of the DNASTAR software. Our findings show great potential for providing insights into the development of new drugs and/or vaccines for the control and prevention of cryptosporidiosis in humans and animals.

Keywords: cryptosporidium hominis, hypothetical protein, molecular docking, molecular dynamics simulation

Procedia PDF Downloads 351
4974 The Relationship between Knowledge Management Processes and Strategic Thinking at the Organization Level

Authors: Bahman Ghaderi, Hedayat Hosseini, Parviz Kafche

Abstract:

The role of knowledge management processes in achieving the strategic goals of organizations is crucial. To this end, the relationship between knowledge management processes and the different aspects of strategic thinking (which underpins long-term organizational planning) should be understood. This research examines the relationship between each of the five knowledge management processes (knowledge acquisition, knowledge storage, knowledge transfer, knowledge auditing, and knowledge utilization) and each dimension of strategic thinking (vision, creativity, systematic thinking, strategic communication, and strategic analysis) in one of the major sectors of the food industry in Iran. Knowledge management and its dimensions are treated as the independent variables, and strategic thinking and its dimensions as the dependent variable. The statistical population of this study consisted of 245 managers and employees of the Minoo Food Industrial Group in Tehran. A simple random sampling method was used, and data were collected with a questionnaire designed by the research team. Data were analyzed using SPSS 21 software, and LISREL software was used for estimating and drawing the models and graphs. Among the factors investigated, knowledge storage, with a coefficient of 0.78, had the largest effect, and knowledge transfer, with 0.62, had the smallest effect on knowledge management and thus on strategic thinking.

Keywords: knowledge management, strategic thinking, knowledge management processes, food industry

Procedia PDF Downloads 155
4973 Mathematical Modelling of Slag Formation in an Entrained-Flow Gasifier

Authors: Girts Zageris, Vadims Geza, Andris Jakovics

Abstract:

Gasification processes are of great interest due to their generation of renewable energy, in the form of syngas, from biodegradable waste. It is, therefore, important to study the factors that affect the efficiency of gasification and the longevity of the machines in which gasification takes place. This study focuses on the latter, aiming to optimize an entrained-flow gasifier by reducing slag formation on its walls and thereby reducing maintenance costs. A CFD mathematical model of an entrained-flow gasifier is constructed: the geometry of an actual gasifier is rendered in 3D and appropriately meshed. The turbulent gas flow in the gasifier is then modeled with the realizable k-ε approach, taking devolatilization, combustion and coal gasification into account. Several such simulations are conducted, obtaining results for different air inlet positions and tracking particles of varying sizes undergoing devolatilization and gasification. The model identifies potentially problematic zones where most particles collide with the gasifier walls, indicating the regions where ash deposits are most likely to form. In conclusion, the effects of air inlet positioning and of the particle sizes allowed in the main gasifier tank on the formation of an ash layer are discussed, and possible solutions for decreasing the number of undesirable deposits are proposed. Additionally, an estimate is given of the impact of factors such as temperature, gas properties and gas content, and of the different forces acting on the particles undergoing gasification.
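
For reference, the realizable k-ε approach named above solves two transport equations of the standard (Shih et al.) form sketched below, with P_k the turbulence production, S the modulus of the mean strain rate, and C₁ = max[0.43, η/(η+5)], η = Sk/ε; this is textbook background rather than the paper's own formulation:

```latex
\frac{\partial(\rho k)}{\partial t} + \frac{\partial(\rho k u_j)}{\partial x_j}
  = \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)
    \frac{\partial k}{\partial x_j}\right] + P_k - \rho\varepsilon
\qquad
\frac{\partial(\rho\varepsilon)}{\partial t} + \frac{\partial(\rho\varepsilon u_j)}{\partial x_j}
  = \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_\varepsilon}\right)
    \frac{\partial\varepsilon}{\partial x_j}\right]
  + \rho C_1 S\varepsilon - \rho C_2 \frac{\varepsilon^2}{k + \sqrt{\nu\varepsilon}}
```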

Keywords: biomass particles, gasification, slag formation, turbulence k-ε modelling

Procedia PDF Downloads 268
4972 Application of Systems Engineering Tools and Methods to Improve Healthcare Delivery Inside the Emergency Department of a Mid-Size Hospital

Authors: Mohamed Elshal, Hazim El-Mounayri, Omar El-Mounayri

Abstract:

The emergency department (ED) can be considered a complex system of interacting entities: patients, human resources, software and hardware systems, interfaces, and other systems. This paper presents research on implementing a detailed systems engineering (SE) approach in a mid-size hospital in central Indiana. The methodology is applied by The Initiative for Product Lifecycle Innovation (IPLI) at Indiana University to study and solve the crowding problem, with the aim of increasing patient throughput and enhancing the treatment experience; to this end, the nature of the crowding problem is investigated together with the other problems that lead to it. The SE methods presented are workflow analysis and systems modeling, in which SE tools such as Microsoft Visio are used to construct a group of system-level diagrams that capture: patient workflow, documentation and communication flow, data systems, human resources workflow and requirements, the leadership involved, and the integration between the ED's different systems. The ultimate goal is to manage the process through the implementation of an executable model using commercial software tools, which will identify bottlenecks, improve documentation flow, and help make the process faster.
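
As a hint of what such an executable model can look like, here is a minimal discrete-event sketch of ED patient flow in Python with SimPy; the stages, capacities, and service times are invented placeholders, not data from the hospital studied.

```python
import random
import simpy

# A toy ED patient-flow model (registration -> treatment); all rates and
# capacities below are illustrative assumptions.
random.seed(42)

def patient(env, registration, doctor, waits):
    arrival = env.now
    with registration.request() as req:        # queue for the registration desk
        yield req
        yield env.timeout(random.expovariate(1 / 5.0))   # ~5 min registration
    with doctor.request() as req:              # queue for a treatment bay
        yield req
        yield env.timeout(random.expovariate(1 / 20.0))  # ~20 min treatment
    waits.append(env.now - arrival)

def arrivals(env, registration, doctor, waits):
    while True:
        yield env.timeout(random.expovariate(1 / 8.0))   # one arrival every ~8 min
        env.process(patient(env, registration, doctor, waits))

env = simpy.Environment()
registration = simpy.Resource(env, capacity=1)
doctor = simpy.Resource(env, capacity=2)
waits = []
env.process(arrivals(env, registration, doctor, waits))
env.run(until=8 * 60)  # one 8-hour shift, in minutes
print(f"patients discharged: {len(waits)}, "
      f"mean door-to-discharge time: {sum(waits) / len(waits):.1f} min")
```

A bottleneck shows up as a stage whose queue grows over the shift; raising the corresponding capacity and re-running quantifies the throughput gain.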

Keywords: systems modeling, ED operation, workflow modeling, systems analysis

Procedia PDF Downloads 167
4971 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity

Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish

Abstract:

Stack Overflow is a popular community question-and-answer portal used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official software programming language documentation. While tools have been developed to aid developers by presenting interfaces for exploring Stack Overflow, developers often face challenges searching through the many possible answers to their questions, and this extends development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers using various modeling techniques. However, less attention has been dedicated to examining the performance and quality of the typically used modeling methods, especially in relation to the complexity of the models and features. Such insights could be of practical significance to the many practitioners who use Stack Overflow. This study examines the performance and quality of various modeling methods used for predicting acceptable answers on Stack Overflow, drawn from data for 2014, 2015 and 2016. Our findings reveal significant differences in models' performance and quality given the type of features and the complexity of the models used. Researchers examining classifiers' performance and quality and features' complexity may leverage these findings when selecting suitable techniques for developing prediction models.
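
A minimal sketch of the kind of comparison described, pitting a random forest against a small neural network on TF-IDF text features with scikit-learn; the answer texts and labels below are toy stand-ins for the 2014-2016 Stack Overflow data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Toy stand-ins for answer texts and accepted (1) / not accepted (0) labels.
answers = ["use a list comprehension here", "try recompiling with -O2",
           "this is a duplicate question", "cast the pointer explicitly",
           "works on my machine", "add an index to that column"] * 50
labels = np.array([1, 1, 0, 1, 0, 1] * 50)

for name, clf in [("random forest",
                   RandomForestClassifier(n_estimators=100, random_state=0)),
                  ("neural network",
                   MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                                 random_state=0))]:
    model = make_pipeline(TfidfVectorizer(), clf)   # text features -> classifier
    scores = cross_val_score(model, answers, labels, cv=5)
    print(f"{name}: accuracy {scores.mean():.2f} +/- {scores.std():.2f}")
```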

Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow

Procedia PDF Downloads 120
4970 Characterizing the Rectification Process for Designing Scoliosis Braces: Towards Digital Brace Design

Authors: Inigo Sanz-Pena, Shanika Arachchi, Dilani Dhammika, Sanjaya Mallikarachchi, Jeewantha S. Bandula, Alison H. McGregor, Nicolas Newell

Abstract:

The use of orthotic braces for adolescent idiopathic scoliosis (AIS) patients is the most common non-surgical treatment to prevent deformity progression. The traditional method of creating an orthotic brace involves casting the patient's torso to obtain a representative geometry, which is then rectified by an orthotist to the desired geometry of the brace. Recent improvements in 3D scanning technologies, rectification software, CNC, and additive manufacturing processes have made it possible to complement or, in some cases, replace manual methods with digital approaches. However, the rectification process remains dependent on the orthotist's skills. The rectification process therefore needs to be carefully characterized to ensure that braces designed through a digital workflow are as effective as those created using a manual process. The aim of this study is to compare 3D scans of patients with AIS against 3D scans of both pre- and post-rectification casts that have been manually shaped by an orthotist. Six AIS patients were recruited from the Ragama Rehabilitation Clinic, Colombo, Sri Lanka. All patients were between 10 and 15 years old, were skeletally immature (Risser grade 0-3), and had Cobb angles between 20-45°. Seven spherical markers were placed at key anatomical locations on each patient's torso and on the pre- and post-rectification molds so that distances could be reliably measured. 3D scans were obtained of 1) the patient's torso and pelvis, 2) the patient's pre-rectification plaster mold, and 3) the patient's post-rectification plaster mold, using a Structure Sensor Mark II 3D scanner (Occipital Inc., USA). 3D stick-body models were created for each scan to represent the distances between anatomical landmarks, and were used to analyze the changes in position and orientation of the anatomical landmarks between scans using the open-source software Blender. 3D surface deviation maps, computed with the open-source software CloudCompare, represented volume differences between the scans. The 3D stick-body models showed changes in the position and orientation of thorax anatomical landmarks between the patient and post-rectification scans for all patients. Anatomical landmark position and volume differences were seen between the 3D scans of the patients' torsos and the pre-rectification molds. Between the pre- and post-rectification molds, material removal was consistently seen on the anterior side of the thorax and in the lateral areas below the ribcage. Volume differences were seen in areas where the orthotist planned to place pressure pads: usually at the trochanter on the side to which the lumbar curve was tilted (trochanter pad), at the lumbar apical vertebra (lumbar pad), on the rib connected to the apical vertebra at the mid-axillary line (thoracic pad), and on the ribs corresponding to the upper thoracic vertebra (axillary extension pad). The rectification process requires the skill and experience of an orthotist; however, this study demonstrates that the brace shape and the location and volume of material removed from the pre-rectification mold can be characterized and quantified. Results from this study can feed into software that accelerates the brace design process, taking a step towards an automated digital rectification process.
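
As a sketch of how such surface deviation maps can be computed outside CloudCompare, the snippet below measures nearest-neighbour distances between two point clouds with SciPy; the clouds themselves are random stand-ins for the pre- and post-rectification scans.

```python
import numpy as np
from scipy.spatial import cKDTree

# Nearest-neighbour surface deviation between two scans, a simplified
# analogue of a CloudCompare deviation map; the point clouds are random
# placeholders for the pre- and post-rectification mold scans.
rng = np.random.default_rng(3)
pre_scan = rng.normal(size=(5000, 3))                    # pre-rectification points
post_scan = pre_scan + rng.normal(0, 0.02, (5000, 3))    # slightly "rectified"

tree = cKDTree(post_scan)
dist, _ = tree.query(pre_scan)   # distance from each pre point to the post surface

# Large deviations mark where material was removed; summarise like a map legend.
print(f"mean deviation {dist.mean():.3f}, "
      f"95th percentile {np.percentile(dist, 95):.3f}")
```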

Keywords: additive manufacturing, orthotics, scoliosis brace design, sculpting software, spinal deformity

Procedia PDF Downloads 132
4969 Modeling of Surface Roughness in Hard Turning of DIN 1.2210 Cold Work Tool Steel with Ceramic Tools

Authors: Mehmet Erdi Korkmaz, Mustafa Günay

Abstract:

Nowadays, grinding is frequently replaced with hard turning to reduce setup time and achieve higher accuracy. This paper focuses on the mathematical modeling of average surface roughness (Ra) in the hard turning of AISI L2 grade (DIN 1.2210) cold work tool steel with ceramic tools. The steel was hardened to 60±1 HRC in the heat treatment process. Cutting speed, feed rate, depth of cut and tool nose radius were chosen as the cutting conditions, and uncoated ceramic cutting tools were used in the machining experiments. The machining experiments were performed on a CNC lathe according to a Taguchi L27 orthogonal array. Ra values were calculated by averaging three roughness values obtained at three different points of the machined surface. The influence of the cutting conditions on surface roughness was evaluated both statistically and experimentally: analysis of variance (ANOVA) at a 95% confidence level was applied for the statistical analysis of the experimental results, and mathematical models were then developed using artificial neural networks (ANN). The ANOVA results show that feed rate is the dominant factor affecting surface roughness, followed by tool nose radius and cutting speed.
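
A minimal sketch of ANN-based Ra modeling in the spirit described, using scikit-learn; the 27-run design, parameter ranges, and the synthetic Ra values (generated from the classical kinematic relation Ra ≈ f²/(32rε) so that feed rate dominates) are illustrative assumptions, not the paper's data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a Taguchi L27 design: cutting speed (m/min),
# feed rate (mm/rev), depth of cut (mm), tool nose radius (mm).
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(100, 200, 27),
                     rng.uniform(0.05, 0.20, 27),
                     rng.uniform(0.1, 0.5, 27),
                     rng.choice([0.4, 0.8, 1.2], 27)])
# Illustrative Ra (um) from the kinematic relation Ra ~ f^2/(32 r), which
# makes feed rate dominant, consistent with the ANOVA finding reported.
y = 1000 * X[:, 1] ** 2 / (32 * X[:, 3]) + rng.normal(0, 0.02, 27)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                 random_state=0))
ann.fit(X_tr, y_tr)
print(f"held-out R^2: {ann.score(X_te, y_te):.2f}")
```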

Keywords: ANN, hard turning, DIN 1.2210, surface roughness, Taguchi method

Procedia PDF Downloads 354
4968 Improvement of Elliptic Curve Cryptography over a Ring

Authors: Abdelhakim Chillali, Abdelhamid Tadmori, Muhammed Ziane

Abstract:

In this article, we study the elliptic curve defined over the ring An and define the mathematical operations of ECC, which provides high security and an advantage for wireless applications compared to other asymmetric-key cryptosystems.
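
To make the "mathematical operations of ECC" concrete, here is the textbook group law over a prime field F_p; the paper's construction over the ring An generalizes these operations, and the small curve below is purely illustrative, not taken from the paper.

```python
# Affine group law on y^2 = x^3 + a x + b over F_p (p prime).
p, a, b = 97, 2, 3   # illustrative small curve

def inv(x):
    return pow(x, p - 2, p)   # Fermat inverse, valid since p is prime

O = None   # point at infinity, the group identity

def add(P, Q):
    if P is O: return Q
    if Q is O: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return O                                     # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + a) * inv(2 * y1) % p    # tangent slope (doubling)
    else:
        lam = (y2 - y1) * inv(x2 - x1) % p           # chord slope (addition)
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

def mul(k, P):
    """Double-and-add scalar multiplication, the core ECC operation."""
    R = O
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

P = (3, 6)   # on the curve: 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97)
print(mul(5, P))
```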

Keywords: elliptic curves, finite ring, cryptography, study

Procedia PDF Downloads 359
4967 An Application of E-Learning Technology for Students with Deafness and Hearing Impairment

Authors: Eyup Bayram Guzel

Abstract:

There has been growing awareness that technology offers unique and promising advantages: up-to-date educational materials for promoting teaching and learning, and new strategies for building an enhanced communication environment for people with disabilities, in this study specifically for students with deafness and hearing impairments. Creating an e-learning environment in which teachers and students work in collaboration to develop better educational outcomes is the foremost reason for conducting this research. This study examined special education teachers' perspectives regarding an application of e-learning software called Multimedia Builder for students with deafness and hearing impairments. Initial and follow-up interviews were conducted with 15 special education teachers within the scope of a qualitative case study, and a grounded approach was used to analyze and interpret the data. The results revealed that the Multimedia Builder software was influential in improving reading, sign language and vocabulary, in developing computer and ICT usage, and in audio-visual learning achievements for students with deafness and hearing impairments. The implications of the study encourage ways of using e-learning tools and strategies to promote unique and comprehensive learning experiences for the targeted students and their teachers.

Keywords: e-learning, special education, deafness and hearing impairment, computer-ICT usage

Procedia PDF Downloads 423
4966 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation

Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke

Abstract:

Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems, and difficulty in finding a robust approach for model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely used in common commercial hydraulic and hydrologic modelling software, e.g., MIKE URBAN, partly due to the need for a large number of parameters and large datasets in the calibration process. To overcome this practical issue, a framework for the automatic calibration of a hydrologic model was developed in the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters, namely initial loss, reduction factor, time of concentration and time lag, were considered as the primary set of parameters. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC). ABC is a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to evaluate. To test its performance and usefulness, the technique was used to simulate three small catchments on the Gold Coast, and simulation outcomes for the same three catchments obtained with the commercial modelling software MIKE URBAN were used for comparison. The graphical comparison shows strong agreement, with the MIKE URBAN results lying within the upper and lower 95% credible intervals of the posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE) and maximum error (ME) gave reasonable results for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for the runoff flow prediction, so the associated uncertainty in the predictions can be obtained, whereas MIKE URBAN just provides a point estimate. Based on the results of the analysis, the developed ABC framework appears to perform well for automatic calibration.
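
A minimal sketch of rejection ABC for one parameter, with a deliberately simple stand-in runoff simulator; the paper calibrates four parameters of a time-area model in R, so everything below (Python, the toy simulator, the prior, the tolerance) is an illustrative assumption.

```python
import numpy as np

# Rejection ABC for one illustrative parameter (an initial-loss depth, mm).
rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 5.0, size=50)   # synthetic storm rainfall depths (mm)

def simulate(initial_loss):
    """Toy runoff model: rainfall minus an initial loss, floored at zero."""
    return np.maximum(rain - initial_loss, 0.0)

observed = simulate(4.0) + rng.normal(0, 0.3, 50)   # "measured" runoff

# 1) draw candidates from the prior, 2) simulate, 3) keep the closest 1%.
prior_draws = rng.uniform(0.0, 10.0, 20000)
distances = np.array([np.sqrt(np.mean((simulate(th) - observed) ** 2))
                      for th in prior_draws])
posterior = prior_draws[distances < np.quantile(distances, 0.01)]

lo, hi = np.percentile(posterior, [2.5, 97.5])
print(f"initial loss: posterior mean {posterior.mean():.2f} mm, "
      f"95% credible interval [{lo:.2f}, {hi:.2f}]")
```

The credible interval printed at the end is exactly the kind of uncertainty statement that a point-estimate calibration cannot provide.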

Keywords: automatic calibration framework, approximate Bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform

Procedia PDF Downloads 287
4965 Use of a Large Eddy Simulation Model to Simulate the Flow of Heavy Oil-Water-Air through a Pipe

Authors: Salim Al Jadidi, Shian Gao, Shivananda Moolya

Abstract:

A Computational Fluid Dynamics (CFD) technique coupled with a Sub-Grid-Scale (SGS) model is used to study the behavior of heavy oil-water-air flow in a horizontal pipe using the ANSYS Fluent CFD software. A technique suitable for the transport of water-lubricated heavy viscous oil in a horizontal pipe is the core annular flow (CAF) technique. The present study focuses on the numerical study of CAF using Large Eddy Simulation (LES). The main objective is to gain basic knowledge of the flow behavior of heavy oil in turbulent CAF through a conventional horizontal pipe, and the work also examines the success and applicability of LES. The simulation of heavy oil-water-air three-phase flow, and of two-phase heavy oil-water flow, in a conventional horizontal pipe is performed using ANSYS Fluent 16.2. The three-phase heavy oil-water-air flow in the selected pipe is affected by gravity, and it is also observed from the results that the air phase and variations in temperature affect the behavior of the annular stream and the pressure drop. Some of the results obtained during the study are validated against experiments and simulations from the literature, and they show reasonably good agreement.
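
As background, LES resolves the large eddies directly and models the unresolved scales through an SGS eddy viscosity. The abstract does not say which SGS closure was used; the classical Smagorinsky model, the simplest common choice, reads:

```latex
\nu_{sgs} = (C_s \Delta)^2 \,\lvert \bar{S} \rvert,
\qquad
\lvert \bar{S} \rvert = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}},
\qquad
\bar{S}_{ij} = \tfrac{1}{2}\left(\frac{\partial \bar{u}_i}{\partial x_j}
  + \frac{\partial \bar{u}_j}{\partial x_i}\right)
```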

Keywords: computational fluid dynamics, gravity, heavy viscous oil, three-phase flow

Procedia PDF Downloads 63
4964 Design and Application of NFC-Based Identity and Access Management in Cloud Services

Authors: Shin-Jer Yang, Kai-Tai Yang

Abstract:

In response to a changing world and the fast growth of the Internet, more and more enterprises are replacing web-based services with cloud-based ones. Multi-tenancy technology is becoming more important, especially with Software as a Service (SaaS), and this in turn leads to a greater focus on the application of Identity and Access Management (IAM). Conventional Near-Field Communication (NFC) based verification relies on a computer browser and a card reader to access an NFC tag; this type of verification does not support mobile-device login or user-based access management functions. This study designs an NFC-based third-party cloud identity and access management scheme (NFC-IAM) that addresses this shortcoming. Data from simulation tests analyzed with Key Performance Indicators (KPIs) suggest that NFC-IAM not only takes less time for identity verification but also cuts two-factor authentication time by 80% and improves verification accuracy to 99.9% or better. In functional performance analyses, NFC-IAM performed better in scalability and portability. The NFC-IAM app and back-end system to be developed and deployed on mobile devices will support IAM features and also offer users a more user-friendly experience and stronger security protection. In the future, NFC-IAM can be employed in different environments, including identification for mobile payment systems and permission management for remote equipment monitoring, among other applications.

Keywords: cloud service, multi-tenancy, NFC, IAM, mobile device

Procedia PDF Downloads 418
4963 Research Project on Learning Rationality in Strategic Behaviors: Interdisciplinary Educational Activities in Italian High Schools

Authors: Giovanna Bimonte, Luigi Senatore, Francesco Saverio Tortoriello, Ilaria Veronesi

Abstract:

The education process considers capabilities not merely as a means to a certain end but as an effective purpose in themselves. Sen's capability approach challenges human capital theory, which sees education as an ordinary investment undertaken by individuals. A complex reality requires complex thinking, capable of interpreting the dynamics of societal change in order to make decisions that are rational in private, ethical and social contexts. Education is not something removed from the cultural and social context; it exists and is structured within it. In Italy, the "Mathematical High School Project" is a didactic research project based on additional laboratory courses held in extracurricular hours, in which mathematics enters into a dialectical relationship with other disciplines, acting as a cultural bridge between the humanistic and scientific cultures, through interdisciplinary educational modules on themes with a strong impact on young people's lives. This interdisciplinary mathematics presents topics related to the most advanced technologies and contemporary socio-economic frameworks, demonstrating that mathematics is not only a key to reading complex problems but also a key to resolving them. Recent developments in mathematics provide the potential for profound and highly beneficial changes in mathematics education at all levels, as well as in socio-economic decisions. The research project investigates whether repeated interactions can successfully promote cooperation among students as a rational choice, and whether skill, context and school background influence the choice of strategies and their rationality. A laboratory on game theory as a mathematical theory was conducted in the 4th year of a Mathematical High School and of an ordinary scientific high school. Students played two simultaneous games of repeated Prisoner's Dilemma with an indefinite horizon, each against a different competitor; the competitor in each game remained the same for the duration of that game. The results highlight that most of the students in the two classes used the two games as an immunization strategy against the risk of losing: in one of the games they started by playing Cooperate, and in the other by playing Compete. In the literature, theoretical models and experiments show that, in the case of repeated interactions with the same adversary, the optimal cooperation strategy can be achieved by tit-for-tat mechanisms. In higher education, individual capacities cannot be examined independently, as the conceptual framework presupposes a social construction in which individuals interact and compete, making individual and collective choices. The paper outlines all the results of the experimentation and the future development of the research.
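
A minimal simulation of the repeated game described, under standard Prisoner's Dilemma payoffs (T=5, R=3, P=1, S=0) and a 0.95 continuation probability to model the indefinite horizon; both values are assumptions, since the abstract does not report the payoffs or stopping rule used in class.

```python
import random

# Payoffs for (my_move, their_move): C = cooperate, D = defect (standard PD).
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def tit_for_tat(history):
    return "C" if not history else history[-1][1]   # copy opponent's last move

def always_defect(history):
    return "D"

def play(strategy_a, strategy_b):
    """Repeated PD with an indefinite horizon: after each round the game
    continues with probability 0.95 (an assumed continuation rate)."""
    random.seed(7)
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    while True:
        a, b = strategy_a(hist_a), strategy_b(hist_b)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append((a, b))
        hist_b.append((b, a))
        if random.random() > 0.95:   # game ends with probability 0.05
            return score_a, score_b

print(play(tit_for_tat, tit_for_tat))     # mutual cooperation throughout
print(play(tit_for_tat, always_defect))   # tit-for-tat loses only the first round
```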

Keywords: game theory, interdisciplinarity, mathematics education, mathematical high school

Procedia PDF Downloads 56
4962 Modeling the Time-Dependent Biodistribution of 177Lu-Labeled Somatostatin Analogues for Targeted Radiotherapy of Neuroendocrine Tumors Using Compartmental Analysis

Authors: Mahdieh Jajroudi

Abstract:

The main purpose of this study was to develop a pharmacokinetic model for the neuroendocrine tumor therapy agent 177Lu-DOTATATE in nude mice bearing AR42J rat pancreatic tumors, in order to investigate and evaluate the behavior of the complex. Compartmental analysis permits the mathematical separation of tissues and organs, so that the activity concentration in each fraction of interest can be determined. Biodistribution studies are onerous and troublesome to perform in humans, but such data can be obtained easily in rodents. A physiologically based pharmacokinetic model for scaling up the activity concentration in particular organs versus time was developed. The mathematical model uses physiological parameters including organ volumes, blood flow rates, and vascular permeabilities, and the compartments (organs) are connected anatomically. This allows the use of scale-up techniques to forecast the distribution of the new complex in each human organ. The concentration of the radiopharmaceutical in the various organs was measured at different times, and the temporal behavior of the biodistribution of the 177Lu-labeled somatostatin analogues was modeled and plotted as a function of time. Conclusion: the variation of the pharmaceutical concentration in all organs is characterized by a summation of six to nine exponential terms, which approximates our experimental data with a precision better than 1%.
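
As a sketch of the compartmental machinery involved, here is a two-compartment linear model integrated with SciPy; the rate constants and the reduction to just blood and tumor compartments are illustrative assumptions, whereas the paper's physiologically based model connects many organs through measured flows and permeabilities.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-compartment (blood <-> tumor) linear kinetics with elimination from
# blood; rate constants below are illustrative, not fitted values.
k12, k21, k_el = 0.8, 0.3, 0.2   # 1/h

def rhs(t, y):
    blood, tumor = y
    return [-(k12 + k_el) * blood + k21 * tumor,   # blood compartment
            k12 * blood - k21 * tumor]             # AR42J tumor compartment

sol = solve_ivp(rhs, (0, 48), y0=[1.0, 0.0], dense_output=True)
for ti in np.linspace(0, 48, 7):
    cb, ct = sol.sol(ti)
    print(f"t={ti:5.1f} h  blood={cb:.3f}  tumor={ct:.3f}")
```

The solution of any such linear system is a sum of exponentials, which is why the full multi-organ model's curves are characterized by six to nine exponential terms.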

Keywords: biodistribution modeling, compartmental analysis, 177Lu labeled somatostatin analogues, neuroendocrine tumors

Procedia PDF Downloads 347
4961 Time Delayed Susceptible-Vaccinated-Infected-Recovered-Susceptible Epidemic Model along with Nonlinear Incidence and Nonlinear Treatment

Authors: Kanica Goel, Nilam

Abstract:

Infectious diseases are a leading cause of death worldwide and hence a great challenge for every nation, so it is essential to prevent and reduce their spread among humans. Mathematical models help us to better understand the transmission dynamics and spread of infections. For this purpose, in the present article, we propose a nonlinear time-delayed SVIRS (Susceptible-Vaccinated-Infected-Recovered-Susceptible) mathematical model with a nonlinear incidence rate and a nonlinear treatment rate. Analytical study shows that the model exhibits two types of equilibrium points, namely a disease-free equilibrium and an endemic equilibrium. Further, for the long-term behavior of the model, stability is discussed with the help of the basic reproduction number R₀: we show that the disease-free equilibrium is locally asymptotically stable if R₀ is less than one and unstable if R₀ is greater than one, for any time lag τ≥0. Furthermore, when R₀ equals one, using center manifold theory and the Castillo-Chavez and Song theorem, we show that the model undergoes a transcritical bifurcation. Moreover, numerical simulations are carried out in MATLAB R2012b to illustrate the theoretical results.
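
The abstract does not give the model's functional forms; one common concrete instance of such a delayed SVIRS system, with saturated (Capasso-Serio type) incidence and saturated treatment, would read as follows, with all symbols illustrative:

```latex
\begin{aligned}
\frac{dS}{dt} &= A - \frac{\beta S(t)\, I(t-\tau)}{1+\alpha I(t-\tau)}
  - (\mu+\psi)S + \theta V + \delta R,\\
\frac{dV}{dt} &= \psi S - (\mu+\theta)V,\\
\frac{dI}{dt} &= \frac{\beta S(t)\, I(t-\tau)}{1+\alpha I(t-\tau)}
  - (\mu+d+\gamma)I - \frac{aI}{1+bI},\\
\frac{dR}{dt} &= \gamma I + \frac{aI}{1+bI} - (\mu+\delta)R,
\end{aligned}
```

where βSI/(1+αI) is the saturated incidence, aI/(1+bI) the saturated treatment, and τ the time delay; the δR term returns recovered individuals to the susceptible class, closing the SVIRS loop.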

Keywords: nonlinear incidence rate, nonlinear treatment rate, stability, time delayed SVIRS epidemic model

Procedia PDF Downloads 138
4960 Knowledge Management in the Interactive Portal for Decision Makers on InKOM Example

Authors: K. Marciniak, M. Owoc

Abstract:

Managers, as decision-makers in different sectors, should be supported in an efficient and increasingly sophisticated way. A huge number of software tools have been developed for such users, ranging from the simple registration of business data, typical for the operational level of management, up to intelligent techniques that deliver knowledge for the tactical and strategic levels of management. Creating intelligent management dashboards that support different decisions is a big challenge for software developers. In more advanced solutions, there is even an option to select the intelligent techniques useful to managers in a particular decision-making phase in order to deliver a valid knowledge base. Such a tool, called the Intelligent Dashboard for SME Managers (InKOM), has been prepared in the Business Intelligence framework of the Teta products. The aim of the paper is to present the solutions adopted in InKOM for the management of the stored knowledge bases offered to business managers. The paper is organized as follows. After a short introduction concerning the research context, support for managers via information systems is discussed and the InKOM platform is presented. In the crucial part of the paper, the process of knowledge transformation and validation is demonstrated. We focus on potential and actual ways of acquiring, storing and validating knowledge bases. This allows the formulation of conclusions that are interesting from a knowledge engineering point of view.

Keywords: business intelligence, decision support systems, knowledge management, knowledge transformation, knowledge validation, managerial systems

Procedia PDF Downloads 497
4959 Piql Preservation Services - A Holistic Approach to Digital Long-Term Preservation

Authors: Alexander Rych

Abstract:

Piql Preservation Services ("Piql") is a turnkey solution designed for the secure, migration-free long-term preservation of digital data, setting an open standard for long-term preservation for the future. It consists of the equipment and processes needed for writing and retrieving digital data. Exponentially growing amounts of data demand logistically effective and cost-effective processes, while digital storage media (hard disks, magnetic tape) have limited lifetimes. Repetitive data migration to overcome the rapid obsolescence of hardware and software carries an accelerated risk of data loss, data corruption or even manipulation, and adds significant recurring costs for hardware and software investments. Piql stores any kind of data, in digital as well as analog form, securely for 500 years. The medium that provides this is a film reel. Using photosensitive film on a polyester base, a very stable material known for its immutability over hundreds of years, secure and cost-effective long-term preservation can be provided. The film reel itself is stored in packaging capable of protecting the optical storage medium, and these components have undergone extensive testing to ensure a longevity of up to 500 years. In addition to its durability, film is a true WORM (write once, read many) medium and is therefore resistant to editing or manipulation. Being able to store any form of data on the film makes Piql a superior solution for long-term preservation: paper documents, images, video or audio sequences can all be preserved in their native file structure. In order to restore the encoded digital data, only a film scanner, a digital camera or any other appropriate optical reading device will be needed in the future. Every film reel includes an index section describing the data saved on the film, and a content section carrying metadata that enables users in the future to rebuild software in order to read and decode the digital information.

Keywords: digital data, long-term preservation, migration-free, photosensitive film

Procedia PDF Downloads 379
4958 Assessment of Rock Masses Performance as a Support of Lined Rock Cavern for Isothermal Compressed Air Energy Storage

Authors: Vathna Suy, Ki-Il Song

Abstract:

In order to store highly pressurized gas, as in an isothermal compressed air energy storage system, Lined Rock Caverns (LRC) are constructed underground and supported by layers of concrete, steel and rock mass. This study aims to numerically investigate the performance of the rock mass that serves as the support of a Lined Rock Cavern subjected to high cyclic pressure loadings. The FLAC3D finite difference software is used for the simulation, since it can effectively model the behavior of the concrete lining and steel plate with its built-in structural elements. Cyclic pressure loadings are applied to the inner surface of the cavern and are transmitted through the concrete and steel, and eventually to the surrounding rock mass. Changes in stress and strain are monitored continuously throughout the loading operations. The results at various monitoring locations are then extracted and analyzed to assess the response of the rock mass, specifically its ability to absorb energy during the loadings induced by the cyclic pressure changes inside the cavern. By analyzing the obtained stress-strain data and taking into account the strain-dependent behavior of the materials, conclusions are drawn on the performance of rock masses subjected to high cyclic loading conditions.

Keywords: cyclic loading, FLAC3D, lined rock cavern (LRC), strain-dependency

Procedia PDF Downloads 233
4957 Tide Contribution in the Flood Event of Jeddah City: Mathematical Modelling and Different Field Measurements of the Groundwater Rise

Authors: Aïssa Rezzoug

Abstract:

This paper aims to bring new elements demonstrating that the tide causes the groundwater to rise in the shoreline band on which the urban areas lie, especially in western coastal cities of the Kingdom of Saudi Arabia such as Jeddah. The recent inundation events in Jeddah were caused by a groundwater rise in the city coupled with a strong precipitation event. This paper illustrates the tide's significant contribution to raising the groundwater level. It shows that internal groundwater recharge within the urban area is due not only to the excess water supply coming from surrounding areas as a result of human activity, combined with the lack of a sufficient and efficient sewage system, but also to the tidal effect. The research study follows a quantitative method to assess groundwater level rise risks through many in-situ measurements and mathematical modelling. The proposed approach shows the groundwater level in the urban areas of the city on the shoreline band reaching the high-tide level without considering any input from precipitation. Despite the small tide in the Red Sea compared to other oceanic coasts, the groundwater level is considerably enhanced by the tide from the seaside and by the freshwater table from the landside of the city. In these conditions, the groundwater level becomes high in the city and prevents the soil from evacuating the surface flow caused by a storm event quickly enough, as was observed in the historical flood catastrophe of Jeddah in 2009.
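
For background, a classical first estimate of tide-induced groundwater fluctuation in a coastal aquifer is the one-dimensional Ferris/Jacob solution below, in which the tidal signal decays and lags with distance from the shore; whether the paper's model takes exactly this form is not stated in the abstract:

```latex
h(x,t) = h_0 \, e^{-x\sqrt{\pi S/(\tau T)}}
  \,\sin\!\left(\frac{2\pi t}{\tau} - x\sqrt{\frac{\pi S}{\tau T}}\right)
```

Here h₀ is the tidal amplitude at the shore, τ the tidal period, S the aquifer storativity and T its transmissivity; the exponential factor explains why even the modest Red Sea tide can still raise heads appreciably within a narrow shoreline band.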

Keywords: flood, groundwater rise, Jeddah, tide

Procedia PDF Downloads 97
4956 Evaluation of the Factors Affecting Violence Against Women (Case Study: Couples Referring to Family Counseling Centers in Tehran)

Authors: Hassan Manouchehri

Abstract:

The present study aimed to identify and evaluate the factors affecting violence against women. The statistical population included all couples referred to family counseling centers in Tehran due to domestic violence during the past year. A sample of 305 people was selected using simple random sampling and Cochran's formula for unlimited populations. A researcher-made questionnaire comprising 110 items was used for data collection. The face and content validity of the questionnaire were confirmed by 30 experts, and in a preliminary test with 30 subjects its reliability was above 0.7 for all studied variables, which is acceptable. Descriptive statistical analyses were carried out with SPSS version 22, and inferential analysis was performed with structural equation modeling in Smart PLS version 2. The review of the theoretical framework and of domestic and foreign studies indicated that, in general, four main factors underlie violence against women: cultural and social factors, economic factors, legal factors, and medical factors. In addition, the structural equation modeling findings indicated that cultural and social factors, economic factors, legal factors, and medical factors all affect violence against women.

Keywords: violence against women, cultural and social factors, economic factors, legal factors, medical factors

Procedia PDF Downloads 122
4955 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution

Authors: Najrullah Khan, Athar Ali Khan

Abstract:

The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone generalized exponential (TPGE) distribution, with a real survival data set used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made; R and JAGS code is provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data, with emphasis placed on the modeling of the data and the interpretation of the results. Crucial to this is an understanding of the nature of the incomplete or 'censored' data encountered. Analytic approximation and simulation tools are both covered, but most of the emphasis is on Markov chain Monte Carlo methods, including the independent Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among the various optimization algorithms, the trust region method is found to be the best. The TPGE model is used to analyze the lifetime data in the Bayesian paradigm, with results evaluated on the above-mentioned real survival data set, and the analytic approximation and simulation methods are implemented using several software packages. It is clear from our findings that the simulation tools provide better results than those obtained by asymptotic approximation.
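
As a sketch of the simulation side, here is a random-walk Metropolis sampler for the plain generalized exponential model, a simpler relative of the TPGE distribution; the simulated data, the weak priors, and the use of Python instead of R/JAGS are all illustrative assumptions.

```python
import numpy as np

# Generalized exponential density: f(x; a, l) = a*l*exp(-l x)*(1-exp(-l x))^(a-1).
rng = np.random.default_rng(0)
true_a, true_l = 2.0, 1.5
# Uncensored survival times by inverse CDF: F(x) = (1 - exp(-l x))^a.
u = rng.uniform(size=200)
data = -np.log(1 - u ** (1 / true_a)) / true_l

def log_post(theta):
    a, l = theta
    if a <= 0 or l <= 0:
        return -np.inf
    loglik = np.sum(np.log(a) + np.log(l) - l * data
                    + (a - 1) * np.log1p(-np.exp(-l * data)))
    logprior = -0.5 * (a / 10) ** 2 - 0.5 * (l / 10) ** 2   # weak half-normals
    return loglik + logprior

theta, lp = np.array([1.0, 1.0]), log_post([1.0, 1.0])
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.1, 2)        # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)
burned = np.array(samples[5000:])               # discard burn-in
print("posterior means (a, l):", burned.mean(axis=0))
```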

Keywords: Bayesian inference, JAGS, Laplace approximation, LaplacesDemon, posterior, R software, simulation

Procedia PDF Downloads 510
4954 Modified Model for UV-Laser Corneal Ablation

Authors: Salah Hassab Elnaby, Omnia Hamdy, Aziza Ahmed Hassan, Salwa Abdelkawi, Ibrahim Abdelhalim

Abstract:

Laser corneal reshaping has been proposed as a successful treatment for many refraction disorders. However, some physical and chemical aspects of the laser's effect upon interaction with corneal tissue are still not fully explained, and different computational and mathematical models have therefore been implemented to predict the depth of the ablated channel and to calculate the ablation threshold and the local temperature rise. In the current paper, we present a modified model that aims to answer some of the open questions about the ablation threshold, the ablation rate, and the physical and chemical mechanisms of the process. The proposed model consists of three parts. The first part deals with possible photochemical reactions between the incident photons and the various components of the cornea (collagen, water, etc.); such photochemical reactions may end in photoablation or in mere electronic excitation of the molecules. A chemical reaction is then responsible for the ablation threshold. Finally, another chemical reaction produces fragments that can be cleared away. The model takes all processes into account simultaneously, with different probabilities. Moreover, the effect of applying the different laser wavelengths studied previously, namely the common excimer laser (193 nm) and the solid-state lasers (213 nm and 266 nm), has been investigated. Despite the success and ubiquity of the ArF laser, the presented results reveal that a carefully designed 213-nm laser gives the same results with fewer operational drawbacks. Moreover, the use of a mode-locked laser could further decrease the risk of heat generation and diffusion.
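
For context, models of corneal photoablation are commonly built on the Beer-Lambert "blow-off" relation for the ablation depth per pulse, which the modified model described here extends with explicit photochemical channels; the baseline relation (standard background, not the paper's final model) is:

```latex
d = \frac{1}{\alpha}\,\ln\!\left(\frac{F}{F_{th}}\right), \qquad F > F_{th}
```

where d is the etch depth per pulse, α the tissue absorption coefficient at the laser wavelength, F the incident fluence, and F_th the threshold fluence below which no ablation occurs.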

Keywords: UV lasers, mathematical model, corneal ablation, photochemical ablation

Procedia PDF Downloads 64
4953 Numerical Investigation of Beam-Columns Subjected to Non-Proportional Loadings under Ambient Temperature Conditions

Authors: George Adomako Kumi

Abstract:

The response of structural members subjected to various forms of non-proportional loading plays a major role in the overall stability and integrity of a structure. This research presents the outcome of a finite element investigation, conducted with the finite element software ABAQUS, to validate experimental results on the elastic and inelastic behavior and strength of beam-columns subjected to axial loading, biaxial bending, and torsion under ambient temperature conditions. The rigorous ABAQUS finite element model accounts for material and geometric nonlinearity, deformations, and, more specifically, the contact behavior between the beam-columns and the support surfaces. Comparisons of the three-dimensional model with the results of the actual tests conducted, and with the results of a solution algorithm developed using the finite difference method, are established in order to verify the developed model. The results of this research provide structural engineers with much-needed knowledge about the behavior of steel beam-columns and their response to various non-proportional loading conditions under ambient temperature conditions.

Keywords: beam-columns, axial loading, biaxial bending, torsion, ABAQUS, finite difference method

Procedia PDF Downloads 160
4952 Grading Fourteen Zones of Isfahan in Terms of the Impact of Globalization on the Urban Fabric of the City, Using the TOPSIS Model

Authors: A. Zahedi Yeganeh, A. Khademolhosseini, R. Mokhtari Malekabadi

Abstract:

Undoubtedly, one of the most far-reaching and controversial topics considered in the past few decades has been globalization. Globalization lies at the essence of modern culture: it is a complex and rapidly expanding network of links and mutual interdependence that is an aspect of modern life, though some argue that this link has existed since the beginning of human history. If we consider globalization as a dynamic social process in which the geographical constraints governing political, economic, social and cultural relationships have been undermined, it might not be possible to describe its impact on the urban fabric simply. But since this phenomenon concerns the increased communication of societies with one another (while preserving their main cultural-regional characteristics) and the increased possibility of influencing other societies, the need for further studies is felt. The main objective of this study is to grade the fourteen zones of Isfahan in terms of the impact of globalization factors on the urban fabric, applying the TOPSIS model. The research method is descriptive-analytical and survey-based. For data analysis, the TOPSIS model and SPSS software were used, and the results for the fourteen zones are shown on a map produced with GIS software. The results show that the process of being influenced by globalization is not similar across the urban fabric of the fourteen zones of Isfahan, and there are large differences in this respect between city zones: the most affected areas are zones 5, 6 and 9 of the municipality, and the least affected are zones 4, 3 and 2.
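
A minimal sketch of the TOPSIS procedure itself, applied to a toy decision matrix; the scores, weights, and four-zone size are invented placeholders for the study's fourteen-zone data.

```python
import numpy as np

# Rows = city zones, columns = globalization criteria (illustrative values).
X = np.array([[7.0, 3.0, 5.0],
              [4.0, 8.0, 6.0],
              [9.0, 2.0, 7.0],
              [3.0, 6.0, 4.0]])
w = np.array([0.5, 0.3, 0.2])            # criterion weights (assumed)
benefit = np.array([True, True, True])   # all criteria treated as benefits

V = w * X / np.linalg.norm(X, axis=0)    # weighted, vector-normalised matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))   # ideal solution
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))    # anti-ideal solution
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)      # 1 = closest to ideal, 0 = worst

for rank, zone in enumerate(np.argsort(-closeness), start=1):
    print(f"rank {rank}: zone {zone + 1} (C* = {closeness[zone]:.3f})")
```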

Keywords: grading, globalization, urban fabric, 14 zones of Isfahan, TOPSIS model

Procedia PDF Downloads 298
4951 Transient Level in the Surge Chamber at the Robert-Bourassa Generating Station

Authors: Maryam Kamali Nezhad

Abstract:

The Robert-Bourassa development (LG-2), the first to be built on the Grande Rivière, comprises two powerhouses of eight turbine-generator units each, the East and West powerhouses; each powerhouse has two tailrace tunnels with an average length of about 1178 m. The LG-2A powerhouse houses 6 turbine-generator units, whose water is discharged through two tailrace tunnels with a length of about 1330 m. The objectives of this work at RB (LG-2) are: 1) to establish a new maximum transient level in the surge chamber, 2) to define the new maximum equipment flow rate for the future turbine-generator units, and 3) to ensure safe access to various intervention locations in the surge chamber. The transient levels under normal operating conditions at the RB plant were determined in 2001 by the Hydraulics Unit of HQE using the "Chamber" software, a one-dimensional mass-oscillation calculation program. It is used to determine the variation of the water level in the surge chamber located downstream of a power plant during load rejection by the power plant units; it can also be used in the case of a surge chamber upstream of a power plant. The RB (LG-2) study is based on the theoretical nominal geometry of the chamber and the tailrace tunnels, and on the flow-level relationship at the outlet of the galleries established during design. The software is used in such a way that the results have an acceptable margin of safety, especially with respect to the maximum transient level (e.g., resumption of flow at an inopportune time), in order to account for the turbulent and three-dimensional aspects of the actual flow in the chamber. Note that the transient levels depend on the water levels in the river and in the surge chambers at steady state; these data are held in the HQP CRP database and updated from time to time. The maximum transient levels in the RB-East and RB-West surge chambers were revised based on the latest update (set 4) of the in-river rating curves and the steady-state surge chamber water levels. The results of the revision were also used to update the technical advice on the operating conditions for access to the aforementioned surge chambers, taking the revised calculated water levels into account.
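
For reference, one-dimensional mass-oscillation programs of the "Chamber" type integrate the classical rigid-water-column equations; a generic form (not necessarily the exact formulation of the HQE tool) is:

```latex
\frac{L}{g}\frac{dV}{dt} = -z - F\,V\lvert V\rvert,
\qquad
A_s \frac{dz}{dt} = A_t V - Q_p
```

where V is the tunnel flow velocity, z the chamber level relative to the reservoir, L and A_t the tunnel length and cross-section, A_s the chamber cross-section, F a head-loss coefficient, and Q_p the powerhouse discharge, which drops to zero after a full load rejection; the maximum transient level is the largest z reached during the resulting oscillation.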

Keywords: generating station, surge chamber, maximum transient level, hydroelectric power station, turbine-generator, reservoir

Procedia PDF Downloads 69
4950 Semantic Differences between Bug Labeling of Different Repositories via Machine Learning

Authors: Pooja Khanal, Huaming Zhang

Abstract:

The labeling of issues/bugs, also known as bug classification, plays a vital role in software engineering; some well-known labels/classes of bugs are 'User Interface', 'Security', and 'API'. Most of the time, when a reporter reports a bug, they try to assign some predefined label to it. Those issues are reported for a project, and each project is a repository on GitHub/GitLab containing multiple issues. There are many software project repositories, ranging from individual to commercial projects, and the labels assigned in different repositories may depend on various factors such as human instinct, the generalization of labels, and the label assignment policy followed by the reporter. While the reporter of an issue may instinctively give it one label, another person reporting the same issue may label it differently; thus it is not known mathematically whether a label in one repository is similar to or different from the same label in another repository. Hence, the primary goal of this research is to find the semantic differences between the bug labeling of different repositories via machine learning. Independent optimal classifiers for the individual repositories are first built using text features from the reported issues; the optimal classifiers may include a combination of multiple classifiers stacked together. Those classifiers are then used to cross-test the other repositories, which allows the similarity to be deduced mathematically. The output of this ongoing research includes a formalized open-source GitHub issues database that is used to deduce the similarity of the labels across different repositories.
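
A minimal sketch of the cross-testing idea: train a text classifier on one repository's labeled issues and measure how well it predicts another repository's labels. The issue texts below are toy stand-ins, and low cross-repository agreement is what would signal semantic drift between the two labeling practices.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny stand-ins for two repositories' issue texts with a shared label set.
repo_a = ["button renders off screen", "dropdown misaligned on resize",
          "token leaks in debug log", "xss in comment field"] * 25
repo_b = ["panel overlaps navbar", "dark theme colors wrong",
          "password stored in plain text", "csrf check missing"] * 25
y_a = np.array(["UI", "UI", "Security", "Security"] * 25)
y_b = np.array(["UI", "UI", "Security", "Security"] * 25)

# Train an "optimal" classifier for repository A only.
model_a = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model_a.fit(repo_a, y_a)

# Cross-test: does repo A's notion of each label transfer to repo B?
agreement = (model_a.predict(repo_b) == y_b).mean()
print(f"A->B label agreement: {agreement:.2f}")  # low value = semantic drift
```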

Keywords: bug classification, bug labels, GitHub issues, semantic differences

Procedia PDF Downloads 179
4949 Dynamic Cellular Remanufacturing System (DCRS) Design

Authors: Tariq Aljuneidi, Akif Asil Bulgak

Abstract:

Remanufacturing may be defined as the process of bringing used products to a "like-new" functional state with a warranty to match, and it is one of the most popular product end-of-life scenarios. An efficient remanufacturing network leads to an efficient design of a sustainable manufacturing enterprise. In a remanufacturing network, products are collected from the customer zone, disassembled, and remanufactured at a suitable remanufacturing facility. In this respect, another issue to consider is how the returned product is to be remanufactured, in other words, what the best layout for such a facility is. In order to achieve a sustainable manufacturing system, Cellular Manufacturing System (CMS) designs are highly recommended; CMSs combine the high throughput rates of line layouts with the flexibility offered by functional (job shop) layouts. Introducing CMS while designing a remanufacturing network benefits the utilization of such a network. This paper presents and analyzes a comprehensive mathematical model for the design of Dynamic Cellular Remanufacturing Systems (DCRSs); the proposed model is the first one to date that considers CMS and remanufacturing systems simultaneously. The proposed DCRS model considers several manufacturing attributes, such as multi-period production planning, dynamic system reconfiguration, duplicate machines, machine capacity, available time for workers, worker assignments, and machine procurement, where the demand is totally satisfied from returned products. A numerical example is presented to illustrate the proposed model.
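
As a glimpse of the modeling style, here is a deliberately tiny cell-formation core expressed as a mixed-integer program with PuLP; the data, the two cells, and the single period are illustrative assumptions, while the paper's full DCRS model adds multi-period planning, reconfiguration, duplicate machines, capacities, and worker assignment.

```python
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

# Assign machines to cells so that returned products' routings cross cell
# boundaries as little as possible (all data below is illustrative).
machines = range(4)
cells = range(2)
routings = [(0, 1), (1, 2), (2, 3), (0, 3)]   # machine pairs each part visits

prob = LpProblem("cell_formation", LpMinimize)
x = {(m, c): LpVariable(f"x_{m}_{c}", cat=LpBinary)
     for m in machines for c in cells}
y = {p: LpVariable(f"split_{p}", cat=LpBinary) for p in range(len(routings))}

prob += lpSum(y.values())   # minimise the number of inter-cell part moves
for m in machines:          # every machine sits in exactly one cell
    prob += lpSum(x[m, c] for c in cells) == 1
for c in cells:             # cell size limit
    prob += lpSum(x[m, c] for m in machines) <= 2
for p, (m1, m2) in enumerate(routings):   # y_p = 1 if the pair is split
    for c in cells:
        prob += y[p] >= x[m1, c] - x[m2, c]

prob.solve()
for m in machines:
    cell = [c for c in cells if x[m, c].value() > 0.5][0]
    print(f"machine {m} -> cell {cell}")
print("inter-cell moves:", sum(v.value() for v in y.values()))
```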

Keywords: cellular manufacturing system, remanufacturing, mathematical programming, sustainability

Procedia PDF Downloads 358
4948 A Mathematical Analysis of a Model in Capillary Formation: The Roles of Endothelial, Pericyte and Macrophages in the Initiation of Angiogenesis

Authors: Serdal Pamuk, Irem Cay

Abstract:

Our model is based on the theory of reinforced random walks coupled with Michaelis-Menten mechanisms, which view endothelial cell receptors as the catalysts that transform both tumor- and macrophage-derived tumor angiogenesis factor (TAF) into a proteolytic enzyme, which in turn degrades the basal lamina. The model consists of two main parts. The first part has seven differential equations (DEs) in one space dimension over the capillary, whereas the second part has the same number of DEs in two space dimensions in the extracellular matrix (ECM). We connect these two parts via boundary conditions that move the cells into the ECM in order to initiate capillary formation. But when does this movement begin? To address this question, we estimate the thresholds that activate the transport equations in the capillary, which we do through a steady-state analysis of the TAF equation under some assumptions. Once these equations are activated, the endothelial, pericyte and macrophage cells begin to move into the ECM to initiate angiogenesis. We believe that our results play an important role in explaining the mechanisms of cell migration, which are crucial for tumor angiogenesis. Furthermore, we estimate the long-time tendency of these three cell types and find that they tend to the transition probability functions as time evolves. We provide numerical solutions, which are in good agreement with our theoretical results.
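
For context, the continuum limit of a reinforced random walk in the Othmer-Stevens form, on which models of this family build, reads as below for a cell density n and a transition-probability functional τ(w) of the control species w; this is background rather than the paper's seven-equation system, which couples several such equations with Michaelis-Menten kinetics:

```latex
\frac{\partial n}{\partial t}
  = D\,\frac{\partial}{\partial x}\!\left[\, n\,
    \frac{\partial}{\partial x}\ln\!\left(\frac{n}{\tau(w)}\right)\right]
```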

Keywords: angiogenesis, capillary formation, mathematical analysis, steady-state, transition probability function

Procedia PDF Downloads 144