Search results for: model based clustering
36888 Application of Directed Acyclic Graphs for Threat Identification Based on Ontologies
Authors: Arun Prabhakar
Abstract:
Threat modeling is an important activity carried out in the initial stages of the development lifecycle that helps in building proactive security measures into the product. Though many techniques and tools are available today, one of the common challenges with traditional methods is the lack of a systematic approach to identifying security threats. The proposed solution describes an organized model that defines ontologies to help build patterns for enumerating threats. Concepts from graph theory are applied to build the pattern for discovering threats in any given scenario. This graph-based solution also brings other benefits, making it a customizable and scalable model.
Keywords: directed acyclic graph, ontology, patterns, threat identification, threat modeling
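A minimal sketch of the kind of graph traversal the abstract describes, assuming a toy ontology-derived DAG; the node names and the enumeration routine are illustrative, not the authors' tool:

```python
# Illustrative only: a threat pattern as a directed acyclic graph whose nodes
# are ontology concepts, with a depth-first enumeration of entry-point-to-asset
# paths as candidate threat chains. The graph below is an invented example.
from collections import defaultdict

pattern = defaultdict(list, {
    "external_user": ["web_form"],
    "web_form": ["input_validation", "session_token"],
    "input_validation": ["database"],
    "session_token": ["user_account"],
})

def enumerate_threat_paths(graph, source):
    """Yield every path from the source node to a sink (a candidate threat chain)."""
    stack = [(source, [source])]
    while stack:
        node, path = stack.pop()
        children = graph.get(node, [])
        if not children:
            yield path                      # sink reached: one candidate threat
        for child in children:
            stack.append((child, path + [child]))

for threat in enumerate_threat_paths(pattern, "external_user"):
    print(" -> ".join(threat))
```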
Procedia PDF Downloads 139
36887 Lee-Carter Mortality Forecasting Method with Dynamic Normal Inverse Gaussian Mortality Index
Authors: Funda Kul, İsmail Gür
Abstract:
Pension scheme providers have to price mortality risk using an accurate mortality forecasting method. Many mortality forecasting methods have been constructed and used in the literature. The Lee-Carter model was the first to consider stochastic improvement trends in life expectancy, and it is still widely used. In the Lee-Carter model, mortality forecasting is driven by the mortality index, which is assumed to follow an ARIMA time series model. In this paper, we propose and use a dynamic normal inverse Gaussian distribution to model the mortality index in the Lee-Carter model. Using population mortality data for Italy, France, and Turkey, the model's forecasting capability is investigated, and a comparative analysis with other models is carried out using several well-known benchmarking criteria.
Keywords: mortality, forecasting, lee-carter model, normal inverse gaussian distribution
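For reference, the standard Lee-Carter decomposition the abstract builds on is shown below; the paper's contribution, as described, is to replace the usual ARIMA law for the index k_t with a dynamic normal inverse Gaussian specification, which is not reproduced here:

```latex
% Standard Lee-Carter decomposition of the log central death rate m_{x,t},
% with the usual identification constraints; classically k_t is forecast with
% a random walk with drift (an ARIMA(0,1,0) model).
\ln m_{x,t} = a_x + b_x\,k_t + \varepsilon_{x,t},
\qquad \sum_x b_x = 1, \qquad \sum_t k_t = 0,
\qquad k_t = k_{t-1} + d + e_t .
```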
Procedia PDF Downloads 360
36886 Case-Based Reasoning Approach for Process Planning of Internal Thread Cold Extrusion
Authors: D. Zhang, H. Y. Du, G. W. Li, J. Zeng, D. W. Zuo, Y. P. You
Abstract:
To address the difficult issue of process selection, case-based reasoning technology is applied to a computer-aided process planning system for cold-form tapping of internal threads on the basis of process similarity. A model is established based on the analysis of process planning. The case representation and the similarity computation method are given, a confidence degree is used to evaluate the retrieved case, and a rule-based reuse strategy is presented. The scheme is illustrated and verified by practical application; the case shows that the design results obtained with the proposed method are effective.
Keywords: case-based reasoning, internal thread, cold extrusion, process planning
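A hedged sketch of the retrieval step described above (weighted attribute similarity plus a confidence threshold); the attributes, weights, and threshold are assumptions for illustration, not the paper's actual case representation:

```python
# Weighted similarity between a new process-planning problem and stored cases;
# a retrieved case is reused only if its confidence exceeds a threshold.
# Attribute names, weights, and the 0.8 threshold are invented for illustration.
def similarity(query, case, weights):
    score = 0.0
    for attr, w in weights.items():
        q, c = query[attr], case[attr]
        score += w * (1.0 - abs(q - c) / max(abs(q), abs(c), 1e-9))
    return score / sum(weights.values())

cases = [
    {"thread_diameter": 8.0, "pitch": 1.25, "hardness": 180, "plan": "plan_A"},
    {"thread_diameter": 10.0, "pitch": 1.5, "hardness": 210, "plan": "plan_B"},
]
weights = {"thread_diameter": 0.4, "pitch": 0.3, "hardness": 0.3}
query = {"thread_diameter": 9.5, "pitch": 1.5, "hardness": 200}

best = max(cases, key=lambda c: similarity(query, c, weights))
confidence = similarity(query, best, weights)
print(best["plan"] if confidence >= 0.8 else "no confident match: adapt or create a new case")
```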
Procedia PDF Downloads 510
36885 Health Percentage Evaluation for Satellite Electrical Power System Based on Linear Stresses Accumulation Damage Theory
Authors: Lin Wenli, Fu Linchun, Zhang Yi, Wu Ming
Abstract:
To meet the demands of long life and high intelligence for satellites, the electrical power system should be provided with a self-health condition evaluation capability, and any over-stress events during operation should be recorded. Based on linear stress accumulation damage theory, cumulative damage analysis was performed on the combined thermal, mechanical, and electrical stresses of three components: the solar array, the batteries, and the power conditioning unit. An overall health percentage evaluation model for the satellite electrical power system was then built. To obtain an accurate value of the system health percentage, an automatic feedback closed-loop correction method for all coefficients in the evaluation model was presented. The evaluation outputs can serve as a reference for earlier fault forecasting and intervention by the ground control center or by the satellite itself.
Keywords: satellite electrical power system, health percentage, linear stresses accumulation damage, evaluation model
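An illustrative reading of the linear stress accumulation idea, in the spirit of Miner's rule; the event counts and allowable-cycle figures are invented placeholders, not satellite data:

```python
# Each recorded over-stress event contributes n_i / N_i to a component's
# damage; health is reported as (1 - damage) * 100%. All numbers are examples.
def health_percentage(events):
    """events: list of (observed_cycles, allowable_cycles) per stress level."""
    damage = sum(n / N for n, N in events)
    return max(0.0, 1.0 - damage) * 100.0

solar_array = [(120, 10000), (15, 2000)]   # e.g. thermal cycles, mechanical shocks
battery     = [(800, 5000)]                # e.g. charge/discharge over-stress events
power_cond  = [(40, 8000), (5, 500)]

for name, events in [("solar array", solar_array),
                     ("battery", battery),
                     ("power conditioning unit", power_cond)]:
    print(f"{name}: {health_percentage(events):.1f}% health")
```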
Procedia PDF Downloads 411
36884 Two-Warehouse Inventory Model for Deteriorating Items with Inventory-Level-Dependent Demand under Two Dispatching Policies
Authors: Lei Zhao, Zhe Yuan, Wenyue Kuang
Abstract:
This paper studies two-warehouse inventory models for a deteriorating item, considering that demand is influenced by inventory levels. The problem mainly focuses on the optimal order policy and the optimal order cycle with inventory-level-dependent demand in a two-warehouse system for retailers. It considers the different deterioration rates and inventory holding costs in the owned warehouse (OW) and the rented warehouse (RW), as well as transportation costs, allowed shortages, and partial backlogging. Two inventory models are formulated according to the dispatching policy: a last-in-first-out (LIFO) model and a first-in-first-out (FIFO) model, and a comparative analysis of the two is made. The study finds that the FIFO policy is more in line with realistic operating conditions. Especially when the inventory holding cost of the OW is high and there is either no difference or a large difference between the deterioration rates of the OW and RW, the FIFO policy has better applicability. Meanwhile, this paper considers the differences between the effects of warehouse and shelf inventory levels on demand, builds the retailers' inventory decision model, and studies the factors affecting the optimal order quantity, the optimal order cycle, and the average inventory cost per unit time. To minimize the average total cost, the optimal dispatching policies are provided for retailers' decisions.
Keywords: FIFO model, inventory-level-dependent, LIFO model, two-warehouse inventory
Procedia PDF Downloads 279
36883 The Establishment of RELAP5/SNAP Model for Kuosheng Nuclear Power Plant
Authors: C. Shih, J. R. Wang, H. C. Chang, S. W. Chen, S. C. Chiang, T. Y. Yu
Abstract:
After the measurement uncertainty recapture (MUR) power uprate, the power of Kuosheng nuclear power plant (NPP) was raised from 2894 MWt to 2943 MWt. For the power uprate, several codes (e.g., TRACE and RELAP5) were applied to assess the safety of Kuosheng NPP. Hence, the main work of this research is to establish a RELAP5/MOD3.3 model of Kuosheng NPP with the SNAP interface. The establishment of the RELAP5/SNAP model was based on the FSAR, training documents, and a TRACE model that had been developed and verified previously. After the model establishment is completed, the startup test scenarios are applied to the RELAP5/SNAP model. By comparing against the startup test data and the TRACE analysis results, the applicability of the RELAP5/SNAP model is assessed.
Keywords: RELAP5, TRACE, SNAP, BWR
Procedia PDF Downloads 429
36882 Modelling Impacts of Global Financial Crises on Stock Volatility of Nigeria Banks
Authors: Maruf Ariyo Raheem, Patrick Oseloka Ezepue
Abstract:
This research aimed at determining the most appropriate heteroskedastic model for predicting the volatility of 10 major Nigerian banks: Access, United Bank for Africa (UBA), Guaranty Trust, Skye, Diamond, Fidelity, Sterling, Union, ETI and Zenith banks, using daily closing stock prices of each of the banks from 2004 to 2014. The models employed include ARCH (1), GARCH (1, 1), EGARCH (1, 1) and TARCH (1, 1). The results show that all the banks' returns are highly leptokurtic, significantly skewed, and thus non-normal across the four periods, except for Fidelity bank during the financial crisis; these findings are similar to those of other global markets. There is also strong evidence for the presence of heteroscedasticity, and volatility persistence during the crisis is higher than before the crisis across the 10 banks, with UBA taking the lead at about 11 times higher during the crisis. Findings further revealed that asymmetric GARCH models became dominant, especially during and after the financial crisis, when the second round of reforms was introduced into the banking industry by the Central Bank of Nigeria (CBN). Generally, one could say that Nigerian bank returns are volatility persistent during and after the crisis and are characterised by leverage effects of negative and positive shocks during these periods.
Keywords: global financial crisis, leverage effect, persistence, volatility clustering
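As a reference point for the models listed above, a minimal GARCH(1,1) conditional-variance recursion is sketched below; the parameter values are placeholders, not estimates for any of the ten banks:

```python
# sigma_t^2 = omega + alpha * e_{t-1}^2 + beta * sigma_{t-1}^2, the baseline
# symmetric model that EGARCH and TARCH extend with leverage terms.
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    eps = returns - returns.mean()
    sigma2 = np.empty_like(eps)
    sigma2[0] = eps.var()
    for t in range(1, len(eps)):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.02, size=2500)           # stand-in for daily bank returns
sigma2 = garch11_variance(r, omega=1e-6, alpha=0.08, beta=0.90)
print("volatility persistence alpha + beta =", 0.08 + 0.90)
```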
Procedia PDF Downloads 526
36881 Economic Loss due to Ganoderma Disease in Oil Palm
Authors: K. Assis, K. P. Chong, A. S. Idris, C. M. Ho
Abstract:
Oil palm, or Elaeis guineensis, is considered the golden crop in Malaysia, but the oil palm industry in this country is now facing its most devastating disease, Ganoderma basal stem rot. The objective of this paper is to analyze the economic loss due to this disease. Three commercial oil palm sites were selected for collecting the data required for the economic analysis. The yield parameter used to measure the loss was the total weight of fresh fruit bunches over six months. The predictors include disease severity, change in disease severity, number of infected neighbor palms, age of palm, planting generation, topography, and first-order interaction variables. The yield loss estimation model was identified using a backward-elimination-based regression method. Diagnostic checking was conducted on the residuals of the best yield loss model. The mean absolute percentage error (MAPE) was used to measure the forecast performance of the model. The best yield loss model was then used to estimate the economic loss using the current monthly price of fresh fruit bunches at the mill gate.
Keywords: ganoderma, oil palm, regression model, yield loss, economic loss
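A small sketch of the MAPE check mentioned in the abstract, with placeholder observed and predicted bunch weights rather than the study's site data:

```python
# MAPE = mean(|actual - predicted| / actual) * 100, used here to judge the
# forecast performance of a fitted yield-loss regression. Numbers are examples.
import numpy as np

def mape(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

observed  = [850.0, 920.0, 610.0, 740.0]   # six-month bunch weights (illustrative)
predicted = [830.0, 905.0, 655.0, 720.0]   # model output (illustrative)
print(f"MAPE = {mape(observed, predicted):.2f}%")
```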
Procedia PDF Downloads 389
36880 Bifurcation and Chaos of the Memristor Circuit
Authors: Wang Zhulin, Min Fuhong, Peng Guangya, Wang Yaoda, Cao Yi
Abstract:
In this paper, a magnetron memristor model based on the hyperbolic sine function is presented, and its correctness is proved by studying the trajectory of its voltage-current phase; a memristor chaotic system built on this memristor model is then presented. The phase trajectories, bifurcation diagrams, and Lyapunov exponent spectrum of the magnetron memristor system are plotted by numerical simulation, and the chaotic evolution as the system parameters change is also given. The paper includes numerical simulations and a mathematical model, confirming that the system has a wealth of dynamic behavior.
Keywords: memristor, chaotic circuit, dynamical behavior, chaotic system
Procedia PDF Downloads 503
36879 Alcohol-Containing versus Aqueous-Based Solutions for Skin Preparation in Abdominal Surgery: A Systematic Review and Meta-Analysis
Authors: Dimitra V. Peristeri, Hussameldin M. Nour, Amiya Ahsan, Sameh Abogabal, Krishna K. Singh, Muhammad Shafique Sajid
Abstract:
Introduction: The use of optimal skin antiseptic agents for the prevention of surgical site infection (SSI) is of critical importance, especially during abdominal surgical procedures. Alcohol-based chlorhexidine gluconate (CHG) and aqueous-based povidone-iodine (PVI) are the two most commonly used skin antiseptics nowadays. The objective of this article is to evaluate the effectiveness of alcohol-based CHG versus aqueous-based PVI for skin preparation before abdominal surgery in reducing SSIs. Methods: Standard medical databases such as MEDLINE, Embase, PubMed, and the Cochrane Library were searched to find randomised controlled trials (RCTs) comparing alcohol-based CHG skin preparation versus aqueous-based PVI in patients undergoing abdominal surgery. The combined outcomes of SSIs were calculated using an odds ratio (OR) with 95% confidence intervals (95% CI). All data were analysed using Review Manager (RevMan) Software 5.4, and the meta-analysis was performed with a random-effects model. Results: A total of 11 studies, all RCTs, were included (n = 12072 participants), recruiting adult patients undergoing abdominal surgery. In the random-effects model analysis, the use of alcohol-based CHG in patients undergoing abdominal surgery was associated with a reduced risk of SSI compared to aqueous-based PVI (OR: 0.84; 95% CI [0.74, 0.96], z = 2.61, p = 0.009). Conclusion: Alcohol-based CHG may be more effective for preventing the risk of SSI compared to aqueous-based PVI agents in abdominal surgery. The conclusion of this meta-analysis may add a guiding value to reinforce current clinical practice guidelines.
Keywords: skin preparation, surgical site infection, chlorhexidine, skin antiseptics
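For readers unfamiliar with the pooling step, a hedged sketch of a DerSimonian-Laird random-effects pooled odds ratio is given below; the 2x2 counts are invented, and the study's actual analysis was done in RevMan 5.4:

```python
# Each study contributes a log odds ratio and its variance; between-study
# variance tau^2 is estimated by DerSimonian-Laird and used to reweight studies.
import math

studies = [  # (SSI in CHG arm, no SSI in CHG, SSI in PVI arm, no SSI in PVI) - invented
    (30, 470, 42, 458),
    (18, 282, 25, 275),
    (55, 945, 60, 940),
]

y, v = [], []
for a, b, c, d in studies:
    y.append(math.log((a * d) / (b * c)))      # study log odds ratio
    v.append(1 / a + 1 / b + 1 / c + 1 / d)    # its variance

w = [1 / vi for vi in v]
y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
Q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, y))
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (len(studies) - 1)) / C)  # DerSimonian-Laird tau^2

w_re = [1 / (vi + tau2) for vi in v]
pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
print(f"pooled OR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se):.2f}-{math.exp(pooled + 1.96 * se):.2f})")
```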
Procedia PDF Downloads 110
36878 A Collaborative Learning Model in Engineering Science Based on a Cyber-Physical Production Line
Authors: Yosr Ghozzi
Abstract:
The Cyber-Physical Systems terminology has been well received by the industrial community and specifically appropriated in educational settings. Indeed, our latest educational activities are based on the development of experimental platforms on an industrial scale. We built a collaborative learning model following an international market study that led us to place ourselves at the heart of this technology. To align with these findings, a competency-based approach study was conducted, and the program content was revised to reflect the project-based approach. Thus, this article deals with the development of educational devices according to the generated curriculum and specific educational activities, while respecting the adopted skills repository that underpins the educational cyber-physical production systems and the laboratories compliant with and adapted to them. The implementation of these platforms was systematically carried out in the school's workshop spaces. The objective has been twofold: research and teaching for students in mechatronics and logistics in the electromechanical department. We act as trainers and industrial experts to involve students in the implementation of possible extension systems around multidisciplinary projects and to reconnect with industrial projects for better professional integration.
Keywords: education 4.0, competency-based learning, teaching factory, project-based learning, cyber-physical systems, industry 4.0
Procedia PDF Downloads 107
36877 Markov Switching of Conditional Variance
Authors: Josip Arneric, Blanka Skrabic Peric
Abstract:
Forecasting of volatility, i.e. return fluctuations, has been a topic of interest to portfolio managers, option traders and market makers seeking higher profits or less risky positions. Based on the fact that volatility is time-varying in high-frequency data and that periods of high volatility tend to cluster, the most commonly used models are GARCH-type models. As standard GARCH models show high volatility persistence, i.e. integrated behaviour of the conditional variance, it is difficult to predict volatility using them. Due to the practical limitations of these models, different approaches based on Markov switching models have been proposed in the literature. In such situations, models in which the parameters are allowed to change over time are more appropriate because they allow some part of the model to depend on the state of the economy. The empirical analysis demonstrates that the Markov switching GARCH model resolves the problem of excessive persistence and outperforms uni-regime GARCH models in forecasting volatility for selected emerging markets.
Keywords: emerging markets, Markov switching, GARCH model, transition probabilities
Procedia PDF Downloads 455
36876 Simulation of Nonlinear Behavior of Reinforced Concrete Slabs Using Rigid Body-Spring Discrete Element Method
Authors: Felix Jr. Garde, Eric Augustus Tingatinga
Abstract:
Most analysis procedures for reinforced concrete (RC) slabs are based on elastic theory. When subjected to large forces, however, slabs deform beyond the elastic range, and the study of their behavior and performance requires nonlinear analysis. This paper presents a numerical model to simulate the nonlinear behavior of RC slabs using the rigid body-spring discrete element method. The proposed slab model, composed of rigid plate elements and nonlinear springs, is based on yield line theory, which assumes that the nonlinear behavior of an RC slab subjected to transverse loads is concentrated in plastic or yield lines. In this model, the displacement of the slab is completely described by the rigid elements, and the deformation energy is concentrated in the flexural springs uniformly distributed along the potential yield lines. The spring parameters are determined by comparing the transverse displacements and stresses developed in the slab, obtained using FEM and the proposed model with an assumed homogeneous material. Numerical models of typical RC slabs with varying geometry, reinforcement, support conditions, and loading conditions show reasonable agreement with available experimental data. The model was also shown to be useful in investigating the dynamic behavior of slabs.
Keywords: RC slab, nonlinear behavior, yield line theory, rigid body-spring discrete element method
Procedia PDF Downloads 323
36875 E-Bike FE Model Analysis: Connection Stiffness of Elements with Different DOFs
Authors: Lele Zhang, Hui Leng Choo, Alexander Konyukhov, Shuguang Li
Abstract:
A finite element (FE) model of a simplified e-bike structure was generated from the main frame with two tiers, consisting of pipe, mass, beam, and shell elements (PIPE289, BEAM188, SHELL181, SHELL281, COMBIN14, LINK11, MASS21). These elements are introduced and demonstrated using mathematical formulas. Based on coupling theory, constraint equations were proposed. Using all the parameters obtained from the theoretical part, the connection stiffness matrix between these elements for the whole e-bike structure was determined.
Keywords: coupling theory, stiffness matrix, e-bike, finite element model
Procedia PDF Downloads 375
36874 Expounding on the Role of Sustainability Values (SVs) on Consumers' Switching Intentions Regarding Disruptive 5G Technology in China
Authors: Sayed Kifayat Shah, Tang Zhongjun, Mohammad Ahmad, Sohaib Mostafa
Abstract:
This article investigates consumers' intention to shift to 5G in the light of disruptive technology innovation. Switching from 4G (existing) technology to 5G (disruptive) technology involves not just economic benefits and costs but other values too, which have not yet been examined in the framework of technology innovation. This study extended the value adaptation model (VAM) by proposing the sustainability values (SVs) construct. The model was examined on data from 361 Chinese consumers using the partial least squares-based structural equation modelling (PLS-SEM) technique. The outcomes confirm a significant effect of sustainability values (SVs) on consumers' switching intentions toward the disruptive 5G technology. The findings of this research will be helpful to telecom firms in developing consumer retention strategies. Some limitations and the importance of the research for scholars and managers are also discussed.
Keywords: value adaptation model (VAM), sustainability values (SVs), disruptive 5G technology, switching intentions (SI), partial least squares-based structural equation modelling (PLS-SEM)
Procedia PDF Downloads 148
36873 Volatility and Stylized Facts
Authors: Kalai Lamia, Jilani Faouzi
Abstract:
Measuring and controlling risk is one of the most attractive issues in finance. With the persistence of uncontrolled and erratic stock movements, volatility is perceived as a barometer of daily fluctuations. An objective measure of this variable therefore seems necessary to control risks and cover those considered most important. Non-linear autoregressive modeling is our first evaluation approach. In particular, we test for the presence of "persistence" in the conditional variance and for a degree of leverage effect. To address the problem of "asymmetry" in volatility, the retained specifications point to the importance of stock reactions in response to news. The effects of shocks on volatility also highlight the need to study the "long-term" behaviour of the conditional variance of stock returns and to articulate the presence of long memory and long-run dependence in the time series. We note that the fractionally integrated autoregressive model allows for representing time series that show long-term conditional variance thanks to fractional integration parameters. To capture the dynamics that drive the time series, a comparative study of the results of the different models will allow for a better understanding of the volatility structure of the Tunisian stock market, with the aim of accurately predicting fluctuation risks.
Keywords: asymmetry volatility, clustering, stylised facts, leverage effect
Procedia PDF Downloads 299
36872 Implementation of an Economic – Probabilistic Model to Risk Analysis of ERP Project in Technological Innovation Firms – A Case Study of ICT Industry in Iran
Authors: Reza Heidari, Maryam Amiri
Abstract:
In a technological world, many countries tend to fortify their companies and technological infrastructures. One of the most important requirements for developing technology is innovation, so all companies strive to adopt innovation as a basic principle. Since the expansion of a product needs to combine different technologies, different innovative projects are run in firms as a basis of technology development. In such an environment, enterprise resource planning (ERP) has special significance for developing and strengthening innovations. In this article, an economic-probabilistic analysis is provided for an ERP implementation project in technological innovation (TI) based firms. The model used in this article simultaneously performs risk and economic analysis in view of the probability of each event, jointly combining the economic approach and the risk investigation approach. To provide an economic-probabilistic analysis of project risk, the activities and milestones in the cash flow were extracted, and the probability of occurrence of each of them was assessed. Since resource planning in an innovative firm is the object of this project, we extracted the various risks related to the innovative project and evaluated them in the form of cash flow. By considering the risks affecting the project and the probability of each of them, and assigning them to the project's cash flow categories, the model presents a risk-adjusted cash flow based on Net Present Value (NPV) with a probabilistic simulation approach. Indeed, this model presents a risk-adjusted economic analysis of the project: it measures the NPV of the project, focusing on the risks that most affect technological innovation projects, and then measures the probability associated with the NPV for each category. The application of the presented model in the information and communication technology (ICT) industry provided an appropriate analysis of project feasibility from the point of view of cash flow, based on the impact of risk on the project. The obtained results can be given to decision makers so that they can have a practical, systematic, and risk-moderated economic analysis of the project's feasibility.
Keywords: cash flow categorization, economic evaluation, probabilistic, risk assessment, technological innovation
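An illustrative sketch of the risk-adjusted NPV idea: each risk has a probability and a cash-flow impact, and Monte Carlo simulation produces an NPV distribution. All figures are invented for illustration, not the ICT case-study data:

```python
# Sample which risks occur in each simulation run, adjust the affected cash
# flow categories, and accumulate the resulting NPV distribution.
import numpy as np

rng = np.random.default_rng(42)
base_cash_flow = np.array([-500_000, 120_000, 180_000, 220_000, 260_000], float)
risks = [  # (probability of occurrence, (impact, affected year)) - invented
    (0.30, (-60_000, 1)),   # e.g. integration delay hits year-1 inflow
    (0.15, (-90_000, 2)),   # e.g. customization overrun hits year-2 inflow
]
rate, n_sim = 0.12, 10_000

discount = (1 + rate) ** np.arange(len(base_cash_flow))
npvs = np.empty(n_sim)
for i in range(n_sim):
    cf = base_cash_flow.copy()
    for p, (impact, year) in risks:
        if rng.random() < p:
            cf[year] += impact
    npvs[i] = np.sum(cf / discount)

print(f"mean risk-adjusted NPV: {npvs.mean():,.0f}")
print(f"P(NPV > 0): {(npvs > 0).mean():.2%}")
```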
Procedia PDF Downloads 403
36871 Web-Based Learning in Nursing: The Sample of Delivery Lesson Program
Authors: Merve Kadioğlu, Nevin H. Şahin
Abstract:
Purpose: This research was organized to determine the influence of a web-based learning program developed to provide information about normal delivery skills, one of the topics covered by nursing students who take the women's health and illness course. Material and Methods: The study used a pre-test/post-test single-group quasi-experimental design. The pilot study group consisted of 28 nursing students who agreed to participate in the study. The data were gathered via web-based technologies: a student information form, information evaluation tests, the Web-Based Training Material Evaluation Scale, and a web-based learning environment feedback form. Percentages, frequencies, and the Wilcoxon signed-rank test were used in the data analysis. The web-based instruction program was developed in light of the full learning model, Mayer's research-based multimedia development principles, and Gagne's instructional activities model. Findings: The average scores on the web-based educational material evaluation scale were: 'Instructional Suitability' 4.45, 'Suitability to Educational Program' 4.48, 'Visual Adequacy' 4.53, 'Programming Eligibility / Technical Adequacy' 4.00. The participants also stated that the program was successful and useful. A significant difference was found between the pre-test and post-test results of the seven modules (p < 0.05). Results: According to the pilot study data, the program was rated 'very good' by the study group. It was also found to be effective in increasing knowledge about normal labor.
Keywords: normal delivery, web-based learning, nursing students, e-learning
Procedia PDF Downloads 178
36870 Space Tourism Pricing Model Revolution from Time Independent Model to Time-Space Model
Authors: Kang Lin Peng
Abstract:
Space tourism emerged in 2001 and became famous in 2021, following the development of space technology. The space market is distorted because of excess demand. Space tourism is currently rare and extremely expensive, with biased luxury product pricing; it is a seller's market in which consumers cannot bargain. Spaceship companies such as Virgin Galactic, Blue Origin, and SpaceX have charged space tourism prices from 200 thousand to 55 million dollars, depending on the altitude reached in space. There should be a reasonable price set on a fair basis. This study aims to derive a spacetime pricing model, which is different from the general pricing model on the earth's surface. We apply general relativity theory to derive the mathematical formula for the space tourism pricing model, which subsumes the traditional time-independent model. In the future, the price of space travel will be different from current flight travel when space travel is measured in light-year units. The pricing of general commodities mainly considers the general equilibrium of supply and demand. A pricing model that considers risks and returns over a time variable is acceptable when commodities are on the earth's surface, called flat spacetime. Current economic theories based on the independent time scale in flat spacetime do not consider the curvature of spacetime. Current flight services flying at heights of 6, 12, and 19 kilometers are charged with a pricing model that treats the time coordinate independently. However, space tourism involves flying at heights of 100 to 550 kilometers, which enlarges the spacetime curvature: tourists move from zero curvature on the earth's surface to the large curvature of space. Different spacetime spans should therefore be considered in the pricing model of space travel to echo general relativity theory. Intuitively, pricing this spacetime commodity needs to account for the change in spacetime curvature from the earth to space. We can assign a value to each unit of spacetime curvature, corresponding to the gradient change of the Ricci or energy-momentum tensor. The amount to pay is then obtained by integrating over the spacetime path from the earth to space. The concept is to add a price component p corresponding to general relativity theory. On the earth's surface, the space travel pricing model degenerates into a time-independent model, which is the traditional commodity pricing model. The contribution is that deriving the space tourism pricing model will be a breakthrough in philosophical and practical issues for space travel. The results of the space tourism pricing model extend the traditional time-independent flat spacetime model. A pricing model that embeds spacetime in line with general relativity theory can better reflect the rationality and accuracy of space travel pricing on the universal scale. Moving from the independent-time scale to the spacetime scale will bring a brand-new pricing concept for space travel commodities. Fair and efficient spacetime economics will also benefit human travel when we can travel in light-year units in the future.
Keywords: space tourism, spacetime pricing model, general relativity theory, spacetime curvature
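One hedged way to write down the proposed price component (our illustrative notation, not the authors' final formula): the flat-spacetime price is augmented by a term that integrates a valuation of the curvature met along the trip, so the model reduces to the time-independent price when the curvature contribution vanishes:

```latex
% Illustrative notation only: p_0 is the flat-spacetime (time-independent)
% price, R(r) a curvature measure along the radial path from the earth's
% surface r_E to the trip's maximum altitude r_max, and f a valuation of a
% unit of curvature with f(0) = 0, so the formula degenerates to p_0 on earth.
p(\text{trip}) \;=\; p_0 \;+\; \lambda \int_{r_E}^{r_{\max}} f\!\big(R(r)\big)\,\mathrm{d}r .
```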
Procedia PDF Downloads 128
36869 Formal Verification of Cache System Using a Novel Cache Memory Model
Authors: Guowei Hou, Lixin Yu, Wei Zhuang, Hui Qin, Xue Yang
Abstract:
Formal verification is proposed to ensure the correctness of the design and make functional verification more efficient. As the cache plays a vital role in the design of a System on Chip (SoC), and a cache with a Memory Management Unit (MMU) and cache memory unit makes the state space too large for simulation to verify, formal verification is presented for such a system design. In this paper, a formal model checking verification flow is suggested and a new cache memory model, called the "exhaustive search model", is proposed. Instead of using a large RAM to represent the whole cache memory, the exhaustive search model employs just two cache blocks. Since the cache system contains a data cache (Dcache) and an instruction cache (Icache), the Dcache memory model and the Icache memory model are established separately using the same mechanism. Finally, the novel model is employed in the verification of a cache that is a module of a custom-built SoC system used in practice; the result shows that the cache system is verified correctly using the exhaustive search model, which makes the verification much more manageable and flexible.
Keywords: cache system, formal verification, novel model, system on chip (SoC)
Procedia PDF Downloads 496
36868 Development of Simple-To-Apply Biogas Kinetic Models for the Co-Digestion of Food Waste and Maize Husk
Authors: Owamah Hilary, O. C. Izinyon
Abstract:
Many existing biogas kinetic models are difficult to apply to substrates they were not developed for, as they are substrate specific. A biodegradability kinetic (BIK) model and a maximum biogas production potential and stability assessment (MBPPSA) model were therefore developed in this study for the anaerobic co-digestion of food waste and maize husk. The biodegradability constant (k) was estimated as 0.11 d⁻¹ using the BIK model. The results for the maximum biogas production potential (A) obtained using the MBPPSA model corresponded well with the results obtained using the popular but complex modified Gompertz model for digesters B-1, B-2, B-3, B-4, and B-5. The If value of the MBPPSA model also showed that digesters B-3, B-4, and B-5 were stable, while B-1 and B-2 were unstable. A similar stability observation was also obtained using the modified Gompertz model. The MBPPSA model can therefore be used as an alternative model for anaerobic digestion feasibility studies and plant design.
Keywords: biogas, inoculum, model development, stability assessment
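For comparison, the modified Gompertz curve that the MBPPSA model is benchmarked against can be written in its usual cumulative-biogas form; the parameter values below are placeholders, not the fitted values for digesters B-1 to B-5:

```python
# B(t) = A * exp(-exp(Rm*e/A * (lam - t) + 1)), where A is the maximum biogas
# production potential, Rm the maximum production rate, and lam the lag phase.
import math

def modified_gompertz(t, A, Rm, lam):
    """Cumulative biogas yield at time t (days)."""
    return A * math.exp(-math.exp(Rm * math.e / A * (lam - t) + 1.0))

A, Rm, lam = 450.0, 18.0, 3.5     # mL/g VS, mL/g VS/day, days (illustrative)
for day in (0, 5, 10, 20, 40):
    print(f"day {day:2d}: {modified_gompertz(day, A, Rm, lam):7.1f} mL/g VS")
```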
Procedia PDF Downloads 429
36867 Stackelberg Security Game for Optimizing Security of Federated Internet of Things Platform Instances
Authors: Violeta Damjanovic-Behrendt
Abstract:
This paper presents an approach for optimal cyber security decisions to protect instances of a federated Internet of Things (IoT) platform in the cloud. The presented solution implements the repeated Stackelberg Security Game (SSG) and a model called Stochastic Human behaviour model with AttRactiveness and Probability weighting (SHARP). SHARP employs the Subjective Utility Quantal Response (SUQR) for formulating a subjective utility function, which is based on the evaluations of alternative solutions during decision-making. We augment the repeated SSG (including SHARP and SUQR) with a reinforcement learning algorithm called Naïve Q-Learning. Naïve Q-Learning belongs to the category of active and model-free Machine Learning (ML) techniques in which the agent (either the defender or the attacker) attempts to find an optimal security solution. In this way, we combine game theory (GT) and ML algorithms for discovering optimal cyber security policies. The proposed security optimization components will be validated in a collaborative cloud platform that is based on the Industrial Internet Reference Architecture (IIRA) and its recently published security model.
Keywords: security, internet of things, cloud computing, stackelberg game, machine learning, naive q-learning
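A minimal tabular Q-learning loop of the kind the abstract calls Naïve Q-Learning is sketched below; the states, actions, and reward are toy stand-ins, not the federated IoT platform model:

```python
# Q[s, a] <- Q[s, a] + alpha * (r + gamma * max_a' Q[s', a'] - Q[s, a])
# with epsilon-greedy exploration; here the "defender" explores toy security actions.
import random
from collections import defaultdict

states  = ["baseline", "hardened"]
actions = ["patch", "monitor", "rotate_keys"]
Q = defaultdict(float)                      # Q[(state, action)]
alpha, gamma, epsilon = 0.1, 0.9, 0.2

def reward(state, action):                  # toy payoff: attack blocked vs. action cost
    return random.choice([1.0, -0.5]) - 0.1

state = "baseline"
for _ in range(5000):
    if random.random() < epsilon:           # explore
        action = random.choice(actions)
    else:                                   # exploit current estimate
        action = max(actions, key=lambda a: Q[(state, a)])
    r = reward(state, action)
    next_state = random.choice(states)
    best_next = max(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += alpha * (r + gamma * best_next - Q[(state, action)])
    state = next_state

print({k: round(v, 2) for k, v in Q.items()})
```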
Procedia PDF Downloads 354
36866 Low Light Image Enhancement with Multi-Stage Interconnected Autoencoders Integration in Pix to Pix GAN
Authors: Muhammad Atif, Cang Yan
Abstract:
The enhancement of low-light images is a significant area of study aimed at improving the quality of images captured in challenging lighting environments. Recently, methods based on convolutional neural networks (CNN) have gained prominence as they offer state-of-the-art performance. However, many CNN-based approaches rely on increasing the size and complexity of the neural network. In this study, we propose an alternative method for improving low-light images using an autoencoder-based multiscale knowledge transfer model. Our method leverages the power of three autoencoders, where the encoders of the first two autoencoders are directly connected to the decoder of the third autoencoder, and the decoders of the first two autoencoders are connected to the encoder of the third autoencoder. This architecture enables effective knowledge transfer, allowing the third autoencoder to learn and benefit from the enhanced knowledge extracted by the first two autoencoders. We further integrate the proposed model into the Pix to Pix GAN framework. By integrating our proposed model as the generator in the GAN framework, we aim to produce enhanced images that not only exhibit improved visual quality but also possess a more authentic and realistic appearance. The experimental results, both qualitative and quantitative, show that our method outperforms state-of-the-art methodologies.
Keywords: low light image enhancement, deep learning, convolutional neural network, image processing
Procedia PDF Downloads 80
36865 Current Status and a Forecasting Model of Community Household Waste Generation: A Case Study on Ward 24 (Nirala), Khulna, Bangladesh
Authors: Md. Nazmul Haque, Mahinur Rahman
Abstract:
The objective of the research is to determine the quantity of household waste generated and forecast the future condition of Ward No 24 (Nirala). To do so, three core issues are examined: (i) the capacity and service area of the dumping stations; (ii) the present waste generation amount per capita per day; (iii) the responsibility of the local authority in household waste collection. This research relied on field survey-based data collection from all stakeholders and GIS-based secondary analysis of waste collection points and their coverage. However, such studies are mostly based on inherent forecasting approaches and cannot predict the amount of waste correctly. The findings of this study suggest that Nirala is a formal residential area with a better, self-controlled waste collection system. Here, a forecasting model is proposed for waste generation as Y = -2250387 + 1146.1 * X, where X is the year.
Keywords: eco-friendly environment, household waste, linear regression, waste management
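Using the reported equation directly, a short sketch of how the forecast would be applied (the unit of Y is not restated in the abstract, so it is left unspecified here):

```python
# Linear forecast of household waste generation for Ward 24, Y = -2250387 + 1146.1 * X,
# evaluated for a few future calendar years.
def forecast_waste(year):
    return -2250387 + 1146.1 * year

for year in (2025, 2030, 2035):
    print(year, round(forecast_waste(year), 1))
```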
Procedia PDF Downloads 285
36864 Molding Properties of Cobalt-Chrome-Based Feedstocks Used in Low-Pressure Powder Injection Molding
Authors: Ehsan Gholami, Vincent Demers
Abstract:
Low-pressure powder injection molding is an emerging technology for cost-effectively producing complex-shaped metallic parts with the proper dimensional tolerances, either in high or in low production volumes. In this study, the molding properties of cobalt-chrome-based feedstocks were evaluated for use in a low-pressure powder injection molding process. The rheological properties of the feedstock formulations were obtained by mixing metallic powder with a proprietary wax-based binder system. Rheological parameters such as the reference viscosity, the shear rate sensitivity index, and the activation energy for viscous flow were extracted from the viscosity profiles and introduced into the Weir model to calculate the moldability index. Feedstocks were experimentally injected into a spiral mold cavity to validate the injection performance calculated with the model.
Keywords: binder, feedstock, moldability, powder injection molding, viscosity
Procedia PDF Downloads 273
36863 Time/Temperature-Dependent Finite Element Model of Laminated Glass Beams
Authors: Alena Zemanová, Jan Zeman, Michal Šejnoha
Abstract:
The polymer foil used for manufacturing laminated glass members behaves in a viscoelastic, temperature-dependent manner. This contribution aims at incorporating the time/temperature-dependent behavior of the interlayer into our earlier elastic finite element model for laminated glass beams. The model is based on a refined beam theory: each layer behaves according to the finite-strain shear deformable formulation by Reissner, and the adjacent layers are connected via Lagrange multipliers ensuring the inter-layer compatibility of the laminated unit. The time/temperature-dependent behavior of the interlayer is accounted for by the generalized Maxwell model and by the time-temperature superposition principle due to Williams, Landel, and Ferry. The resulting system is solved by the Newton method with consistent linearization, and the viscoelastic response is determined incrementally by the exponential algorithm. By comparing the model predictions against available experimental data, we demonstrate that the proposed formulation is reliable and accurately reproduces the behavior of the laminated glass units.
Keywords: finite element method, finite-strain Reissner model, Lagrange multipliers, generalized Maxwell model, laminated glass, Newton method, Williams-Landel-Ferry equation
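A hedged sketch of the interlayer material model described above: a generalized Maxwell (Prony series) relaxation modulus evaluated at a reduced time obtained from the Williams-Landel-Ferry shift factor. The Prony terms and WLF constants are placeholders, not the calibrated interlayer data:

```python
# G(t, T) = G_inf + sum_i G_i * exp(-t_red / tau_i), with t_red = t / a_T and
# log10 a_T = -C1 (T - T_ref) / (C2 + T - T_ref)  (time-temperature superposition).
import math

C1, C2, T_REF = 8.86, 101.6, 20.0             # WLF constants (illustrative)
G_INF = 0.05                                   # long-term shear modulus, MPa
PRONY = [(1.2, 1e-2), (0.6, 1e0), (0.3, 1e2)]  # (G_i in MPa, tau_i in s)

def wlf_shift(T):
    return 10.0 ** (-C1 * (T - T_REF) / (C2 + (T - T_REF)))

def relaxation_modulus(t, T):
    t_red = t / wlf_shift(T)                   # reduced time
    return G_INF + sum(G * math.exp(-t_red / tau) for G, tau in PRONY)

for temp in (0.0, 20.0, 40.0):
    print(f"G(t=10 s, T={temp:4.1f} C) = {relaxation_modulus(10.0, temp):.3f} MPa")
```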
Procedia PDF Downloads 431
36862 Probabilistic Model for Evaluating Seismic Soil Liquefaction Based on Energy Approach
Authors: Hamid Rostami, Ali Fallah Yeznabad, Mohammad H. Baziar
Abstract:
The energy-based method for evaluating seismic soil liquefaction has two main components. The first is the demand energy, which is the energy dissipated by an earthquake at a site, and the second is the capacity energy, a representation of soil resistance against liquefaction hazard. In this study, using a statistical analysis of data recorded by 14 down-hole array sites in California, an empirical equation was developed to estimate the demand energy at a site. Because determining the capacity energy at a site requires calculating several site calibration factors, which are obtained by experimental tests, the standard penetration test (SPT) N-value was used in this study as an alternative to the capacity energy. Based on this assumption, the empirical equation was employed to calculate the demand energy for 193 liquefied and non-liquefied sites, and these values were then plotted against the corresponding SPT numbers for all sites. Subsequently, a discrimination analysis was employed to determine the equations of several boundary curves for various liquefaction likelihoods. Finally, a comparison was made between the probabilistic model and the commonly used stress-based method. In conclusion, the results clearly showed that the energy-based method can be more reliable than the conventional stress-based method in evaluating liquefaction occurrence.
Keywords: energy demand, liquefaction, probabilistic analysis, SPT number
Procedia PDF Downloads 367
36861 The Role of Artificial Intelligence Algorithms in Psychiatry: Advancing Diagnosis and Treatment
Authors: Netanel Stern
Abstract:
Artificial intelligence (AI) algorithms have emerged as powerful tools in the field of psychiatry, offering new possibilities for enhancing diagnosis and treatment outcomes. This article explores the utilization of AI algorithms in psychiatry, highlighting their potential to revolutionize patient care. Various AI algorithms, including machine learning, natural language processing (NLP), reinforcement learning, clustering, and Bayesian networks, are discussed in detail. Moreover, ethical considerations and future directions for research and implementation are addressed.
Keywords: AI, software engineering, psychiatry, neuroimaging
Procedia PDF Downloads 116
36860 Physical Theory for One-Dimensional Correlated Electron Systems
Authors: Nelson Nenuwe
Abstract:
The behavior of interacting electrons in one dimension was studied by calculating correlation functions and critical exponents at zero and finite external magnetic fields for arbitrary band filling. The technique employed in this study is based on conformal field theory (CFT). The charge and spin degrees of freedom are separated and described by two independent conformal theories. A detailed comparison of the t-J model with the repulsive Hubbard model was then undertaken, with emphasis on their Tomonaga-Luttinger (TL) liquid properties. Near half-filling, the exponents of the t-J model take the values of the strong-correlation limit of the Hubbard model, and in the low-density limit the exponents are those of a non-interacting system. The critical exponents obtained in this study belong to the repulsive TL liquid (conducting phase) and the attractive TL liquid (superconducting phase). The theoretical results from this study find applications in one-dimensional organic conductors (TTF-TCNQ), organic superconductors (Bechgaard salts) and carbon nanotubes (SWCNTs, DWCNTs and MWCNTs). For instance, the critical exponent obtained in this study is consistent with the experimental result from optical and photoemission evidence of TL liquid behavior in the one-dimensional metallic Bechgaard salt (TMTSF)2PF6.
Keywords: critical exponents, conformal field theory, Hubbard model, t-J model
Procedia PDF Downloads 343
36859 Performance Evaluation of Various Segmentation Techniques on MRI of Brain Tissue
Authors: U.V. Suryawanshi, S.S. Chowhan, U.V Kulkarni
Abstract:
The accuracy of segmentation methods is of great importance in brain image analysis. Tissue classification in magnetic resonance brain images (MRI) is an important issue in the analysis of several brain dementias. This paper portrays the performance of segmentation techniques used on brain MRI. A large variety of algorithms for the segmentation of brain MRI has been developed. The objective of this paper is to perform a segmentation process on MR images of the human brain, using fuzzy c-means (FCM), kernel-based fuzzy c-means clustering (KFCM), spatial fuzzy c-means (SFCM), and improved fuzzy c-means (IFCM). The review covers imaging modalities, MRI, methods for noise reduction, and segmentation approaches. All methods are applied to MRI brain images degraded by salt-and-pepper noise; the results demonstrate that the IFCM algorithm is more robust to noise than the standard FCM algorithm. We conclude with a discussion of trends in future brain segmentation research and of changes to IFCM norms for better results.
Keywords: image segmentation, preprocessing, MRI, FCM, KFCM, SFCM, IFCM
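A compact standard fuzzy c-means (FCM) loop, the baseline that KFCM, SFCM, and IFCM extend, applied here to synthetic 1-D intensities as a stand-in for MRI voxel values (not the authors' implementation):

```python
# Alternate the two FCM updates: centers c_j from membership-weighted means,
# memberships u_ij from inverse distance ratios raised to 2/(m-1).
import numpy as np

def fcm(X, c=3, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))        # fuzzy memberships, rows sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
    return centers, U

X = np.concatenate([np.random.default_rng(1).normal(mu, 5, (200, 1))
                    for mu in (30.0, 100.0, 170.0)])  # three synthetic tissue classes
centers, U = fcm(X, c=3)
print("cluster centres:", np.sort(centers.ravel()).round(1))
```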
Procedia PDF Downloads 331