Search results for: computational intelligence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3306


846 A Mechanical Diagnosis Method Based on Vibration Fault Signal Down-Sampling and the Improved One-Dimensional Convolutional Neural Network

Authors: Bowei Yuan, Shi Li, Liuyang Song, Huaqing Wang, Lingli Cui

Abstract:

Convolutional neural networks (CNNs) have received extensive attention in the field of fault diagnosis, and many fault diagnosis methods use CNNs for fault type identification. However, when the amount of raw data collected by sensors is massive, the neural network has to perform a time-consuming classification task. In this paper, a mechanical fault diagnosis method based on vibration signal down-sampling and an improved one-dimensional convolutional neural network is proposed. Through robust principal component analysis, the low-rank feature matrix of a large amount of raw data can be separated, and down-sampling is then performed to reduce the subsequent computational load. In the improved one-dimensional CNN, a smaller convolution kernel is used to reduce the number of parameters and the computational complexity, and regularization is introduced before the fully connected layer to prevent overfitting. In addition, the multiple connected layers generalize classification results better without cumbersome parameter adjustment. The effectiveness of the method is verified by monitoring the signal of a centrifugal pump test bench, and the average test accuracy is above 98%. Compared with the traditional deep belief network (DBN) and support vector machine (SVM) methods, the proposed method performs better.
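As a rough illustration of the kind of network the abstract describes (small convolution kernels plus dropout regularization before the fully connected layers), the following is a minimal sketch; the layer sizes, signal length and number of fault classes are assumptions for illustration, not values from the paper.

```python
# Minimal 1D-CNN sketch for vibration-signal fault classification (illustrative only).
# Kernel sizes, channel counts and the number of fault classes are assumptions.
import torch
import torch.nn as nn

class Small1DCNN(nn.Module):
    def __init__(self, n_classes=4, signal_len=1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1),   # small kernel keeps the parameter count low
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5),                               # regularization before the fully connected layers
            nn.Linear(32 * (signal_len // 16), 64),
            nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):                                  # x: (batch, 1, signal_len)
        return self.classifier(self.features(x))

model = Small1DCNN()
dummy = torch.randn(8, 1, 1024)                            # eight down-sampled vibration segments
print(model(dummy).shape)                                  # torch.Size([8, 4])
```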

Keywords: fault diagnosis, vibration signal down-sampling, 1D-CNN

Procedia PDF Downloads 107
845 Career Guidance System Using Machine Learning

Authors: Mane Darbinyan, Lusine Hayrapetyan, Elen Matevosyan

Abstract:

Artificial Intelligence in Education (AIED) was created to help students get ready for the workforce, and over the past 25 years it has grown significantly, offering a variety of technologies to support academic, institutional, and administrative services. However, this remains challenging, especially considering the labor market's rapid change. While choosing a career, people face various obstacles because they do not take their own preferences into consideration, which can lead to many other problems such as shifting jobs, work stress, occupational infirmity, reduced productivity, and manual error. Besides preferences, people should properly evaluate their technical and non-technical skills, as well as their personalities. Professional counseling has become a difficult undertaking for counselors due to the wide range of career choices brought on by changing technological trends. It is necessary to close this gap by utilizing technology that makes sophisticated predictions about a person's career goals based on their personality. Hence, there is a need to create an automated model that helps in decision-making based on user inputs. Improving career guidance can be achieved by embedding machine learning into the career consulting ecosystem. Various career guidance systems work on the same logic: classifying applicants, matching applications with appropriate departments or jobs, making predictions, and providing suitable recommendations. Methodologies such as KNN, neural networks, K-means clustering, decision trees, and many other advanced algorithms are applied to the collected data and help predict suitable careers. Besides helping users with their career choice, these systems provide numerous opportunities that are very useful while making this hard decision. They help candidates recognize where they specifically lack sufficient skills so that they can improve those skills. They are also capable of offering an e-learning platform that takes into account the user's gaps in knowledge. Furthermore, users can be provided with details on a particular job, such as the abilities required to excel in that industry.
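A small sketch of the classification step such systems rely on, here with the KNN method mentioned in the abstract; the feature names, scores and career labels are invented for illustration only.

```python
# Illustrative KNN career-recommendation sketch; features, labels and data are invented.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# columns: [math, programming, communication, creativity] self-assessed scores on a 0-10 scale
X = np.array([[9, 8, 4, 3], [8, 9, 5, 4], [3, 2, 9, 8],
              [4, 3, 8, 9], [7, 6, 7, 5], [2, 3, 9, 7]])
y = np.array(["engineering", "engineering", "marketing",
              "design", "data science", "marketing"])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)
clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

candidate = np.array([[8, 7, 5, 4]])           # a new user's skill/personality profile
print(clf.predict(candidate))                   # suggested career field
print(clf.predict_proba(candidate))             # confidence over the known fields
```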

Keywords: career guidance system, machine learning, career prediction, predictive decision, data mining, technical and non-technical skills

Procedia PDF Downloads 48
844 Physics Informed Deep Residual Networks Based Type-A Aortic Dissection Prediction

Authors: Joy Cao, Min Zhou

Abstract:

Purpose: Acute Type A aortic dissection is well known for its extremely high mortality rate. A highly accurate and cost-effective non-invasive predictor is critically needed so that the patient can be treated at an earlier stage. Although various CFD approaches have been tried to establish prediction frameworks, they are sensitive to uncertainty in both image segmentation and boundary conditions. Requirements for tedious pre-processing and demanding calibration procedures further compound the issue, hampering their clinical applicability. Using the latest physics-informed deep learning methods to establish an accurate and cost-effective predictor framework is among the main goals for better Type A aortic dissection treatment. Methods: By training a novel physics-informed deep residual network, with non-invasive 4D MRI displacement vectors as inputs, the trained model can cost-effectively calculate the relevant biomarkers (aortic blood pressure, WSS, and OSI), which are used to predict potential Type A aortic dissection and thus avoid high-mortality events down the road. Results: The proposed deep learning method has been successfully trained and tested with both a synthetic 3D aneurysm dataset and a clinical dataset in the aortic dissection context using the Google Colab environment. In both cases, the model has generated aortic blood pressure, WSS, and OSI results matching the expected patient health status. Conclusion: The proposed novel physics-informed deep residual network shows great potential to provide a cost-effective, non-invasive predictor framework. An additional physics-based de-noising algorithm will be added to make the model more robust to clinical data noise. Further studies will be conducted in collaboration with large institutions such as the Cleveland Clinic, using more clinical samples, to further improve the model's clinical applicability.
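As a schematic illustration of how a physics-informed residual network combines a data-fitting loss with a physics-residual penalty, the sketch below uses placeholder network sizes, inputs and a generic gradient-based physics term; it is not the authors' actual formulation.

```python
# Schematic physics-informed residual network (illustrative; sizes and the physics
# residual are placeholders, not the authors' model).
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    def __init__(self, width):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(width, width), nn.Tanh(),
                                 nn.Linear(width, width))
    def forward(self, x):
        return x + self.net(x)                 # residual (skip) connection

class PIResNet(nn.Module):
    def __init__(self, in_dim=3, width=64, out_dim=3, blocks=4):
        super().__init__()
        layers = [nn.Linear(in_dim, width)] + [ResBlock(width) for _ in range(blocks)]
        layers += [nn.Linear(width, out_dim)]
        self.net = nn.Sequential(*layers)
    def forward(self, x):
        return self.net(x)                     # e.g. maps displacement samples to pressure/WSS/OSI

model = PIResNet()
x = torch.randn(32, 3, requires_grad=True)     # stand-in for 4D-MRI displacement samples
y_true = torch.randn(32, 3)

pred = model(x)
data_loss = nn.functional.mse_loss(pred, y_true)
# Placeholder physics penalty: in a real model this would be the residual of the
# governing flow equations evaluated via automatic differentiation.
physics_residual = torch.autograd.grad(pred.sum(), x, create_graph=True)[0]
physics_loss = physics_residual.pow(2).mean()
loss = data_loss + 0.1 * physics_loss
loss.backward()
print(float(loss))
```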

Keywords: type-a aortic dissection, deep residual networks, blood flow modeling, data-driven modeling, non-invasive diagnostics, deep learning, artificial intelligence.

Procedia PDF Downloads 64
843 A Study of Using Multiple Subproblems in Dantzig-Wolfe Decomposition of Linear Programming

Authors: William Chung

Abstract:

This paper studies the use of multiple subproblems in Dantzig-Wolfe decomposition of linear programming (DW-LP). Traditionally, the decomposed LP consists of one LP master problem and one LP subproblem. The master problem and the subproblem are solved alternately by exchanging the dual prices of the master problem and the proposals of the subproblem until the LP is solved. It is well known that convergence is slow, with a long tail of near-optimal solutions (asymptotic convergence). Hence, the performance of DW-LP depends highly on the number of decomposition steps. If the decomposition steps can be greatly reduced, the performance of DW-LP can be improved significantly. One way to reduce the number of decomposition steps is to increase the number of proposals passed from the subproblem to the master problem. To do so, we propose adding a quadratic approximation function to the LP subproblem in order to develop a set of approximate-LP subproblems (multiple subproblems). Consequently, in each decomposition step, multiple subproblems are solved to provide multiple proposals to the master problem, and the number of decomposition steps can be reduced greatly. Note that each approximate-LP subproblem is a nonlinear program, and solving the LP subproblem is faster than solving the nonlinear multiple subproblems. Hence, using multiple subproblems in DW-LP involves a tradeoff between the number of approximate-LP subproblems being formed and the number of decomposition steps. In this paper, we derive the corresponding algorithms and provide some simple computational results. Some properties of the resulting algorithms are also given.
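For reference, a sketch in generic Dantzig-Wolfe notation of the restricted master problem, the classical LP pricing subproblem, and an approximate subproblem obtained by adding a quadratic term; the symbols are generic and are not taken from the paper.

```latex
% Restricted master problem over proposals (extreme points) x^k with weights \lambda_k:
\min_{\lambda \ge 0} \; \sum_k (c^\top x^k)\,\lambda_k
\quad \text{s.t.} \quad \sum_k (A x^k)\,\lambda_k = b, \qquad \sum_k \lambda_k = 1 .

% Classical LP pricing subproblem for master dual prices (\pi, \mu):
\min_{x \in X} \; (c - A^\top \pi)^\top x - \mu .

% Approximate-LP subproblem used to generate additional proposals: the same objective plus
% a quadratic term centred at a reference point \bar{x}, with weight \rho_j > 0 varied to
% obtain multiple subproblems (and hence multiple proposals) per decomposition step:
\min_{x \in X} \; (c - A^\top \pi)^\top x + \tfrac{\rho_j}{2}\,\lVert x - \bar{x} \rVert^2 - \mu .
```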

Keywords: approximate subproblem, Dantzig-Wolfe decomposition, large-scale models, multiple subproblems

Procedia PDF Downloads 138
842 Land Cover Remote Sensing Classification Using Advanced Neural Networks and Supervised Learning

Authors: Eiman Kattan

Abstract:

This study aims to evaluate the impact of classifying labelled remote sensing images with a convolutional neural network (CNN) architecture, namely AlexNet, on different land cover scenarios based on two remotely sensed datasets, considering both computational time and performance. A set of experiments was conducted to assess the effectiveness of the selected convolutional neural network using two implementation approaches, fully trained and fine-tuned. For validation purposes, two publicly available remote sensing datasets with different land cover features, AID and RSSCN7, were used in the experiments. These datasets have a wide diversity of input data, number of classes, amount of labelled data, and texture patterns. A specifically designed interactive deep learning GPU training platform for image classification (NVIDIA DIGITS) was employed in the experiments and proved efficient in training, validation, and testing. The fully trained approach achieved modest results for the two datasets, AID and RSSCN7, of 73.346% and 71.857% within 24 min 1 s and 8 min 3 s, respectively. However, a dramatic improvement in classification performance was recorded with the fine-tuning approach, reaching 92.5% and 91% within 24 min 44 s and 8 min 41 s, respectively. These findings open the opportunity for better classification performance in various applications such as agriculture and crop remote sensing.
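A minimal sketch of the fine-tuning approach the abstract contrasts with full training: load an ImageNet-pretrained AlexNet, freeze the convolutional features, and retrain only the classifier for the scene classes. The class count (30, as in AID; RSSCN7 would use 7), hyper-parameters and the dummy batch are assumptions, and the data pipeline is omitted.

```python
# Fine-tuning sketch: reuse ImageNet-pretrained AlexNet features, retrain the classifier only.
import torch
import torch.nn as nn
from torchvision import models

model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

for p in model.features.parameters():               # freeze the convolutional feature extractor
    p.requires_grad = False

n_classes = 30                                      # AID scene classes (RSSCN7 would use 7)
model.classifier[6] = nn.Linear(4096, n_classes)    # replace the final fully connected layer

optimizer = torch.optim.SGD(model.classifier.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

images = torch.randn(4, 3, 224, 224)                # stand-in for a labelled remote-sensing batch
labels = torch.randint(0, n_classes, (4,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(float(loss))
```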

Keywords: convolutional neural network, remote sensing, land cover, land use

Procedia PDF Downloads 344
841 A Flexible Bayesian State-Space Modelling for Population Dynamics of Wildlife and Livestock Populations

Authors: Sabyasachi Mukhopadhyay, Joseph Ogutu, Hans-Peter Piepho

Abstract:

We aim to model the dynamics of wildlife and pastoral livestock populations in order to understand their population change and hence support wildlife conservation and human welfare. The study is motivated by age- and sex-structured population counts in different regions of the Serengeti-Mara during the period 1989-2003. Developing reliable and realistic models for the population dynamics of large herbivore populations can be a very complex and challenging exercise. However, the Bayesian statistical domain offers flexible computational methods that enable the development and efficient implementation of complex population dynamics models. In this work, we have used a novel Bayesian state-space model to analyse the dynamics of topi and hartebeest populations in the Serengeti-Mara Ecosystem of East Africa. The state-space model involves survival probabilities of the animals, which in turn depend on various factors such as monthly rainfall and habitat size, that cause recent declines in the numbers of these herbivore populations and potentially threaten their future viability in the ecosystem. Our study shows that seasonal rainfall is the most important factor shaping the population size of the animals and indicates the age class that is most severely affected by any change in weather conditions.
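A generic Bayesian state-space skeleton of the kind described, with a latent abundance process driven by rainfall-dependent survival and an observation layer for the counts; the exact likelihoods and covariates used in the paper may differ.

```latex
% Process (state) equation: latent abundance N_{a,t} of age class a evolves through
% survival s_{a,t}; survival is linked to monthly rainfall R_t.
N_{a,t+1} \sim \mathrm{Binomial}\!\left(N_{a,t},\, s_{a,t}\right), \qquad
\operatorname{logit}(s_{a,t}) = \beta_{0,a} + \beta_{1,a} R_t .

% Observation equation: the counts y_{a,t} are noisy measurements of the latent state.
y_{a,t} \sim \mathrm{Poisson}\!\left(N_{a,t}\right).
```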

Keywords: bayesian state-space model, Markov Chain Monte Carlo, population dynamics, conservation

Procedia PDF Downloads 182
840 Accelerated Structural Reliability Analysis under Earthquake-Induced Tsunamis by Advanced Stochastic Simulation

Authors: Sai Hung Cheung, Zhe Shao

Abstract:

Recent earthquake-induced tsunamis in Padang (2004) and Tohoku (2011) brought huge losses of lives and property. Maintaining vertical evacuation systems is the most crucial strategy for effectively reducing casualties during a tsunami event. Thus, it is of great interest to quantify the risk to structural dynamic systems due to earthquake-induced tsunamis. Despite continuous advancement in computational simulation of tsunamis and wave-structure interaction modeling, it remains computationally challenging to evaluate the reliability (or its complement, the failure probability) of a structural dynamic system when uncertainties related to the system and its modeling are taken into account. Failure of the structure in a tsunami-wave-structure system is defined as any response quantity of the system exceeding specified thresholds during the time when the structure is subjected to dynamic wave impact due to earthquake-induced tsunamis. In this paper, an approach based on a novel integration of the Subset Simulation algorithm and a recently proposed moving least squares response surface approach for stochastic sampling is proposed. The effectiveness of the proposed approach is discussed by comparing its results with those obtained from the Subset Simulation algorithm without the response surface approach.
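For reference, the product decomposition at the heart of Subset Simulation, which is why the method accelerates rare-event estimation; the notation is generic.

```latex
% Failure event F expressed through nested intermediate events F_1 \supset F_2 \supset \dots \supset F_m = F:
P(F) \;=\; P(F_1)\,\prod_{i=1}^{m-1} P\!\left(F_{i+1}\,\middle|\,F_i\right),
% so a small failure probability is estimated as a product of larger conditional probabilities
% (typically each chosen near 0.1), each of which requires far fewer samples than direct
% Monte Carlo simulation of F itself.
```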

Keywords: response surface model, subset simulation, structural reliability, Tsunami risk

Procedia PDF Downloads 352
839 Towards Developing Social Assessment Tool for Siwan Ecolodge Case Study: Babenshal Ecolodge

Authors: Amr Ali Bayoumi, Ola Ali Bayoumi

Abstract:

The aim of this research is to enhance one of the main aspects, the social aspect, in developing an eco-lodge in Siwa Oasis in the Egyptian Western Desert. According to the weightings built in this research through formal and informal questionnaires, the social aspect carries the highest priority among the environmental and economic categories. The researcher therefore suggests the use of an ethnographic design approach and Space Syntax as observational and computational methods for developing future eco-lodges in Siwa Oasis. These methods are used to study the social spaces of the Babenshal eco-lodge as a case study. This hybrid method is considered a first step in building a Social Assessment Tool (SAT) for ecological tourism buildings located in Siwa, as a case of the Egyptian Western Desert community. Towards livable social spaces, the proposed SAT is planned to provide measurable weightings for the social-aspect priorities of future Siwan eco-lodges. Finally, recommendations are proposed for enhancing the SAT so that it correlates better with the sensitive desert biome of Siwa Oasis and adapts to the continuous social and environmental changes of the oasis.

Keywords: ecolodge, social aspect, space syntax, Siwa Oasis

Procedia PDF Downloads 107
838 Analysis of One-Way and Two-Way FSI Approaches to Characterise the Flow Regime and the Mechanical Behaviour during Closing Manoeuvring Operation of a Butterfly Valve

Authors: M. Ezkurra, J. A. Esnaola, M. Martinez-Agirre, U. Etxeberria, U. Lertxundi, L. Colomo, M. Begiristain, I. Zurutuza

Abstract:

Butterfly valves are widely used industrial piping components serving as on-off and flow-controlling devices. The main challenge in the design process of this type of valve is correct dimensioning to ensure proper mechanical performance as well as to minimise the flow losses that affect the efficiency of the system. Butterfly valves are typically dimensioned in the closed position based on mechanical approaches considering uniform hydrostatic pressure, whereas flow losses are analysed by means of CFD simulations. The main limitation of these approaches is that they consider neither the influence of the dynamics of the manoeuvring stage nor coupled phenomena. Recent works have included the influence of the flow on the mechanical behaviour for different opening angles by means of a one-way FSI approach. However, these works consider steady-state flow for the selected angles and do not capture the effect of the transient flow evolution during the manoeuvring stage. A two-way FSI modelling approach could overcome such limitations and provide more accurate results. Nevertheless, the use of this technique is limited by the increase in computational cost. In the present work, the applicability of one-way and two-way FSI approaches is evaluated for the analysis of butterfly valves, showing that neglecting fluid-structure coupling fails to capture the most critical situation for the valve disc.

Keywords: butterfly valves, fluid-structure interaction, one-way approach, two-way approach

Procedia PDF Downloads 145
837 Using Serious Games to Integrate the Potential of Mass Customization into the Fuzzy Front-End of New Product Development

Authors: Michael N. O'Sullivan, Con Sheahan

Abstract:

Mass customization is the idea of offering custom products or services to satisfy the needs of each individual customer while maintaining the efficiency of mass production. Technologies like 3D printing and artificial intelligence have many start-ups hoping to capitalize on this dream of creating personalized products at an affordable price, and well established companies scrambling to innovate and maintain their market share. However, the majority of them are failing as they struggle to understand one key question – where does customization make sense? Customization and personalization only make sense where the value of the perceived benefit outweighs the cost to implement it. In other words, will people pay for it? Looking at the Kano Model makes it clear that it depends on the product. In products where customization is an inherent need, like prosthetics, mass customization technologies can be highly beneficial. However, for products that already sell as a standard, like headphones, offering customization is likely only an added bonus, and so the product development team must figure out if the customers’ perception of the added value of this feature will outweigh its premium price tag. This can be done through the use of a ‘serious game,’ whereby potential customers are given a limited budget to collaboratively buy and bid on potential features of the product before it is developed. If the group choose to buy customization over other features, then the product development team should implement it into their design. If not, the team should prioritize the features on which the customers have spent their budget. The level of customization purchased can also be translated to an appropriate production method, for example, the most expensive type of customization would likely be free-form design and could be achieved through digital fabrication, while a lower level could be achieved through short batch production. Twenty-five teams of final year students from design, engineering, construction and technology tested this methodology when bringing a product from concept through to production specification, and found that it allowed them to confidently decide what level of customization, if any, would be worth offering for their product, and what would be the best method of producing it. They also found that the discussion and negotiations between players during the game led to invaluable insights, and often decided to play a second game where they offered customers the option to buy the various customization ideas that had been discussed during the first game.

Keywords: Kano model, mass customization, new product development, serious game

Procedia PDF Downloads 114
836 Determining Optimal Number of Trees in Random Forests

Authors: Songul Cinaroglu

Abstract:

Background: Random Forest is an efficient, multi-class machine learning method used for classification, regression and other tasks. The method operates by constructing each tree from a different bootstrap sample of the data. Determining the number of trees in random forests is an open question in the literature on improving the classification performance of random forests. Aim: The aim of this study is to analyse whether there is an optimal number of trees in Random Forests and how the performance of Random Forests differs as the number of trees increases, using sample health data sets in the R programming environment. Method: We analysed the performance of Random Forests as the number of trees grows, doubling the number of trees at every iteration, using the "randomForest" package in R. To determine the minimum and optimal number of trees, we performed McNemar's test and used the Area Under the ROC Curve, respectively. Results: The analysis found that as the number of trees grows, the forest does not always perform better than forests with fewer trees. In other words, a larger number of trees only increases computational cost and does not necessarily improve performance. Conclusion: Although the general practice in using random forests is to generate a large number of trees to obtain high performance, this study shows that increasing the number of trees does not always improve performance. Future studies can compare different kinds of data sets and different performance measures to test whether Random Forest performance changes as the number of trees increases.
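A small sketch, on synthetic data, of the tree-doubling experiment the abstract describes (the study itself used R and health data sets): AUC tracks test performance while McNemar's test compares the predictions of two forest sizes. The dataset, forest sizes and thresholds below are illustrative assumptions.

```python
# Sketch of the tree-doubling experiment on synthetic data (the original study used R).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from statsmodels.stats.contingency_tables import mcnemar

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

preds = {}
for n_trees in [25, 50, 100, 200, 400, 800]:          # doubling the forest size each step
    rf = RandomForestClassifier(n_estimators=n_trees, random_state=0).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
    preds[n_trees] = rf.predict(X_te)
    print(f"{n_trees:4d} trees  AUC = {auc:.4f}")

# McNemar's test: do forests with 100 and 800 trees classify the test set differently?
a = preds[100] == y_te
b = preds[800] == y_te
table = [[np.sum(a & b), np.sum(a & ~b)],
         [np.sum(~a & b), np.sum(~a & ~b)]]
print(mcnemar(table, exact=True))
```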

Keywords: classification methods, decision trees, number of trees, random forest

Procedia PDF Downloads 377
835 The Effect of Artificial Intelligence on Petroleum Industry and Production

Authors: Mina Shokry Hanna Saleh Tadros

Abstract:

The centrality of the petroleum industry in world energy is undoubted; the world economy almost runs on, and depends on, petroleum. The petroleum industry is a multi-trillion-dollar industry; it turns otherwise poor and underdeveloped countries into wealthy nations and thrusts them to the center of international diplomacy. Although these developing nations lack the necessary technology to explore and exploit petroleum resources, they are not without help, as developed nations, represented by their multinational corporations, are ready and willing to provide both the technical and managerial expertise necessary for the development of this natural resource. However, the exploitation of these petroleum resources sometimes comes with grave concomitant consequences, especially for the environment. From the British Petroleum oil rig explosion and the resulting oil spillage and pollution in the Gulf of Mexico, United States, to the Mobil oil spillage along the Egyptian coast, the story and its consequences are virtually the same. Egypt's Delta Region produces Nigeria's petroleum, which accounts for more than ninety-five percent of Nigeria's foreign exchange earnings. Between 1999 and 2007, Egypt earned more than $400 billion from petroleum exports. Nevertheless, petroleum exploration and exploitation has devastated the Delta environment: from oil spillage, which pollutes the rivers, farms and wetlands, to gas flaring by the multinational corporations, the consequence is the same, a region devastated by petroleum exploitation. This paper thus seeks to examine the consequences and impact of petroleum pollution in the Egypt Delta, with particular reference to the right of the people of the Niger Delta to a healthy environment. The paper further examines the relevant international and regional instruments and Nigeria's municipal laws that are meant to protect the rights of the people of the Egypt Delta, and their enforcement by the Nigerian state. It is quite worrisome that the Egypt Delta Region and its people have suffered, and are still suffering, grave violations of their right to a healthy environment as a result of petroleum exploitation in their region, and the effort to protect the people's rights is at best half-hearted.

Keywords: crude oil, fire, floating roof tank, lightning protection system, environment, exploration, petroleum, pollution, Duvernay petroleum system, oil generation, oil-source correlation, Re-Os

Procedia PDF Downloads 49
834 Airflow Characteristics and Thermal Comfort of Air Diffusers: A Case Study

Authors: Tolga Arda Eraslan

Abstract:

The quality of the indoor environment is significant for occupants' health, comfort, and productivity, and as Covid-19 spread throughout the world, people started spending most of their time indoors. Since buildings are getting bigger, mechanical ventilation systems are widely used where natural ventilation is insufficient. Four primary tasks of a ventilation system have been identified: indoor air quality, comfort, contamination control, and energy performance. To fulfill such requirements, air diffusers, which are part of the ventilation system, have entered our lives in different airflow distribution systems. Detailed observations are needed to ensure that such devices provide high levels of comfort effectiveness and energy efficiency, and this study addresses these needs. The objective of this article is to observe the air characteristics of different air diffusers at different angles and their effect on people through a thermal comfort model in CFD simulation, and to validate the outputs against measured data from a simulated office room. The office room created for validation is equipped with many thermal sensors, including at head height, tabletop, and foot level. In addition, CFD simulations were carried out using the measured temperature and velocity of the air leaving the supply diffuser. The results, considering the flow interaction between the diffusers and their surroundings, provided a good visual illustration.

Keywords: computational fluid dynamics, fanger’s model, predicted mean vote, thermal comfort

Procedia PDF Downloads 92
833 Investigating the Role of Dystrophin in Neuronal Homeostasis

Authors: Samantha Shallop, Hakinya Karra, Tytus Bernas, Gladys Shaw, Gretchen Neigh, Jeffrey Dupree, Mathula Thangarajh

Abstract:

Abnormal neuronal homeostasis is considered a structural correlate of cognitive deficits in Duchenne Muscular Dystrophy (DMD). Neurons are highly polarized cells with multiple dendrites but a single axon. Trafficking of cellular organelles is highly regulated, with cargo in the somatodendritic region of the neuron not permitted to enter the axonal compartment. We investigated the molecular mechanisms that regulate organelle trafficking in neurons using a multimodal approach, including high-resolution structured illumination microscopy, proteomics, immunohistochemistry, and computational modeling. We examined the expression of ankyrin-G, the master regulator controlling neuronal polarity. The expression of ankyrin-G and the morphology of the axon initial segment were profoundly abnormal in CA1 hippocampal neurons in the mdx52 animal model of DMD. Ankyrin-G colocalized with kinesin KIF5A, the anterograde protein transporter, at higher levels in older mdx52 mice than in younger mdx52 mice. These results suggest that functional trafficking from the somatodendritic compartment is abnormal. Our data suggest that dystrophin deficiency compromises neuronal homeostasis via ankyrin-G-based mechanisms.

Keywords: neurons, axonal transport, duchenne muscular dystrophy, organelle transport

Procedia PDF Downloads 71
832 Enhancing Aerodynamic Performance of Savonius Vertical Axis Turbine Used with Triboelectric Generator

Authors: Bhavesh Dadhich, Fenil Bamnoliya, Akshita Swaminathan

Abstract:

This project aims to design a system that generates energy from the wind flowing past a moving vehicle on the road, or from wind flow in compact areas, so that otherwise wasted energy is converted into useful energy. This is envisaged through the design and aerodynamic performance improvement of a Savonius vertical axis wind turbine rotor, used in an integrated system with a triboelectric nanogenerator (TENG) that can generate a useful amount of electrical energy. Aerodynamic calculations are performed numerically using computational fluid dynamics software, and the TENG's performance is evaluated analytically. The turbine's coefficient of power is validated against published results for an inlet velocity of 7 m/s at a tip speed ratio of 0.75 and is found to agree reasonably well with the experimental results. The baseline design is modified with a new blade arc angle and rotor position angle based on the parameter ranges recommended by previous researchers. Simulations have been performed for T.S.R. values ranging from 0.25 to 1.5 in intervals of 0.25, at two applicable free-stream velocities of 5 m/s and 7 m/s. Finally, the newly designed VAWT CFD performance results are used as input for the analytical performance prediction of the triboelectric nanogenerator. The results show that this approach could be feasible and useful for small power source applications.
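Back-of-the-envelope relations used when reporting Savonius performance: the tip speed ratio and the power extracted for a given coefficient of power. The rotor dimensions and Cp value below are invented for illustration; only the 7 m/s velocity and 0.75 tip speed ratio come from the abstract.

```python
# Tip speed ratio and power estimate for a small Savonius rotor (dimensions are invented).
rho = 1.225          # air density, kg/m^3
V = 7.0              # free-stream velocity, m/s (one of the simulated cases)
TSR = 0.75           # tip speed ratio lambda = omega * R / V
R = 0.25             # rotor radius, m (assumed)
H = 0.50             # rotor height, m (assumed)
Cp = 0.20            # assumed coefficient of power at this TSR

omega = TSR * V / R                      # rotor angular speed, rad/s
A = 2 * R * H                            # swept area of the Savonius rotor, m^2
P = 0.5 * rho * A * V**3 * Cp            # mechanical power extracted, W
print(f"omega = {omega:.1f} rad/s, swept area = {A:.2f} m^2, power = {P:.1f} W")
```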

Keywords: savonius turbine, power, overlap ratio, tip speed ratio, TENG

Procedia PDF Downloads 98
831 Price Prediction Line, Investment Signals and Limit Conditions Applied for the German Financial Market

Authors: Cristian Păuna

Abstract:

In the first decades of the 21st century, in the electronic trading environment, algorithmic capital investments became the primary tool for making a profit by speculation in financial markets. A significant number of traders, private and institutional investors, participate in the capital markets every day using automated algorithms. Autonomous trading software is today a considerable part of the business intelligence system of any modern financial activity; trading decisions and orders are made automatically by computers using different mathematical models. This paper presents one of these models, called the Price Prediction Line. A mathematical algorithm is revealed to build a reliable trend line, which is the basis for limit conditions and automated investment signals, the core of a computerized investment system. The paper shows how to apply these tools to generate entry and exit investment signals, how to use limit conditions to build a mathematical filter for investment opportunities, and how to integrate all of these into automated investment software. The paper also presents trading results obtained for the leading German financial market index with the presented methods, in order to analyze and compare different automated investment algorithms. It was found that a specific mathematical algorithm can be optimized and integrated into an automated trading system with good and sustained results for the leading German market. Investment results are compared in order to qualify the presented model. In conclusion, a 1:6.12 risk-to-reward ratio was obtained by applying the trigonometric method to the DAX Deutscher Aktienindex over a 24-month investment period. These results are superior to those obtained with other similar models, as this paper reveals. The general idea sustained by this paper is that the Price Prediction Line model presented here is a reliable capital investment methodology that can be successfully applied to build an automated investment system with excellent results.
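A simplified sketch of the general idea: fit a trend line to recent closes, then generate entry signals only when the price deviates from that line beyond a limit condition. The regression form, thresholds and synthetic data are assumptions for illustration and not the paper's actual algorithm.

```python
# Simplified trend-line / limit-condition signal sketch (not the paper's proprietary model).
import numpy as np

rng = np.random.default_rng(1)
closes = 13000 + np.cumsum(rng.normal(0, 30, 200))     # synthetic DAX-like price series

window = 50
t = np.arange(window)
slope, intercept = np.polyfit(t, closes[-window:], 1)  # least-squares trend line
prediction_line = slope * t + intercept

deviation = closes[-window:] - prediction_line
limit = 2.0 * deviation.std()                          # limit condition filters weak signals

signal = "none"
if deviation[-1] < -limit and slope > 0:
    signal = "buy"                                     # price dipped below an up-trend
elif deviation[-1] > limit and slope < 0:
    signal = "sell"                                    # price spiked above a down-trend
print(f"slope = {slope:.2f} pts/bar, deviation = {deviation[-1]:.1f}, signal = {signal}")
```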

Keywords: algorithmic trading, automated trading systems, high-frequency trading, DAX Deutscher Aktienindex

Procedia PDF Downloads 107
830 Numerical Investigation of AL₂O₃ Nanoparticle Effect on a Boiling Forced Swirl Flow Field

Authors: Ataollah Rabiee, Amir Hossein Kamalinia, Alireza Atf

Abstract:

One of the most important issues in the design of nuclear fusion power plants is the heat removal from the hottest region at the diverter. Various methods could be employed in order to improve the heat transfer efficiency, such as generating turbulent flow and injection of nanoparticles in the host fluid. In the current study, Water/AL₂O₃ nanofluid forced swirl flow boiling has been investigated by using a homogeneous thermophysical model within the Eulerian-Eulerian framework through a twisted tape tube, and the boiling phenomenon was modeled using the Rensselaer Polytechnic Institute (RPI) approach. In addition to comparing the results with the experimental data and their reasonable agreement, it was evidenced that higher flow mixing results in more uniform bulk temperature and lower wall temperature along the twisted tape tube. The presence of AL₂O₃ nanoparticles in the boiling flow field showed that increasing the nanoparticle concentration leads to a reduced vapor volume fraction and wall temperature. The Computational fluid dynamics (CFD) results show that the average heat transfer coefficient in the tube increases both by increasing the nanoparticle concentration and the insertion of twisted tape, which significantly affects the thermal field of the boiling flow.

Keywords: nanoparticle, boiling, CFD, two phase flow, alumina, ITER

Procedia PDF Downloads 103
829 The Impact of Artificial Intelligence on Legislation and Laws

Authors: Keroles Akram Saed Ghatas

Abstract:

The near future will bring significant changes to modern organizations and management due to the growing role of intangible assets and knowledge workers. The area of copyright, intellectual property, digital (intangible) assets, and media redistribution appears to be one of the greatest challenges facing business and society in general, and management sciences and organizations in particular. This article examines views and perceptions of fairness in digital media sharing among Harvard Law School LL.M. students, based on 50 qualitative interviews and 100 surveys. The researcher took an ethnographic approach, entering the Harvard LL.M. community in 2016 through a Facebook group that allows people to connect naturally and attend in-person and private events more easily. After listening to numerous students, the researcher conducted a quantitative survey among 100 respondents to assess their perceptions of fairness in digital file sharing in various contexts (based on media price, availability, regional licenses, copyright-holder status, etc.). Based on the survey results, the researcher conducted long, open-ended and loosely structured ethnographic interviews (50 interviews) to further deepen the understanding of the results. The most important finding of the study is that Harvard lawyers generally accept digital piracy in certain contexts, despite having the best possible legal and professional knowledge. Interestingly, they are also more accepting of working for the government than for the private sector. The results of this study provide a better understanding of how "fairness" is perceived by the younger generation of lawyers and pave the way for a more rational application of licensing laws.

Keywords: cognitive impairments, communication disorders, cognitive disorders, death penalty, capital murder, executive function, Egyptian law, absence, justice, political cases, piracy, digital sharing, perception of fairness, legal profession

Procedia PDF Downloads 39
828 De Novo Design of a Minimal Catalytic Di-Nickel Peptide Capable of Sustained Hydrogen Evolution

Authors: Saroj Poudel, Joshua Mancini, Douglas Pike, Jennifer Timm, Alexei Tyryshkin, Vikas Nanda, Paul Falkowski

Abstract:

On the early Earth, protein-metal complexes likely harvested energy from a reduced environment. These complexes would have been precursors to the metabolic enzymes of ancient organisms. Hydrogenase, essential in most anaerobic organisms for the reduction and oxidation of hydrogen in the environment, is likely one of the earliest evolved enzymes. To attempt to reinvent a precursor to modern hydrogenase, we computationally designed a short thirteen-amino-acid peptide that binds nickel, the catalytic transition metal often required in hydrogenase. This simple complex can achieve hundreds of hydrogen evolution cycles using light energy over a broad range of temperatures and pH. Biophysical and structural investigations strongly indicate that the peptide forms a di-nickel active site analogous to that of acetyl-CoA synthase, an ancient protein central to carbon reduction in the Wood-Ljungdahl pathway and capable of hydrogen evolution. This work demonstrates that, prior to the complex evolution of multidomain enzymes, early peptide-metal complexes could have catalyzed energy transfer from the environment on the early Earth and enabled the evolution of modern metabolism.

Keywords: hydrogenase, prebiotic enzyme, metalloenzyme, computational design

Procedia PDF Downloads 197
827 A Fast Parallel and Distributed Type-2 Fuzzy Algorithm Based on Cooperative Mobile Agents Model for High Performance Image Processing

Authors: Fatéma Zahra Benchara, Mohamed Youssfi, Omar Bouattane, Hassan Ouajji, Mohamed Ouadi Bensalah

Abstract:

The aim of this paper is to present a distributed implementation of the type-2 fuzzy algorithm in a parallel and distributed computing environment based on mobile agents. The proposed algorithm is designed to be implemented on an SPMD (Single Program Multiple Data) architecture based on cooperative mobile agents, following the AVPE (Agent Virtual Processing Element) model, in order to improve the processing resources needed for performing big data image segmentation. In this work, we focus on applying the algorithm to process a big data MRI (Magnetic Resonance Imaging) image of size (n x m). The image is encapsulated in the mobile agent team leader and split into (m x n) pixels, one per AVPE. Each AVPE performs and exchanges the segmentation results and maintains asynchronous communication with its team leader until the algorithm converges. Some interesting experimental results are obtained in terms of accuracy and efficiency of the proposed implementation, thanks to the several interesting skills of mobile agents introduced in this distributed computational model.

Keywords: distributed type-2 fuzzy algorithm, image processing, mobile agents, parallel and distributed computing

Procedia PDF Downloads 401
826 Continuous Plug Flow and Discrete Particle Phase Coupling Using Triangular Parcels

Authors: Anders Schou Simonsen, Thomas Condra, Kim Sørensen

Abstract:

Various processes are modelled using a discrete phase, where particles are seeded from a source. Such particles can represent liquid water droplets, which affect the continuous phase by exchanging thermal energy, momentum, species, etc. Discrete phases are typically modelled using parcels, each of which represents a collection of particles sharing properties such as temperature and velocity. When coupling the phases, the exchange rates are integrated over the cell in which the parcel is located, which can cause spikes and fluctuating exchange rates. This paper presents an alternative method of coupling a discrete phase and a continuous plug-flow phase. This is done using triangular parcels, which span between nodes following the dynamics of single droplets; the triangular parcels are thus propagated using their corner nodes. At each time step, the exchange rates are spatially integrated over the surface of the triangular parcels, which yields a smooth, continuous exchange rate to the continuous phase. The results show that the method is more stable, converges slightly faster, and yields smoother exchange rates compared with the steam tube approach. However, the computational requirements are about five times greater, so the applicability of the alternative method should be limited to processes where the exchange rates are important. The overall balances of the exchanged properties did not change significantly using the new approach.

Keywords: CFD, coupling, discrete phase, parcel

Procedia PDF Downloads 242
825 Talking Back to Hollywood: Museum Representation in Popular Culture as a Gateway to Understanding Public Perception

Authors: Jessica BrodeFrank, Beka Bryer, Lacey Wilson, Sierra Van Ryck deGroot

Abstract:

Museums are enjoying quite the moment in pop culture. From discussions of labor in Bob’s Burger to introducing cultural repatriation in The Black Panther, discussions of various museum issues are making their way to popular media. “Talking Back to Hollywood” analyzes the impact museums have on movies and television. The paper will highlight a series of cultural cameos and discuss what each reveals about critical themes in museums: repatriation, labor, obfuscated histories, institutional legacies, artificial intelligence, and holograms. Using a mixed methods approach to include surveys, descriptive research, thematic analysis, and context analysis, the authors of this paper will explore how we, as the museum staff, might begin to cite museums and movies together as texts. Drawing from their experience working in museums and public history, this contingent of mid-career professionals will highlight the impact museums have had on movies and television and the didactic lessons these portrayals can provide back to cultural heritage professionals. From tackling critical themes in museums such as repatriation, labor conditions/inequities, obfuscated histories, curatorial choice and control, institutional legacies, and more, this paper is grounded in the cultural zeitgeist of the 2000s and the message these media portrayals send to the public and the cultural heritage sector. In particular, the paper will examine how portrayals of AI, holograms, and more technology can be used as entry points for necessary discussions with the public on mistrust, misinformation, and emerging technologies. This paper will not only expose the legacy and cultural understanding of the museum field within popular culture but also will discuss actionable ways that public historians can use these portrayals as an entry point for discussions with the public, citing literature reviews and quantitative and qualitative analysis of survey results. As Hollywood is talking about museums, museums can use that to better connect to the audiences who feel comfortable at the cinema but are excluded from the museum.

Keywords: museums, public memory, representation, popular culture

Procedia PDF Downloads 59
824 Detection of Important Biological Elements in Drug-Drug Interaction Occurrence

Authors: Reza Ferdousi, Reza Safdari, Yadollah Omidi

Abstract:

Drug-drug interactions (DDIs) are a main cause of adverse drug reactions, and the functional and molecular complexity of drug behavior in the human body makes them hard to prevent and treat. With the aid of new technologies derived from mathematical and computational science, the DDI problem can be addressed with minimal cost and effort. Market basket analysis is known as a powerful method for identifying co-occurrences of items and discovering patterns and frequencies of elements. In this research, we used market basket analysis to identify important bio-elements in DDI occurrence. For this, we collected all known DDIs from DrugBank. The obtained data were analyzed with the market basket analysis method. We investigated all drug-enzyme, drug-carrier, drug-transporter and drug-target associations. To determine the importance of the extracted bio-elements, the extracted rules were evaluated in terms of confidence and support. Market basket analysis of the over 45,000 known DDIs revealed more than 300 important rules that can be used to identify DDIs; the CYP450 family was the most frequently shared bio-element. We applied the extracted rules to over 2,000,000 unknown drug pairs, which led to the discovery of more than 200,000 potential DDIs. Analysis of the underlying reasons behind the DDI phenomenon can help predict and prevent DDI occurrence, and ranking the extracted rules by their strength can be a supportive tool for predicting the outcome of an unknown DDI.
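A toy illustration of the support/confidence computation behind market basket analysis, applied to invented drug-pair "baskets" of shared bio-elements; the baskets, element names and thresholds are made up, and the study's actual rules came from the full DrugBank DDI set.

```python
# Toy market-basket support/confidence over invented drug-pair -> bio-element baskets.
from itertools import combinations
from collections import Counter

baskets = [                       # each basket: bio-elements shared by one interacting drug pair
    {"CYP3A4", "P-gp"},
    {"CYP3A4", "CYP2D6"},
    {"CYP3A4", "P-gp", "OATP1B1"},
    {"CYP2D6", "P-gp"},
    {"CYP3A4", "P-gp"},
]

item_counts = Counter(x for b in baskets for x in b)
pair_counts = Counter(frozenset(p) for b in baskets for p in combinations(sorted(b), 2))

n = len(baskets)
min_support, min_confidence = 0.4, 0.6
for pair, c in pair_counts.items():
    support = c / n                                  # fraction of baskets containing both elements
    for antecedent in pair:
        consequent = next(iter(pair - {antecedent}))
        confidence = c / item_counts[antecedent]     # P(consequent | antecedent)
        if support >= min_support and confidence >= min_confidence:
            print(f"{antecedent} -> {consequent}: support={support:.2f}, confidence={confidence:.2f}")
```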

Keywords: drug-drug interaction, market basket analysis, rule discovery, important bio-elements

Procedia PDF Downloads 291
823 Computational Agent-Based Approach for Addressing the Consequences of Releasing Gene Drive Mosquito to Control Malaria

Authors: Imran Hashmi, Sipkaduwa Arachchige Sashika Sureni Wickramasooriya

Abstract:

Gene-drive technology has emerged as a promising tool for disease control by influencing the population dynamics of disease-carrying organisms. Various gene drive mechanisms, derived from global laboratory experiments, aim to strategically manage and prevent the spread of targeted diseases. One prominent strategy involves population replacement, wherein genetically modified mosquitoes are introduced to replace the existing local wild population. To enhance our understanding and aid in the design of effective release strategies, we employ a comprehensive mathematical model. The utilized approach employs agent-based modeling, enabling the consideration of individual mosquito attributes and flexibility in parameter manipulation. Through the integration of an agent-based model and a meta-population spatial approach, the dynamics of gene drive mosquito spreading in a released site are simulated. The model's outcomes offer valuable insights into future population dynamics, providing guidance for the development of informed release strategies. This research significantly contributes to the ongoing discourse on the responsible and effective implementation of gene drive technology for disease vector control.

Keywords: gene drive, agent-based modeling, disease-carrying organisms, malaria

Procedia PDF Downloads 44
822 Creating Energy Sustainability in an Enterprise

Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala

Abstract:

As we enter the new era of Artificial Intelligence (AI) and Cloud Computing, we mostly rely on the Machine and Natural Language Processing capabilities of AI, and Energy Efficient Hardware and Software Devices in almost every industry sector. In these industry sectors, much emphasis is on developing new and innovative methods for producing and conserving energy and sustaining the depletion of natural resources. The core pillars of sustainability are economic, environmental, and social, which is also informally referred to as the 3 P's (People, Planet and Profits). The 3 P's play a vital role in creating a core Sustainability Model in the Enterprise. Natural resources are continually being depleted, so there is more focus and growing demand for renewable energy. With this growing demand, there is also a growing concern in many industries on how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In our paper, we would like to discuss the driving forces such as Climate changes, Natural Disasters, Pandemic, Disruptive Technologies, Corporate Policies, Scaled Business Models and Emerging social media and AI platforms that influence the 3 main pillars of Sustainability (3P’s). Through this paper, we would like to bring an overall perspective on enterprise strategies and the primary focus on bringing cultural shifts in adapting energy-efficient operational models. Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, reducing server footprint and compute resources (Shared IT services, Cloud computing, and Application Modernization) with the vision for a sustainable environment.

Keywords: climate change, pandemic, disruptive technology, government policies, business model, machine learning and natural language processing, AI, social media platform, cloud computing, advanced monitoring, metering infrastructure

Procedia PDF Downloads 85
821 Effective Stiffness, Permeability, and Reduced Wall Shear Stress of Highly Porous Tissue Engineering Scaffolds

Authors: Hassan Mohammadi Khujin

Abstract:

Tissue engineering is the science of creating tissues and complex organs using scaffolds, cells and biologically active components. Most cells require scaffolds to grow and proliferate; these temporary support structures for tissue regeneration are later replaced by extracellular matrix produced inside the body. Recent advances in additive manufacturing methods allow the production of highly porous, complex three-dimensional scaffolds suitable for cell growth and proliferation. The current paper investigates the mechanical properties, including elastic modulus and compressive strength, as well as the fluid flow dynamics, including permeability and flow-induced shear stress, of scaffolds with four triply periodic minimal surface (TPMS) configurations, namely the Schwarz primitive, the Schwarz diamond, the gyroid, and the Neovius structures. Higher porosity resulted in lower mechanical properties in all scaffold types. The permeability of the scaffolds was determined using Darcy's law, with reference to geometrical parameters and the pressure drop derived from the computational fluid dynamics (CFD) analysis. Higher porosity enhanced permeability and reduced wall shear stress in all scaffold designs.
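The Darcy's-law rearrangement used to back out scaffold permeability from a CFD pressure drop; all numbers below are illustrative placeholders, not values from the paper.

```python
# Darcy's law rearranged for permeability; the numbers are illustrative placeholders.
mu = 1.0e-3        # dynamic viscosity of the perfusing fluid, Pa.s (water-like)
Q = 5.0e-8         # volumetric flow rate through the scaffold, m^3/s
L = 4.0e-3         # scaffold length in the flow direction, m
A = 1.0e-4         # scaffold cross-sectional area, m^2
dP = 50.0          # pressure drop taken from the CFD solution, Pa

k = mu * Q * L / (A * dP)      # permeability, m^2 (from Q = k * A * dP / (mu * L))
print(f"permeability k = {k:.3e} m^2")
```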

Keywords: highly porous scaffolds, tissue engineering, finite elements analysis, CFD analysis

Procedia PDF Downloads 54
820 The Utilization of FSI Technique and Two-Way Particle Coupling System on Particle Dynamics in the Human Alveoli

Authors: Hassan Athari, Abdurrahim Bolukbasi, Dogan Ciloglu

Abstract:

This study represents the respiratory alveolar system and determines the trajectories of inhaled particles more accurately using a modified three-dimensional model with deformable alveolar walls. The study also considers tissue tension in the model to demonstrate the effect of the lung: tissue tensions are transferred by the lung parenchyma and produce the pressure gradient. This load expands the alveoli and establishes a sub-ambient (vacuum) pressure within the lungs. Thus, at the alveolar level, the flow field and the movement of the alveolar wall lead to an integrated effect. In this research, we assume that the three-dimensional alveolus has visco-elastic tissue (walls). For an accurate investigation of the effect of pulmonary tissue mechanical properties on particle transport and the alveolar flow field, the actual interplay between tissue movement and airflow is solved with a two-way FSI (Fluid-Structure Interaction) simulation technique in the alveolus. The essence of a realistic simulation of pulmonary breathing mechanics can therefore be achieved by developing a coupled FSI computational model. We therefore conduct a series of FSI simulations over a range of tissue models and breathing rates. As a result, the fluid flow and streamlines change in the present flexible model compared with the rigid models, and the two-way-coupled particle trajectories differ from those obtained with one-way particle coupling.

Keywords: FSI, two-way particle coupling, alveoli, CFD

Procedia PDF Downloads 230
819 Teachers’ Protective Factors of Resilience Scale: Factorial Structure, Validity and Reliability Issues

Authors: Athena Daniilidou, Maria Platsidou

Abstract:

Recently developed scales have specifically addressed teachers' resilience. Although they have benefited the field, they do not include some of the critical protective factors of teachers' resilience identified in the literature. To address this limitation, we aimed to design a more comprehensive scale for measuring teachers' resilience that encompasses various personal and environmental protective factors. To this end, two studies were carried out. In Study 1, 407 primary school teachers were tested with the new scale, the Teachers' Protective Factors of Resilience Scale (TPFRS). Similar scales, such as the Multidimensional Teachers' Resilience Scale and the Teachers' Resilience Scale, were used to test the convergent validity, while the Maslach Burnout Inventory and the Teachers' Sense of Efficacy Scale were used to assess the discriminant validity of the new scale. The factorial structure of the TPFRS was checked with confirmatory factor analysis, and a good fit of the model to the data was found. Next, item response theory analysis using a two-parameter logistic model (2PL) was applied to check the items within each factor; it revealed that 9 items did not fit their factors well, and they were removed. The final version of the TPFRS includes 29 items, which assess six protective factors of teachers' resilience: values and beliefs (5 items, α=.88), emotional and behavioral adequacy (6 items, α=.74), physical well-being (3 items, α=.68), relationships within the school environment (6 items, α=.73), relationships outside the school environment (5 items, α=.84), and the legislative framework of education (4 items, α=.83). The scale presents satisfactory convergent and discriminant validity. Study 2, in which 964 primary and secondary school teachers were tested, confirmed the factorial structure of the TPFRS as well as its discriminant validity, which was tested with the Schutte Emotional Intelligence Scale-Short Form. In conclusion, our results show that the TPFRS is a valid multi-dimensional instrument for assessing teachers' protective factors of resilience, and it can be safely used in future research and interventions in the teaching profession.
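For reference, the two-parameter logistic (2PL) model used in the item analysis, written in its standard form: θ is the respondent's latent resilience level, a_i the item discrimination, and b_i the item difficulty.

```latex
P\!\left(X_{i}=1 \mid \theta\right) \;=\; \frac{1}{1+\exp\!\big(-a_i\,(\theta - b_i)\big)}
```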

Keywords: resilience, protective factors, teachers, item response theory

Procedia PDF Downloads 66
818 The Effect of Action Potential Duration and Conduction Velocity on Cardiac Pumping Efficacy: Simulation Study

Authors: Ana Rahma Yuniarti, Ki Moo Lim

Abstract:

Slowed myocardial conduction velocity (CV) and shortened action potential duration (APD) are associated with an increased risk of re-entrant excitation, predisposing to cardiac arrhythmia, because both CV reduction and APD shortening shorten the excitation wavelength. In this study, we investigated quantitatively the cardiac mechanical responses under various CV and APD values using a multi-scale computational model of the heart. The model consisted of an electrical model coupled with a mechanical contraction model, together with a lumped model of the circulatory system. The electrical model consisted of 149,344 nodes and 183,993 elements of tetrahedral mesh, whereas the mechanical model consisted of 356 nodes and 172 elements of hexahedral mesh with Hermite basis functions. We performed the electrical simulation under two scenarios: 1) varying the CV values with constant APD, and 2) varying the APD values with constant CV. We then compared the electrical and mechanical responses for both scenarios. Our simulations showed that faster CV and longer APD produced the largest resulting wavelength and generated better cardiac pumping efficacy by increasing the cardiac output and consuming less energy. This is because the longer wavelength and faster conduction generated a more synchronous contraction of the whole ventricle.
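The wavelength relation implied above, which is why shortening either CV or APD narrows the excitable gap and favours re-entry; the numerical values are a generic illustration, not results from the paper.

```latex
\lambda \;=\; \mathrm{CV} \times \mathrm{APD}, \qquad
\text{e.g. } \lambda = 0.5\ \mathrm{m/s} \times 0.3\ \mathrm{s} = 0.15\ \mathrm{m} = 15\ \mathrm{cm}.
```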

Keywords: conduction velocity, action potential duration, mechanical contraction model, circulatory model

Procedia PDF Downloads 182
817 A Heuristic Based Decomposition Approach for a Hierarchical Production Planning Problem

Authors: Nusrat T. Chowdhury, M. F. Baki, A. Azab

Abstract:

The production planning problem is concerned with specifying the optimal quantities to produce in order to meet the demand for a prespecified planning horizon with the least possible expenditure. Making the right decisions in production planning will affect directly the performance and productivity of a manufacturing firm, which is important for its ability to compete in the market. Therefore, developing and improving solution procedures for production planning problems is very significant. In this paper, we develop a Dantzig-Wolfe decomposition of a multi-item hierarchical production planning problem with capacity constraint and present a column generation approach to solve the problem. The original Mixed Integer Linear Programming model of the problem is decomposed item by item into a master problem and a number of subproblems. The capacity constraint is considered as the linking constraint between the master problem and the subproblems. The subproblems are solved using the dynamic programming approach. We also propose a multi-step iterative capacity allocation heuristic procedure to handle any kind of infeasibility that arises while solving the problem. We compare the computational performance of the developed solution approach against the state-of-the-art heuristic procedure available in the literature. The results show that the proposed heuristic-based decomposition approach improves the solution quality by 20% as compared to the literature.

Keywords: inventory, multi-level capacitated lot-sizing, emission control, setup carryover

Procedia PDF Downloads 113