Search results for: dual phase lag model
17000 Elastic and Plastic Collision Comparison Using Finite Element Method
Authors: Gustavo Rodrigues, Hans Weber, Larissa Driemeier
Abstract:
The prediction of post-impact conditions and the behavior of bodies during impact have been the object of several collision models. The formulation generally used derives from Hertz's theory and dates from the 19th century. These models consider the repulsive force to be proportional to the deformation of the bodies in contact, and may also consider it proportional to the rate of deformation. The objective of the present work is to analyze the behavior of bodies during impact using the Finite Element Method (FEM) with elastic and plastic material models. The main parameters evaluated are the contact force, the contact time, and the deformation of the bodies. An advantage of the FEM approach is the possibility of applying plastic deformation to the model according to the material definition: the Johnson–Cook plasticity model is used, whose parameters are obtained through empirical tests on real materials. This model allows analyzing the permanent deformation caused by impact, a phenomenon observed in the real world depending on the forces applied to the body. The results are compared with each other and with the Hertz-theory-based model.
Keywords: collision, impact models, finite element method, Hertz Theory
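The Hertz-type force law that serves as the baseline for the FEM comparison can be sketched in a few lines. The stiffness value, the Hunt-Crossley-style damping form, and the function name below are illustrative assumptions, not the paper's actual model:

```python
def hertz_contact_force(delta, delta_dot, k, c=0.0):
    """Hertz-type repulsive contact force.

    delta: indentation of the contacting bodies (m)
    delta_dot: rate of indentation (m/s)
    k: contact stiffness (depends on geometry and elastic moduli)
    c: optional damping coefficient for the rate-of-deformation term
    """
    if delta <= 0.0:
        return 0.0  # bodies are not in contact
    elastic = k * delta ** 1.5              # classic Hertz term: F = k * delta^(3/2)
    damping = c * delta ** 1.5 * delta_dot  # Hunt-Crossley-style dissipation
    return elastic + damping
```

In such a model the contact force vanishes outside contact and grows nonlinearly with indentation, which is the behavior the FEM results are compared against.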
Procedia PDF Downloads 176
16999 A Hybrid Traffic Model for Smoothing Traffic Near Merges
Authors: Shiri Elisheva Decktor, Sharon Hornstein
Abstract:
Highway merges and unmarked junctions are key components of any urban road network that can act as bottlenecks and create traffic disruption. Inefficient highway merges may trigger traffic instabilities such as stop-and-go waves, pose safety risks, and lead to longer journey times. These phenomena occur spontaneously if the average vehicle density exceeds a certain critical value. This study focuses on modeling the traffic using a microscopic traffic flow model. A hybrid traffic model, which combines human-driven and controlled vehicles, is assumed. The controlled vehicles obey different driving policies when approaching the merge or in the vicinity of other vehicles. We developed a co-simulation model in SUMO (Simulation of Urban Mobility), in which the human-driven cars are modeled using the IDM model and the controlled cars are modeled using a dedicated controller. The scenario chosen for this study is a closed track with one merge and one exit, which could later be implemented using scaled infrastructure in our lab setup. This will enable us to benchmark the simulation results of this study against comparable results under similar conditions in the lab. The metrics chosen for comparing the performance of our algorithm on the overall traffic conditions include the average speed, the wait time near the merge, and the throughput after the merge, measured under different travel demand conditions (low, medium, and heavy traffic).
Keywords: highway merges, traffic modeling, SUMO, driving policy
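For reference, the IDM car-following law used for the human-driven vehicles can be sketched as follows; the parameter defaults are common textbook values, not the ones calibrated in this study:

```python
import math

def idm_acceleration(v, delta_v, gap, v0=33.3, T=1.6, a=0.73, b=1.67, s0=2.0):
    """Intelligent Driver Model (IDM) acceleration.

    v: own speed (m/s); delta_v: approach rate v - v_leader (m/s);
    gap: bumper-to-bumper gap to the leader (m).
    v0: desired speed, T: time headway, a: maximum acceleration,
    b: comfortable deceleration, s0: minimum standstill gap.
    """
    # desired dynamic gap, including the intelligent braking term
    s_star = s0 + v * T + v * delta_v / (2.0 * math.sqrt(a * b))
    return a * (1.0 - (v / v0) ** 4 - (s_star / gap) ** 2)
```

On a free road the model accelerates toward the desired speed, while closing fast on a slow leader produces strong braking, which is the mechanism behind the stop-and-go waves mentioned above.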
Procedia PDF Downloads 108
16998 Construction of a Dynamic Migration Model of Extracellular Fluid in Brain for Future Integrated Control of Brain State
Authors: Tomohiko Utsuki, Kyoka Sato
Abstract:
In emergency medicine, it is recognized that brain resuscitation is very important for reducing the mortality rate and neurological sequelae. In particular, control of brain temperature (BT), intracranial pressure (ICP), and cerebral blood flow (CBF) is most required for stabilizing the brain's physiological state in the treatment of conditions such as brain injury, stroke, and encephalopathy. However, manual control of BT, ICP, and CBF frequently requires decisions and operations by medical staff concerning medication and the settings of therapeutic apparatus. Thus, integrating and automating this control is very effective not only for improving the therapeutic effect but also for reducing staff burden and medical cost. To realize such integration and automation, a mathematical model of the brain's physiological state is necessary as the controlled object in simulations, because performance tests of a prototype control system on patients are not ethically allowed. A model of cerebral blood circulation, the most basic part of the brain's physiological state, has already been constructed. A migration model of extracellular fluid in the brain has also been constructed; however, that model did not account for the fact that the total volume of the intracranial cavity is almost constant due to the hardness of the cranial bone. Therefore, in this research, a dynamic migration model of extracellular fluid in the brain was constructed that takes the constancy of the intracranial cavity's total volume into consideration. This model can be connected to the cerebral blood circulation model. The constructed model consists of fourteen compartments, twelve of which correspond to the perfused areas of the bilateral anterior, middle, and posterior cerebral arteries; the others correspond to the cerebral ventricles and the subarachnoid space.
This model makes it possible to calculate the migration of tissue fluid from capillaries to gray matter and white matter, the flow of tissue fluid between compartments, the production and absorption of cerebrospinal fluid at the choroid plexus and arachnoid granulations, and the production of metabolic water. Further, the volume, the colloid concentration, and the tissue pressure of/in each compartment can be calculated by solving 40-dimensional non-linear simultaneous differential equations. In this research, the obtained model was analyzed for validation under four conditions: a normal adult, an adult with higher cerebral capillary pressure, an adult with lower cerebral capillary pressure, and an adult with lower colloid concentration in the cerebral capillaries. In the results, the calculated fluid flow, tissue volume, colloid concentration, and tissue pressure all converged to values suitable for the set condition within 60 minutes at most. Because these results did not conflict with prior knowledge, it is certain that the model can adequately represent the physiological state of the brain, at least under such limited conditions. One of the next challenges is to integrate this model with the already constructed cerebral blood circulation model. This modification will enable more precise simulation of CBF and ICP by calculating the effect of blood pressure changes on extracellular fluid migration and that of ICP changes on CBF.
Keywords: dynamic model, cerebral extracellular migration, brain resuscitation, automatic control
Procedia PDF Downloads 159
16997 A Fuzzy Linear Regression Model Based on Dissemblance Index
Authors: Shih-Pin Chen, Shih-Syuan You
Abstract:
Fuzzy regression models are useful for investigating the relationship between explanatory variables and responses in fuzzy environments. To overcome the deficiencies of previous models and increase the explanatory power of fuzzy data, the graded mean integration (GMI) representation is applied to determine representative crisp regression coefficients. A fuzzy regression model is then constructed based on a modified dissemblance index (MDI), which can precisely measure the actual total error. Based on the proposed MDI and a distance criterion, comparisons with previous studies on commonly used test examples show that the proposed fuzzy linear regression model has higher explanatory power and forecasting accuracy.
Keywords: dissemblance index, fuzzy linear regression, graded mean integration, mathematical programming
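The graded mean integration representation mentioned above has a simple closed form for triangular fuzzy numbers, which is enough to sketch how crisp representatives are obtained (a minimal illustration, not the paper's full MDI-based program):

```python
def graded_mean_integration(l, m, u):
    """Graded mean integration (GMI) of a triangular fuzzy number (l, m, u),
    with l <= m <= u: GMI = (l + 4m + u) / 6, a crisp representative value
    weighted toward the mode m."""
    return (l + 4.0 * m + u) / 6.0
```

For a symmetric triangular fuzzy number the GMI coincides with the mode; for a skewed one it is pulled toward the longer tail, e.g. `graded_mean_integration(1, 2, 3)` gives 2.0 while `graded_mean_integration(2, 3, 10)` gives 4.0.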
Procedia PDF Downloads 446
16996 Mathematical Model of Corporate Bond Portfolio and Effective Border Preview
Authors: Sergey Podluzhnyy
Abstract:
One of the most important tasks in investment and pension fund management is building a decision support system that helps make the right decisions on corporate bond portfolio formation. Today there are several basic methods of bond portfolio management: duration management, immunization, and convexity management. These methods have a serious disadvantage: they do not take into account the credit risk, or insolvency risk, of the issuer. Consequently, they can be applied only to the management and evaluation of high-quality sovereign bonds. This article proposes a mathematical model for building a corporate bond portfolio that is optimal with respect to risk and yield. The proposed model takes the default probability into account in the bond valuation formula, which results in a more correct evaluation of bond prices. Moreover, the model provides tools for visualizing the efficient frontier of a corporate bond portfolio taking the exposure to credit risk into account, which will increase the quality of portfolio managers' investment decisions.
Keywords: corporate bond portfolio, default probability, efficient frontier, portfolio optimization task
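The idea of folding a default probability into bond valuation can be sketched with a survival-weighted discounted cash flow. The constant annual default probability, the 40% recovery rate, and the annual-coupon structure below are illustrative assumptions, not the formula proposed in the article:

```python
def risky_bond_price(face, coupon_rate, years, ytm, pd, recovery=0.4):
    """Price a corporate bond with annual coupons, weighting each cash flow
    by the issuer's survival probability.

    pd: constant annual default probability.
    On default in year t, the holder is assumed to recover recovery * face.
    """
    price = 0.0
    survival = 1.0                          # probability of surviving to year t-1
    for t in range(1, years + 1):
        default_in_t = survival * pd        # survived to t-1, defaults during year t
        survival *= 1.0 - pd                # survival through year t
        cash_flow = coupon_rate * face * survival + default_in_t * recovery * face
        price += cash_flow / (1.0 + ytm) ** t
    price += face * survival / (1.0 + ytm) ** years  # redemption if no default
    return price
```

With pd = 0 this reduces to the ordinary bond pricing formula; a positive default probability lowers the price, which is the correction to bond valuation the abstract describes.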
Procedia PDF Downloads 322
16995 Challenging Barriers to the Evolution of the Saudi Animation Industry Life-Cycle
Authors: Ohud Alharbi, Emily Baines
Abstract:
The animation industry is one of the creative industries that have attracted recent historiographical attention. However, there has been very limited research on the Saudi Arabian and wider Arabian animation industries, while a large number of studies have covered North America, Europe, and East Asia. The existing studies show that developed countries such as the USA, Japan, and the UK have reached the Maturity stage of their animation industry life-cycle. On the other hand, developing countries that are still in the Introduction phase of the industry life-cycle face challenges in improving their industries. Saudi Arabia is one of the countries whose animation industry is still in its infancy. Thus, the aim of this paper is to address the main barriers that hinder the evolution of the industry life-cycle for Saudi animation, challenges that are also relevant to many other early-stage industries in developing countries. These barriers are analysed using the early mobility barriers defined by Porter, to provide a conceptual structure for defining recommendations that enable the transition to a strong Growth-phase industry. This study utilized qualitative methods to collect data, involving in-depth interviews, document analysis, and observations. It also undertook a comparative case study approach to investigate the animation industry life-cycle, with three selected case studies whose industries are more developed than Saudi animation. The case studies include the United Kingdom, which represents a Mature animation industry; Egypt, which represents an established Growth-stage industry; and the United Arab Emirates, which is an early Growth-stage industry. This study suggests adopting appropriate strategies, arising as findings from the comparative case studies, to overcome the barriers and facilitate the growth of the Saudi animation industry.
Keywords: barriers, industry life-cycle, Saudi animation, industry
Procedia PDF Downloads 584
16994 Human Brain Organoids-on-a-Chip Systems to Model Neuroinflammation
Authors: Feng Guo
Abstract:
Human brain organoids, 3D brain tissue cultures derived from human pluripotent stem cells, hold promising potential in modeling neuroinflammation for a variety of neurological diseases. However, challenges remain in generating standardized human brain organoids that can recapitulate key physiological features of a human brain. Here, this study presents a series of organoids-on-a-chip systems to generate better human brain organoids and model neuroinflammation. By employing 3D printing and microfluidic 3D cell culture technologies, the study’s systems enable the reliable, scalable, and reproducible generation of human brain organoids. Compared with conventional protocols, this study’s method increased neural progenitor proliferation and reduced heterogeneity of human brain organoids. As a proof-of-concept application, the study applied this method to model substance use disorders.
Keywords: human brain organoids, microfluidics, organ-on-a-chip, neuroinflammation
Procedia PDF Downloads 205
16993 Generation of Charged Nanoparticles and Their Contribution to the Thin Film and Nanowire Growth during Chemical Vapour Deposition
Authors: Seung-Min Yang, Seong-Han Park, Sang-Hoon Lee, Seung-Wan Yoo, Chan-Soo Kim, Nong-Moon Hwang
Abstract:
The theory of charged nanoparticles suggests that in many Chemical Vapour Deposition (CVD) processes, Charged Nanoparticles (CNPs) are generated in the gas phase and become the building blocks of thin films and nanowires. Recently, nanoparticle-based crystallization has become a major issue, since the growth of nanorods or crystals from nanoparticle building blocks has been observed directly by transmission electron microscopy in liquid cells. In an effort to confirm the charged gas-phase nuclei that might be generated under conventional processing conditions for thin films and nanowires during CVD, we performed in-situ measurements using a differential mobility analyser and a particle beam mass spectrometer. The size distribution and number density of CNPs were affected by process parameters such as precursor flow rate and working temperature. It was shown that many films and nanostructures, which had been believed to grow from individual atoms or molecules, actually grow from the building blocks of such charged nuclei. The electrostatic interaction between CNPs and the growing surface induces self-assembly into films and nanowires. In addition, charge-enhanced atomic diffusion makes CNPs liquid-like quasi-solids. As a result, CNPs tend to land epitaxially on the growing surface, which results in the growth of single-crystalline nanowires with smooth surfaces.
Keywords: chemical vapour deposition, charged nanoparticle, electrostatic force, nanostructure evolution, differential mobility analyser, particle beam mass spectrometer
Procedia PDF Downloads 458
16992 Computer-Based Model for Design Selection of Lightning Arrester for 132/33kV Substation
Authors: Uma U. Uma, Uzoechi Laz
Abstract:
The protection of equipment insulation against lightning overvoltages, and the selection of a lightning arrester that will discharge at a voltage level lower than that required to break down the electrical equipment's insulation, are examined. The objective of this paper is to design a computer-based model, using standard equations, for selecting the lowest-rated surge arrester that will provide adequate protection of equipment insulation and also have a satisfactory service life when connected to a specified line voltage in a power system network. The effectiveness or non-effectiveness of the substation's earthing system determines the arrester properties. A MATLAB program with a GUI (graphical user interface) subprogram is used to develop the model for determining the required parameters, such as the voltage rating, impulse spark-over voltage, power-frequency spark-over voltage, discharge current, current rating, and protection level of a lightning arrester for a specified line voltage level.
Keywords: lightning arrester, GUIs, MATLAB program, computer-based model
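A first-pass version of the arrester voltage-rating step can be sketched with the usual rule of thumb; the 10% voltage tolerance and the 0.8 coefficient of earthing for an effectively earthed system are standard textbook values, not the outputs of the paper's MATLAB model:

```python
def arrester_rated_voltage(nominal_kv, voltage_tolerance=1.1, earthing_coeff=0.8):
    """Rule-of-thumb surge arrester voltage rating (kV):
    rating = highest system voltage * coefficient of earthing.

    nominal_kv: nominal line-to-line voltage (kV)
    voltage_tolerance: allowance for the highest system voltage (e.g. +10%)
    earthing_coeff: 0.8 for an effectively earthed system, 1.0 for unearthed
    """
    return nominal_kv * voltage_tolerance * earthing_coeff
```

For a 132 kV effectively earthed system this gives about 116 kV, which would then be rounded up to the nearest standard arrester rating before the spark-over and discharge-current checks.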
Procedia PDF Downloads 422
16991 An Optimal Bayesian Maintenance Policy for a Partially Observable System Subject to Two Failure Modes
Authors: Akram Khaleghei Ghosheh Balagh, Viliam Makis, Leila Jafari
Abstract:
In this paper, we present a new maintenance model for a partially observable system subject to two failure modes, namely a catastrophic failure and a failure due to the system degradation. The system is subject to condition monitoring and the degradation process is described by a hidden Markov model. A cost-optimal Bayesian control policy is developed for maintaining the system. The control problem is formulated in the semi-Markov decision process framework. An effective computational algorithm is developed and illustrated by a numerical example.
Keywords: partially observable system, hidden Markov model, competing risks, multivariate Bayesian control
Procedia PDF Downloads 461
16990 Modeling of a Pilot Installation for the Recovery of Residual Sludge from Olive Oil Extraction
Authors: Riad Benelmir, Muhammad Shoaib Ahmed Khan
Abstract:
The socio-economic importance of olive oil production is significant in the Mediterranean region, both in terms of wealth and tradition. However, the extraction of olive oil generates huge quantities of waste that may have a great impact on the land and water environment because of its high phytotoxicity. In particular, olive mill wastewater (OMWW) is one of the major environmental pollutants of the olive oil industry. This work aims to design smart and sustainable integrated thermochemical catalytic processes for olive mill residues, comprising hydrothermal carbonization (HTC) of olive mill wastewater (OMWW) and fast pyrolysis of olive mill wastewater sludge (OMWS). The byproducts of the OMWW HTC treatment are a carbon-enriched solid phase, called biochar, and a liquid phase (residual water with less dissolved organic and phenolic compounds). The HTC biochar can be tested as a fuel in combustion systems and also utilized in high-value applications, such as a soil bio-fertilizer and as a catalyst or catalyst support. The HTC residual water is characterized, treated, and used for soil irrigation, since the organic and toxic compounds will be reduced below the permitted limits. The project's concept also includes the conversion of OMWS to green diesel through a catalytic pyrolysis process. The green diesel is then used as a biofuel in an internal combustion engine (IC engine) for clean transportation. In this work, a theoretical study considers the use of heat from the non-condensable pyrolysis gases in a sorption refrigeration machine for cooling the pyrolysis gases and condensing the bio-oil vapors.
Keywords: biomass, olive oil extraction, adsorption cooling, pyrolysis
Procedia PDF Downloads 96
16989 Target and Equalizer Design for Perpendicular Heat-Assisted Magnetic Recording
Authors: P. Tueku, P. Supnithi, R. Wongsathan
Abstract:
Heat-Assisted Magnetic Recording (HAMR) is one of the leading technologies identified to enable areal densities beyond 1 Tb/in² in magnetic recording systems. Key challenges in HAMR design are the accuracy of positioning, the timing and power of the firing laser, the thermo-magnetic head, the head-disk interface, and the cooling system. We study the effect of HAMR parameters on the transition center and the transition width. The HAMR system is modeled using the Thermal Williams-Comstock (TWC) model and the microtrack model. The target and equalizer are designed by minimizing the mean square error (MMSE). The results show that the unit energy constraint outperforms the other constraints.
Keywords: heat-assisted magnetic recording, thermal Williams-Comstock equation, microtrack model, equalizer
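The MMSE design step can be illustrated with a generic least-squares FIR equalizer that stacks delayed copies of the readback signal and solves for the tap weights. This is a plain Wiener-style design under assumed circular delays, not the paper's joint target/equalizer optimization with the unit energy constraint:

```python
import numpy as np

def mmse_equalizer(received, desired, n_taps):
    """Least-squares (MMSE) FIR equalizer design.

    Builds a matrix whose columns are delayed copies of the received
    signal and solves min_w ||desired - X w||^2 for the tap weights w.
    Circular delays (np.roll) are used for simplicity.
    """
    X = np.column_stack([np.roll(received, k) for k in range(n_taps)])
    w, *_ = np.linalg.lstsq(X, desired, rcond=None)
    return w
```

If the desired sequence is just the received one advanced by a sample, the solver should recover a single unit tap at delay one, which is a convenient sanity check for the setup.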
Procedia PDF Downloads 357
16988 Developing High-Definition Flood Inundation Maps (HD-FIMs) Using Raster Adjustment with Scenario Profiles (RASP™)
Authors: Robert Jacobsen
Abstract:
Flood inundation maps (FIMs) are an essential tool for communicating flood threat scenarios to the public as well as for floodplain governance. With increasing demand for online raster FIMs, the FIM state of the practice (SOP) is rapidly advancing to meet the dual requirements of high resolution and high accuracy, i.e., high definition. Importantly, today's technology also enables the resolution of local, neighborhood-scale bias errors that often occur in FIMs, even with the use of SOP two-dimensional flood modeling. To facilitate the development of HD-FIMs, a new GIS method, Raster Adjustment with Scenario Profiles (RASP™), is described for adjusting kernel raster FIMs to match refined scenario profiles. With RASP™, flood professionals can prepare HD-FIMs for a wide range of scenarios from available kernel rasters, including kernel rasters prepared from vector FIMs. The paper provides detailed procedures for RASP™, along with an example of applying RASP™ to prepare an HD-FIM for the August 2016 flood in Louisiana using both an SOP kernel raster and a kernel raster derived from an older vector-based flood insurance rate map. The accuracy of the HD-FIMs achieved by applying RASP™ to the two kernel rasters is evaluated.
Keywords: hydrology, mapping, high-definition, inundation
Procedia PDF Downloads 85
16987 Seaweed as a Future Fuel Option: Potential and Conversion Technologies
Authors: Muhammad Rizwan Tabassum, Ao Xia, Jerry D. Murphy
Abstract:
The purpose of this work is to provide a comprehensive overview of seaweed as an alternative feedstock for biofuel production and of the key conversion technologies. Resource depletion and climate change are the driving forces in the hunt for renewable sources of energy. Macroalgae can be preferred over land-based crops for biofuel production because they do not compete with food crops for arable land, have high growth rates, and have low lignin contents, which require less energy-intensive pre-treatment. However, some disadvantages, such as high moisture content, seasonal variation in chemical composition, and process inhibition, limit their economic feasibility. Seaweed can be converted into gaseous and liquid fuels by different conversion technologies, but biogas from seaweed via anaerobic digestion is attracting increased attention due to its dual benefit as an economical source of biofuel and an environmentally friendly technology. Biodiesel and bioethanol conversion technologies for seaweed are still under development. Selecting high-yielding seaweed species, an optimal harvesting season, and process optimization can make seaweed an economically feasible alternative source of renewable and sustainable feedstock for biofuel in the future.
Keywords: anaerobic digestion, biofuel, bio-methane, conversion technologies, seaweed
Procedia PDF Downloads 475
16986 Tibyan Automated Arabic Correction Using Machine-Learning in Detecting Syntactical Mistakes
Authors: Ashwag O. Maghraby, Nida N. Khan, Hosnia A. Ahmed, Ghufran N. Brohi, Hind F. Assouli, Jawaher S. Melibari
Abstract:
The Arabic language is one of the most important languages. Learning it matters to many people around the world because of its religious and economic importance, and the real challenge lies in practicing it without grammatical or syntactical mistakes. This research focuses on detecting and correcting syntactic mistakes in Arabic according to their position in the sentence, concentrating on two of the main syntactical rules in Arabic: the dual and the plural. The system analyzes each sentence in the text, using the Stanford CoreNLP morphological analyzer and a machine-learning approach, in order to detect syntactical mistakes and then correct them. A prototype of the proposed system was implemented and evaluated. It uses the support vector machine (SVM) algorithm to detect Arabic grammatical errors and corrects them using a rule-based approach. The prototype system achieves a fair accuracy of 81%. In general, it offers a set of useful grammatical suggestions that the user may overlook while writing, due to unfamiliarity with the grammar or to the speed of writing, such as alerting the user when a plural term is used to indicate one person.
Keywords: Arabic language acquisition and learning, natural language processing, morphological analyzer, part-of-speech
Procedia PDF Downloads 157
16985 Random Subspace Ensemble of CMAC Classifiers
Authors: Somaiyeh Dehghan, Mohammad Reza Kheirkhahan Haghighi
Abstract:
The rapid growth of domains whose data have a large number of features, while the number of samples is limited, has caused difficulty in constructing strong classifiers. Reducing the dimensionality of the feature space therefore becomes an essential step in classification tasks. The random subspace method (or attribute bagging) is an ensemble classifier consisting of several base learners, each trained on a subset of the features. In the present paper, we introduce a Random Subspace Ensemble of CMAC neural networks (RSE-CMAC), each of which is trained on a subset of the features, and use this model for classification tasks. To evaluate the performance of our model, we compare it with the bagging algorithm on 36 UCI datasets. The results reveal that the new model performs better.
Keywords: classification, random subspace, ensemble, CMAC neural network
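The random subspace procedure itself is easy to sketch. Since the CMAC network is not reproduced here, a nearest-centroid learner stands in as the base classifier; the class below is an illustrative outline, not the paper's RSE-CMAC implementation:

```python
import numpy as np

class RandomSubspaceEnsemble:
    """Random subspace (attribute bagging) ensemble: each base learner is
    trained on a random subset of the features, and predictions are
    combined by majority vote."""

    def __init__(self, n_estimators=10, subspace_ratio=0.5, seed=0):
        self.n_estimators = n_estimators
        self.subspace_ratio = subspace_ratio
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        k = max(1, int(self.subspace_ratio * n_features))
        self.classes_ = np.unique(y)
        self.members_ = []
        for _ in range(self.n_estimators):
            feats = self.rng.choice(n_features, size=k, replace=False)
            # nearest-centroid base learner restricted to the chosen features
            centroids = np.array([X[y == c][:, feats].mean(axis=0)
                                  for c in self.classes_])
            self.members_.append((feats, centroids))
        return self

    def predict(self, X):
        votes = np.zeros((X.shape[0], len(self.classes_)), dtype=int)
        for feats, centroids in self.members_:
            sub = X[:, feats]
            dist = np.linalg.norm(sub[:, None, :] - centroids[None, :, :], axis=2)
            votes[np.arange(X.shape[0]), dist.argmin(axis=1)] += 1
        return self.classes_[votes.argmax(axis=1)]
```

Each member sees only a k-dimensional slice of the data, which is what gives the method its robustness when features are many and samples are few.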
Procedia PDF Downloads 336
16984 Profiling Risky Code Using Machine Learning
Authors: Zunaira Zaman, David Bohannon
Abstract:
This study explores the application of machine learning (ML) to detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as confidence scores, tuning of false positives and negatives, and automated feedback. The initial approach, using natural language processing techniques to extract features, achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and to generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine-learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology on two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite was used to predict specific vulnerabilities such as OS command injection, cryptographic, and cross-site scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS command injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques, and CNN models with ensemble modelling techniques, did not generalize well on unseen data and faced overfitting issues.
However, predicting vulnerabilities in source code using machine learning poses challenges, such as the high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.
Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties
Procedia PDF Downloads 112
16983 ELD79-LGD2006 Transformation Techniques Implementation and Accuracy Comparison in Tripoli Area, Libya
Authors: Jamal A. Gledan, Othman A. Azzeidani
Abstract:
During the last decade, Libya established a new geodetic datum, the Libyan Geodetic Datum 2006 (LGD2006), using GPS, whereas ground traversing had been used to establish the previous Libyan datum, the Europe Libyan Datum 79 (ELD79). This paper introduces ELD79-to-LGD2006 coordinate transformation techniques and an accuracy comparison between multiple regression equations and the three-parameter (Bursa-Wolf) model. The results obtained show that the overall accuracy of the stepwise multiple regression equations is better than that determined using the Bursa-Wolf transformation model.
Keywords: geodetic datum, horizontal control points, traditional similarity transformation model, unconventional transformation techniques
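The three-parameter (translation-only Bursa-Wolf) model reduces to estimating one offset vector from common control points and adding it to every geocentric coordinate. The sketch below uses placeholder data, not the published ELD79/LGD2006 control values:

```python
import numpy as np

def fit_three_parameter_shift(src, dst):
    """Estimate the translation vector (dX, dY, dZ) of a three-parameter
    (Bursa-Wolf without rotations or scale) transformation by least squares:
    the mean coordinate difference over common control points."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    return (dst - src).mean(axis=0)

def apply_shift(src, t):
    """Apply the estimated datum shift to geocentric coordinates."""
    return np.asarray(src, dtype=float) + t
```

The multiple-regression alternative the paper favors would replace the constant offset with polynomial functions of position, which is why it can absorb local distortions that a single translation cannot.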
Procedia PDF Downloads 311
16982 First Digit Lucas, Fibonacci and Benford Number in Financial Statement
Authors: Teguh Sugiarto, Amir Mohamadian Amiri
Abstract:
Background: This study aims to explore whether there is fraud in a company's financial reporting by examining the distribution of first digits against the Lucas, Fibonacci, and Benford number models. Research methods: The author uses the first-digit models of Lucas, Fibonacci, and Benford, distinguishing cases where the frequency of occurrence of first digits deviates from the Lucas, Fibonacci, or Benford expectations by more or less than 5%. If there is a significant difference above or below 5%, a follow-up process to detect fraud in the financial statements can be undertaken. Findings: From the research that has been done, it can be concluded that the first-digit frequencies in the financial statements of PT Bank BRI Tbk in a given year yield consistent results for the Lucas, Fibonacci, and Benford models.
Keywords: Lucas, Fibonacci, Benford, first digit
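The Benford part of the first-digit test is straightforward to sketch: compare observed first-digit frequencies with log10(1 + 1/d) and flag digits whose deviation exceeds the chosen 5% threshold. The 5% rule comes from the abstract; the function names are illustrative, and the Lucas and Fibonacci variants would substitute their own expected frequencies:

```python
import math
from collections import Counter

def first_digit(x):
    """Leading significant digit of a nonzero number."""
    s = f"{abs(x):.15e}"  # scientific notation: the first character is the digit
    return int(s[0])

def benford_expected(d):
    """Benford's law: P(first digit = d) = log10(1 + 1/d)."""
    return math.log10(1.0 + 1.0 / d)

def first_digit_deviation(values):
    """Absolute deviation between observed and Benford first-digit
    frequencies for each digit 1-9; a deviation above a chosen
    threshold (e.g. 5%) flags the digit for follow-up."""
    counts = Counter(first_digit(v) for v in values if v != 0)
    n = sum(counts.values())
    return {d: abs(counts.get(d, 0) / n - benford_expected(d))
            for d in range(1, 10)}
```

In practice the deviations would be computed over all line items of a financial statement; a clean Benford-conforming set keeps every digit's deviation well under the 5% threshold.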
Procedia PDF Downloads 277
16981 Meta Mask Correction for Nuclei Segmentation in Histopathological Image
Authors: Jiangbo Shi, Zeyu Gao, Chen Li
Abstract:
Nuclei segmentation is a fundamental task in digital pathology analysis and can be automated by deep-learning-based methods. However, developing such an automated method requires a large amount of data with precisely annotated masks, which are hard to obtain. Training with weakly labeled data is a popular solution for reducing the annotation workload. In this paper, we propose a novel meta-learning-based nuclei segmentation method that follows the label correction paradigm to leverage data with noisy masks. Specifically, we design a fully convolutional meta-model that can correct noisy masks using a small amount of clean meta-data. The corrected masks are then used to supervise the training of the segmentation model. Meanwhile, a bi-level optimization method is adopted to alternately update the parameters of the main segmentation model and the meta-model. Extensive experimental results on two nuclei segmentation datasets show that our method achieves state-of-the-art results. In particular, in some noise scenarios it even exceeds the performance of training on supervised data.
Keywords: deep learning, histopathological image, meta-learning, nuclei segmentation, weak annotations
Procedia PDF Downloads 143
16980 A Multi-Attribute Utility Model for Performance Evaluation of Sustainable Banking
Authors: Sonia Rebai, Mohamed Naceur Azaiez, Dhafer Saidane
Abstract:
In this study, we develop a performance evaluation model based on a multi-attribute utility approach, aimed at reaching sustainable banking (SB) status. The model is built to account for the various stakeholders of banks in a win-win paradigm. In addition, it offers the opportunity to adopt a global measure of performance as an indication of a bank's degree of sustainability. This measure is referred to as the banking sustainability performance index (BSPI). The index may constitute a basis for ranking banks. Moreover, it may serve as a bridge between the assessment types of financial and extra-financial rating agencies. A real application is performed on three French banks.
Keywords: multi-attribute utility theory, performance, sustainable banking, financial rating
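An additive multi-attribute utility aggregation, the simplest form such a global index can take, can be sketched as follows; the additive form, the weights, and the [0, 1] utility scale are illustrative assumptions rather than the paper's exact BSPI definition:

```python
def bspi(utilities, weights):
    """Additive multi-attribute utility aggregation: a sustainability
    performance index as the weighted average of per-stakeholder
    utilities, each assumed to lie in [0, 1]."""
    if len(utilities) != len(weights):
        raise ValueError("one weight per attribute is required")
    total_weight = sum(weights)
    return sum(u * w for u, w in zip(utilities, weights)) / total_weight
```

With utilities elicited per stakeholder group (clients, employees, regulators, shareholders), the resulting scalar can be used directly to rank banks, which is the role the abstract assigns to the BSPI.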
Procedia PDF Downloads 471
16979 Early Design Prediction of Submersible Maneuvers
Authors: Hernani Brinati, Mardel de Conti, Moyses Szajnbok, Valentina Domiciano
Abstract:
This study presents a mathematical model, with examples, for the numerical prediction of submersible maneuvers in the horizontal and vertical planes. The geometry of the submarine is taken as a body of revolution plus a sail, two horizontal rudders, and two vertical rudders. The model includes representations of the hull resistance and of the propeller thrust and torque, which makes it possible to account for the variation of the longitudinal component of the ship's velocity when maneuvering. The hydrodynamic forces are represented through power series expansions of the acceleration and velocity components. The hydrodynamic derivatives for the body of revolution are mostly estimated from fundamental principles applicable to the flow around airplane fuselages in the subsonic regime. The hydrodynamic forces on the sail and rudders are estimated based on finite-aspect-ratio wing theory. The objective of this study is to build an expedient model for predicting submarine maneuvers, based on fundamental principles, which may be convenient in the early stages of ship design. The model is tested against available numerical and experimental data.
Keywords: submarine maneuvers, submarine, maneuvering, dynamics
Procedia PDF Downloads 641
16978 Randomness in Cybertext: A Study on Computer-Generated Poetry from the Perspective of Semiotics
Authors: Hongliang Zhang
Abstract:
The use of chance procedures and randomizers in poetry writing can be traced back to surrealist works which, in appealing to Sigmund Freud's theories, remained logocentric. In the 1960s, random permutation and combination were used extensively by the Oulipo, John Cage, and Jackson Mac Low, further deconstructing the metaphysical presence of writing. Today, randomly generated digital poetry has emerged as a genre of cybertext that is co-authored by its readers. At the same time, the classical theories have been updated by cybernetics and media theories. N. Katherine Hayles reworked Jacques Lacan's concept of 'the floating signifiers' into 'the flickering signifiers', arguing that technology per se has become a part of textual production. This paper presents a historical review of computer-generated poetry from the perspective of semiotics, emphasizing that randomly generated digital poetry, which hands the dual tasks of interpretation and writing over to the readers, demonstrates the intervention of media technology in literature. With the participation of computerized algorithms and programming languages, poems randomly generated by computers have not only blurred the boundary between encoder and decoder but have also raised the issue of the human-machine relationship. It is also a significant feature of the cybertext that the productive process of the text is full of randomness.Keywords: cybertext, digital poetry, poetry generator, semiotics
Procedia PDF Downloads 179
16977 Application of Molecular Materials in the Manufacture of Flexible and Organic Devices for Photovoltaic Applications
Authors: Mariana Gomez Gomez, Maria Elena Sanchez Vergara
Abstract:
Many sustainable approaches to generating electric energy have emerged in the last few decades; one of them is solar cells. Yet, this approach has the disadvantage of highly polluting inorganic semiconductor manufacturing processes, so the use of molecular semiconductors must be considered. In this work, the allene compounds C24H26O4 and C24H26O5 were used as dopants to manufacture semiconductor films based on PbPc by the high-vacuum evaporation technique. IR spectroscopy was carried out to determine the phase and any significant chemical changes that may occur during thermal evaporation. According to UV-visible spectroscopy and Tauc's model, the deposition process generated thin films with an activation energy range of 1.47 to 1.55 eV for direct transitions and 1.29 to 1.33 eV for indirect transitions. These values place the manufactured films within the range of low-bandgap semiconductors. Flexible devices with the structure polyethylene terephthalate (PET)/indium tin oxide (ITO)/organic semiconductor/cubic close packed (CCP) were manufactured. The devices were characterized by evaluating their electrical conductivity using the four-probe collinear method, and I-V curves were obtained under different lighting conditions at room temperature. OS1 (PbPc/C24H26O4) showed Ohmic behavior, while OS2 (PbPc/C24H26O5) reached higher current values at lower voltages. The results show that semiconductor devices doped with allene compounds can be used in the manufacture of optoelectronic devices.Keywords: electrical properties, optical gap, phthalocyanine, thin film
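The Tauc analysis mentioned in the abstract extrapolates the linear region of (αhν)^r versus hν to the energy axis to estimate the optical gap. A minimal sketch of that extraction is shown below; the linear-region selection heuristic and the synthetic data layout are assumptions for illustration, not the authors' procedure.

```python
import numpy as np

def tauc_gap(hv, alpha, r=2.0):
    """Estimate the optical gap Eg from a Tauc plot.

    hv: photon energies in eV; alpha: absorption coefficients.
    r = 2 for direct allowed transitions, r = 0.5 for indirect ones.
    Fits the steepest-rising part of (alpha*hv)**r vs hv and returns
    the energy-axis intercept of that line.
    """
    hv = np.asarray(hv, dtype=float)
    y = (np.asarray(alpha, dtype=float) * hv) ** r
    dy = np.gradient(y, hv)
    mask = dy > 0.5 * dy.max()          # crude pick of the linear region
    slope, intercept = np.polyfit(hv[mask], y[mask], 1)
    return -intercept / slope           # y = 0 crossing gives Eg
```

On synthetic data built so that (αhν)^2 rises linearly above 1.4 eV, the function recovers a gap close to 1.4 eV; on real spectra the linear region is usually chosen by inspection rather than by a gradient threshold.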
Procedia PDF Downloads 254
16976 Military Use of Artificial Intelligence under International Humanitarian Law: Insights from Canada
Authors: Mahshid TalebianKiakalayeh
Abstract:
As AI technologies can be used by both civilians and soldiers, it is vital to consider the consequences emanating from military as well as civilian uses of AI; indeed, many of the same technologies are dual-use. This paper explores the military uses of AI and assesses their compliance with international legal norms. AI developments have not only changed the capacity of the military to conduct complex operations but have also increased legal concerns. The existence of a potential vacuum in legal principles on the military use of AI indicates the need for further study of compliance with International Humanitarian Law (IHL), the branch of international law that governs the conduct of hostilities. While the capabilities of new means of military AI continue to advance at incredible rates, this body of law seeks to limit the methods of warfare, protecting civilian persons who are not participating in an armed conflict. Implementing AI in the military realm raises potential ethical and legal challenges. For instance, when an artificial intelligence can perform a warfare task without any human involvement, a range of humanitarian debates arises as to whether this technology can distinguish between military and civilian targets, mainly because AI in fully autonomous military systems would not seem to carry the legal and ethical judgment required by IHL principles. The paper takes, as a case study, Canada's compliance with IHL in the area of AI and the related legal issues that are likely to arise as this country continues to develop military uses of AI.Keywords: artificial intelligence, military use, international humanitarian law, the Canadian perspective
Procedia PDF Downloads 193
16975 Median-Based Nonparametric Estimation of Returns in Mean-Downside Risk Portfolio Frontier
Authors: H. Ben Salah, A. Gannoun, C. de Peretti, A. Trabelsi
Abstract:
The Downside Risk (DSR) model for portfolio optimisation overcomes the drawbacks of the classical mean-variance model concerning the asymmetry of returns and the risk perception of investors. The optimisation of this model deals with a positive definite matrix that is endogenous with respect to the portfolio weights, an aspect that makes the problem far more difficult to handle. For this purpose, Athayde (2001) developed a recursive minimization procedure that ensures convergence to the solution. However, when only a finite number of observations is available, the resulting portfolio frontier is not very smooth. To overcome this, Athayde (2003) proposed a mean kernel estimation of the returns so as to create a smoother portfolio frontier; this technique provides an effect similar to the case of continuous observations. In this paper, taking advantage of the robustness of the median, we replace the mean estimator in Athayde's model by a nonparametric median estimator of the returns, and we give a new version of the former algorithm of Athayde (2001, 2003). We then analyse the properties of this improved portfolio frontier and apply the new method to real examples.Keywords: downside risk, kernel method, median, nonparametric estimation, semivariance
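The endogeneity mentioned above arises because the semicovariance matrix is built only from the observations where the current portfolio underperforms the benchmark, so the matrix changes whenever the weights change. A simplified, hedged sketch of the Athayde-style recursive minimization (unconstrained except for full investment; no short-sale restrictions, no kernel or median smoothing of the returns) is:

```python
import numpy as np

def dsr_weights(R, tau=0.0, iters=50, tol=1e-8):
    """Recursive mean-downside-risk minimization (illustrative sketch).

    R: T x N matrix of asset returns; tau: benchmark return.
    At each step the semicovariance matrix M is rebuilt from the
    observations where the *current* portfolio falls below tau, and the
    minimum-risk fully-invested weights are recomputed in closed form.
    """
    T, N = R.shape
    w = np.full(N, 1.0 / N)             # start from equal weights
    ones = np.ones(N)
    for _ in range(iters):
        port = R @ w
        D = R[port < tau] - tau          # shortfall observations
        M = D.T @ D / T                  # endogenous semicovariance matrix
        Minv = np.linalg.pinv(M)
        # minimize w' M w subject to sum(w) = 1 (Lagrangian closed form)
        w_new = Minv @ ones / (ones @ Minv @ ones)
        if np.max(np.abs(w_new - w)) < tol:
            return w_new
        w = w_new
    return w
```

The paper's contribution replaces the return estimator inside this loop with a nonparametric median estimator; that substitution is not shown here.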
Procedia PDF Downloads 495
16974 Integration of Hybrid PV-Wind in Three Phase Grid System Using Fuzzy MPPT without Battery Storage for Remote Area
Authors: Thohaku Abdul Hadi, Hadyan Perdana Putra, Nugroho Wicaksono, Adhika Prajna Nandiwardhana, Onang Surya Nugroho, Heri Suryoatmojo, Soedibjo
Abstract:
Access to electricity is now a basic requirement of mankind. Unfortunately, there are still many places around the world with no access to electricity, such as small islands that could potentially host a factory, a plantation, a residential area, or resorts. Many of these places have substantial potential for energy generation from photovoltaics (PV) and wind turbines (WT), which can be used to generate electricity independently. Solar energy and wind power are renewable energy sources found widely in nature, and the technologies for them are developing at a rapid pace to help meet the demand for electricity. The power output of PV and WT depends on the solar irradiation and wind speed characteristic of the geographical area. This paper presents a control methodology for a small-scale hybrid PV/wind energy system that uses a fuzzy logic controller (FLC) for maximum power point tracking (MPPT) under different solar irradiation and wind speed conditions. The paper discusses the simulation and analysis of the generation process of the hybrid resources at the MPP and the power conditioning unit (PCU) of the PV and WT system connected to the three-phase low-voltage electricity grid (380 V) without battery storage. The capacities of the sources used are 2.2 kWp for the PV and 2.5 kW for the PMSG (permanent magnet synchronous generator) wind turbine. The modeling of the hybrid PV/wind system, as well as the integrated power electronics components in the grid-connected system, is simulated using MATLAB/Simulink.Keywords: fuzzy MPPT, grid connected inverter, photovoltaic (PV), PMSG wind turbine
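A fuzzy MPPT typically hill-climbs toward the point where dP/dV = 0, with the fuzzy rule base setting the step size from how far the operating point appears to be from the peak. The sketch below is a minimal illustration of that idea, not the authors' MATLAB/Simulink controller: the triangular membership functions, the set ranges, and the per-set step magnitudes are all assumed values.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_mppt_step(dP, dV, max_step=1.0):
    """Return the next perturbation of the PV operating voltage.

    Sign follows hill-climbing logic (move toward rising power);
    magnitude is a fuzzy-weighted average over small/medium/large
    step sets defined on |dP/dV| (assumed ranges, for illustration).
    """
    slope = dP / dV if dV != 0 else 0.0
    s = abs(slope)
    mu_small = tri(s, -1.0, 0.0, 5.0)
    mu_med = tri(s, 0.0, 5.0, 20.0)
    mu_large = min(1.0, max(0.0, (s - 5.0) / 15.0))
    # defuzzify: weighted average of the per-set step magnitudes
    num = mu_small * 0.1 + mu_med * 0.5 + mu_large * 1.0
    den = mu_small + mu_med + mu_large
    mag = max_step * (num / den if den else 0.1)
    direction = 1.0 if slope >= 0 else -1.0
    return direction * mag
```

Far from the peak the controller takes large steps, and near the peak (small |dP/dV|) the step shrinks, which reduces the steady-state oscillation that plagues fixed-step perturb-and-observe.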
Procedia PDF Downloads 357
16973 Optimized Text Summarization Model on Mobile Screens for Sight-Interpreters: An Empirical Study
Authors: Jianhua Wang
Abstract:
To obtain key information quickly from long texts on the small screens of mobile devices, sight-interpreters need an optimized summarization model for fast information retrieval. Four summarization models based on previous studies were examined: title + key words (TKW), title + topic sentences (TTS), key words + topic sentences (KWTS), and title + key words + topic sentences (TKWTS). Psychological experiments were conducted on the four models for three different genres of interpreting texts to establish the optimized summarization model for sight-interpreters. This empirical study shows that the optimized summarization model for sight-interpreters to quickly grasp the key information of the texts they interpret is title + key words (TKW) for cultural texts, title + key words + topic sentences (TKWTS) for economic texts, and key words + topic sentences (KWTS) for political texts.Keywords: different genres, mobile screens, optimized summarization models, sight-interpreters
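To make the TKW layout concrete, here is an illustrative sketch of assembling a title-plus-keywords summary. The abstract does not say how keywords were extracted, so frequency counting over non-stopword tokens is an assumption made purely for demonstration, as is the tiny stopword list.

```python
import re
from collections import Counter

# Assumed minimal stopword list for illustration only.
STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "for", "on"}

def tkw_summary(title, body, k=5):
    """Build a TKW (title + key words) summary of a text.

    Keywords are approximated here as the k most frequent non-stopword
    tokens longer than two characters.
    """
    tokens = [t for t in re.findall(r"[a-z]+", body.lower())
              if t not in STOPWORDS and len(t) > 2]
    keywords = [w for w, _ in Counter(tokens).most_common(k)]
    return {"title": title, "keywords": keywords}
```

The other models in the study (TTS, KWTS, TKWTS) differ only in which of the three components (title, keywords, topic sentences) are shown on the screen.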
Procedia PDF Downloads 319
16972 Overcoming the Impacts of Covid-19 Outbreak Using Value Integrated Project Delivery Model
Authors: G. Ramya
Abstract:
Value engineering is a systematic approach, widely used to optimize a design, process, or product in the design stage. It is used to meet the client's obligations by increasing functionality and attaining the targeted cost in the cost plan. The effectiveness and benefits of value engineering decrease as the project progresses, since changes in the scope of work and design account for more cost over the lifecycle of the project. Integrating value engineering with other project management activities promotes cost minimization and client satisfaction and ensures early completion of the project. Previous research suggested that value engineering can be integrated with other project delivery activities, but those studies were unable to frame a model that collaborates the project management activities with the job plan of the value engineering approach. I analyzed various project management activities and the synergy between them. Project management activities and processes such as a) risk analysis, b) lifecycle cost analysis, c) lean construction, d) facility management, e) building information modelling, and f) contract administration were collaborated, and a project delivery model was planned along the RIBA Plan of Work. The key outcome of the research is a value-driven project delivery model that can deal with the economic impact, constraints, and conflicts arising from the COVID-19 outbreak in the Indian construction sector. Benefits associated with the structured framework, including early contractor involvement, mutual risk sharing, and bringing a project with cost overrun and delay back on track, are discussed.Keywords: value-driven project delivery model, integration, RIBA plan of work
Procedia PDF Downloads 123
16971 Model for Calculating Traffic Mass and Deceleration Delays Based on Traffic Field Theory
Authors: Liu Canqi, Zeng Junsheng
Abstract:
This study identifies two typical bottlenecks that occur when a vehicle cannot change lanes: car following and car stopping. The ideas of the traffic field and traffic mass are presented. When there are other vehicles within a certain distance in front of the target vehicle, a force is created that affects the target vehicle's driving speed. The traffic mass is determined jointly by the characteristics of the driver and the vehicle; the driving speed of the vehicle and external variables have no bearing on it. From a physical standpoint, this study examines the vehicle's car-following bottleneck, identifies the external factors that affect how it drives, accounts for the vehicle converting kinetic energy into potential energy during deceleration, and builds a calculation model for traffic mass. From an economic standpoint, the energy-time conversion coefficient is derived from the social average wage level and the average cost of motor fuel. The Vissim simulation program is used to measure the vehicle's deceleration distance and delay under the Wiedemann car-following model. Using the conversion model between traffic mass and deceleration delay, the theoretical value of the deceleration delay calculated by the model is compared with the value measured in simulation. The experimental data demonstrate that the model is reliable, since the error rate between the theoretical and the measured deceleration delays is less than 10%. The article concludes that the traffic field affects moving cars on the road and that physical and socioeconomic factors should be taken into account when studying car-following behavior. The socioeconomic relationship between a vehicle's deceleration delay and its traffic mass can be used to calculate the energy-time conversion coefficient when dealing with the stop-and-start bottleneck.Keywords: traffic field, social economics, traffic mass, bottleneck, deceleration delay
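The two economic quantities in the abstract can be illustrated with a simple sketch. This is not the authors' model: the constant-(de)acceleration slow-down profile and the wage/fuel pricing below are assumed values chosen only to show how a deceleration delay and an energy-time conversion coefficient might be computed.

```python
def deceleration_delay(v_free, v_min, decel, accel):
    """Time lost, relative to cruising at v_free, during a slow-down
    from v_free to v_min and back, assuming constant (de)acceleration.
    Speeds in m/s, accelerations in m/s^2; returns seconds."""
    t_dec = (v_free - v_min) / decel
    t_acc = (v_free - v_min) / accel
    # distance actually covered during the maneuver (trapezoidal profile)
    d = 0.5 * (v_free + v_min) * (t_dec + t_acc)
    t_free = d / v_free            # time to cover the same distance at v_free
    return (t_dec + t_acc) - t_free

def energy_time_coefficient(hourly_wage, fuel_price_per_joule):
    """Joules 'worth' one second of travel time, under the assumption
    that time is valued at the average wage and energy at fuel cost."""
    wage_per_second = hourly_wage / 3600.0
    return wage_per_second / fuel_price_per_joule
```

For example, slowing from 20 m/s to 10 m/s at 2 m/s^2 and recovering at 1 m/s^2 loses 3.75 s relative to free flow; such a delay, priced through the coefficient, is what the paper compares against the energy the maneuver dissipates.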
Procedia PDF Downloads 71