Search results for: robust model predictive control
21099 The 5-HT1A Receptor Biased Agonists, NLX-101 and NLX-204, Elicit Rapid-Acting Antidepressant Activity in Rat Similar to Ketamine and via GABAergic Mechanisms
Authors: A. Newman-Tancredi, R. Depoortère, P. Gruca, E. Litwa, M. Lason, M. Papp
Abstract:
The N-methyl-D-aspartic acid (NMDA) receptor antagonist, ketamine, can elicit rapid-acting antidepressant (RAAD) effects in treatment-resistant patients, but it requires parenteral co-administration with a classical antidepressant under medical supervision. In addition, ketamine can also produce serious side effects that limit its long-term use, and there is much interest in identifying RAADs based on ketamine’s mechanism of action but with safer profiles. Ketamine elicits GABAergic interneuron inhibition, glutamatergic neuron stimulation, and, notably, activation of serotonin 5-HT1A receptors in the prefrontal cortex (PFC). Direct activation of the latter receptor subpopulation with selective ‘biased agonists’ may therefore be a promising strategy to identify novel RAADs and, consistent with this hypothesis, the prototypical cortical biased agonist, NLX-101, exhibited robust RAAD-like activity in the chronic mild stress model of depression (CMS). The present study compared the effects of a novel, selective 5-HT1A receptor-biased agonist, NLX-204, with those of ketamine and NLX-101. Materials and methods: The CMS procedure was conducted on Wistar rats; drugs were administered either intraperitoneally (i.p.) or by bilateral intracortical microinjection. Ketamine: 10 mg/kg i.p. or 10 µg/side in PFC; NLX-204 and NLX-101: 0.08 and 0.16 mg/kg i.p. or 16 µg/side in PFC. In addition, interaction studies were carried out with systemic NLX-204 or NLX-101 (each at 0.16 mg/kg i.p.) in combination with intracortical WAY-100635 (selective 5-HT1A receptor antagonist; 2 µg/side) or muscimol (GABA-A receptor agonist, 12.5 ng/side). Anhedonia was assessed by CMS-induced decrease in sucrose solution consumption; anxiety-like behavior was assessed using the Elevated Plus Maze (EPM), and cognitive impairment was assessed by the Novel Object Recognition (NOR) test. Results: A single administration of NLX-204 was sufficient to reverse the CMS-induced deficit in sucrose consumption, similarly to ketamine and NLX-101. NLX-204 also reduced CMS-induced anxiety in the EPM and abolished CMS-induced NOR deficits. These effects were maintained (EPM and NOR) or enhanced (sucrose consumption) over a subsequent 2-week period of treatment. The anti-anhedonic response of the drugs was also maintained for several weeks following treatment discontinuation, suggesting that they had sustained effects on neuronal networks. A single PFC administration of NLX-204 reversed deficient sucrose consumption, similarly to ketamine and NLX-101. Moreover, the anti-anhedonic activities of systemic NLX-204 and NLX-101 were abolished by co-administration with intracortical WAY-100635 or muscimol. Conclusions: (i) The antidepressant-like activity of NLX-204 in the rat CMS model was as rapid as that of ketamine or NLX-101, supporting the targeting of cortical 5-HT1A receptors with selective, biased agonists to achieve RAAD effects. (ii) The anti-anhedonic activity of systemic NLX-204 was mimicked by local administration of the compound in the PFC, confirming the involvement of cortical circuits in its RAAD-like effects. (iii) Notably, the effects of systemic NLX-204 and NLX-101 were abolished by PFC administration of muscimol, indicating that they act by (indirectly) eliciting a reduction in cortical GABAergic neurotransmission. This is consistent with ketamine’s mechanism of action and suggests that there are converging NMDA and 5-HT1A receptor signaling cascades in PFC underlying the RAAD-like activities of ketamine and NLX-204.
Acknowledgements: The study was financially supported by NCN grant no. 2019/35/B/NZ7/00787.
Keywords: depression, ketamine, serotonin, 5-HT1A receptor, chronic mild stress
Procedia PDF Downloads 113
21098 Off-Topic Text Detection System Using a Hybrid Model
Authors: Usama Shahid
Abstract:
Be it written documents, news columns, or students' essays, verifying the content can be a time-consuming task. Apart from spelling and grammar mistakes, the proofreader is also supposed to verify whether the content included in the essay or document is relevant or not. The irrelevant content in any document or essay is referred to as off-topic text, and in this paper, we address the problem of off-topic text detection from a document using machine learning techniques. Our study aims to identify off-topic content in a document using an Echo State Network model, and we also compare the results with other models. A previous study used Convolutional Neural Networks and TF-IDF to detect off-topic text. We rearrange the existing datasets and take new classifiers along with new word embeddings and implement them on existing and new datasets in order to compare the results with the previously existing CNN model.
Keywords: off-topic, text detection, echo state network, machine learning
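Since the abstract names an Echo State Network but gives no implementation details, the following is a minimal sketch of an ESN reservoir with a ridge-regression readout for a binary on-/off-topic label; the reservoir size, spectral radius, leak rate, and the use of per-sentence embedding vectors as inputs are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 100, 500                       # assumed input (embedding) size and reservoir size

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()  # scale spectral radius below 1 (echo state property)

def reservoir_states(X, leak=0.3):
    """Run a sequence of input vectors (e.g. one embedding per sentence) through the reservoir."""
    x = np.zeros(n_res)
    states = []
    for u in X:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.asarray(states)

def train_readout(final_states, labels, ridge=1e-2):
    """Ridge-regression readout from each document's final reservoir state to its label."""
    S = np.asarray(final_states)             # shape (n_docs, n_res)
    y = np.asarray(labels, dtype=float)      # 1 = off-topic, 0 = on-topic
    return np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)

# Random data standing in for the sentence embeddings of one document.
doc = rng.normal(size=(20, n_in))
final_state = reservoir_states(doc)[-1]
```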
Procedia PDF Downloads 85
21097 Model-Based Field Extraction from Different Class of Administrative Documents
Authors: Jinen Daghrir, Anis Kricha, Karim Kalti
Abstract:
The amount of incoming administrative documents is massive, and manually processing these documents is a costly task, especially on the timescale. In fact, this problem has led to an important amount of research and development in the context of automatically extracting fields from administrative documents, in order to reduce costs and to increase citizen satisfaction with administrations. In this matter, we introduce an administrative document understanding system. Given a document in which a user has selected the fields to be retrieved for a document class, a document model is automatically built. A document model is represented by an attributed relational graph (ARG) where nodes represent fields to extract, and edges represent the relations between them. Both vertices and edges are attached to feature vectors. When another document arrives in the system, the layout objects are extracted and an ARG is generated. Field extraction is then translated into a problem of matching two ARGs, which relies mainly on the comparison of the spatial relationships between layout objects. Experimental results yield accuracy rates from 75% to 100% tested on eight document classes. Our proposed method has good performance considering that the document model is constructed using only a single document.
Keywords: administrative document understanding, logical labelling, logical layout analysis, fields extraction from administrative documents
Procedia PDF Downloads 213
21096 Thermal and Mechanical Finite Element Analysis of a Mineral Casting Machine Frame
Abstract:
Thermal distortion of the machine tool plays a critical role in its machining accuracy. This study investigates the thermal performance of a high-precision machine frame with future-oriented mineral casting components. A thermo-mechanical finite element model (FEM) was established to evaluate the thermal behavior of the frame under environmental thermal fluctuations. The validity of the presented FE model was confirmed experimentally by a series of laser interferometer tests. Good agreement between numerical and experimental results demonstrates that the proposed model can accurately predict the thermal deformation of the frame with the thermo-mechanical coupling effect. The results also show that keeping the workshop in thermally stable conditions is crucial for improving the machining accuracy of a system with large-scale components. The goal of this paper is to investigate the feasibility of innovative mineral casting material applied in a high-precision drilling machine and to provide a strategy for the machine tool industry seeking a perfect substitute for classic frame materials such as cast iron and granite.
Keywords: thermo-mechanical model, finite element method, laser interferometer, mineral casting frame
Procedia PDF Downloads 303
21095 Training for Digital Manufacturing: A Multilevel Teaching Model
Authors: Luís Rocha, Adam Gąska, Enrico Savio, Michael Marxer, Christoph Battaglia
Abstract:
The changes observed in recent years in the field of manufacturing and production engineering, popularly known as the "Fourth Industrial Revolution", utilize the achievements in different areas of computer science, introducing new solutions at almost every stage of the production process, just to mention such concepts as mass customization, cloud computing, knowledge-based engineering, virtual reality, rapid prototyping, and virtual models of measuring systems. To effectively speed up the production process and make it more flexible, it is necessary to tighten the bonds connecting individual stages of the production process and to raise the awareness and knowledge of employees of individual sectors about the nature and specificity of work in other stages. Discovering and developing a suitable education method adapted to the specificities of each stage of the production process thus becomes a crucial issue for properly exploiting the potential of the Fourth Industrial Revolution. For this reason, the project “Train4Dim” (T4D) intends to develop comprehensive training material for digital manufacturing, including content for design, manufacturing, and quality control, with a focus on coordinate metrology and portable measuring systems. In this paper, the authors present an approach to using an active learning methodology for digital manufacturing. The T4D main objective is to develop a multi-degree (apprenticeship up to master’s degree studies) educational approach that can be adapted to different teaching levels. The process of creating the underlying methodology is also described. The paper shares the steps taken to achieve the aims of the project (a training model for digital manufacturing): 1) surveying the stakeholders, 2) defining the learning aims, 3) producing all contents and curriculum, 4) training the tutors, and 5) piloting, testing, and improving the courses.
Keywords: learning, Industry 4.0, active learning, digital manufacturing
Procedia PDF Downloads 97
21094 Broccoli Sprouts Powder Could Improve Metabolic and Liver Disorder-Induced by High-Fructose Corn Syrup
Authors: Zahra Bahadoran, Parvin Mirmiran, Hanieh-Sadat Ejtahed, Maryam Tohidi, Fereidoun Azizi
Abstract:
Background and Aim: Broccoli sprouts, a rich source of bioactive compounds, especially sulforaphane (SFN), have unique functional properties. This study was conducted to investigate the possible treatment effects of high-SFN broccoli sprouts powder (BSP) on metabolic and liver disorders in rats fed high-fructose corn syrup. Methods: Thirty-two male Wistar rats, pretreated with an eight-week high-fructose diet (water containing 30% fructose), were randomly allocated into three groups: baseline control (BC), control (C) (normal diet), and BSP diet (normal diet + 5% BSP). The duration of the study was 6 weeks. Biochemical measurements, liver weight, and triglyceride content were evaluated, and histopathological examination of the liver was performed. Results: After 6 weeks, the liver weight was significantly lower in the BSP group compared to controls (13.4 g vs. 11.4 g, P<0.05). After 6 weeks, a significant decrease was observed in fasting serum glucose, total cholesterol, and triglyceride levels in both experimental groups (P<0.05). Compared to controls, serum levels of HDL-C were significantly higher in the BSP group. The liver TG content in the BSP group compared to the control group was lower (14.6 vs. 16.4 mg/mg tissue). The hepatic levels of alanine aminotransferase, aspartate aminotransferase, and γ-glutamyl transferase showed no considerable changes in the groups after the intervention period, but the level of alkaline phosphatase significantly decreased in the BSP group (P<0.05). The histopathological examination of the liver confirmed a decrease in lobular and portal inflammation and ballooning in the BSP group compared to controls. Conclusion: High-SFN broccoli sprouts powder has beneficial effects on metabolic and liver changes induced by high-fructose corn syrup.
Keywords: broccoli sprouts, metabolic disorders, fatty liver, food science
Procedia PDF Downloads 424
21093 Calibration of Hybrid Model and Arbitrage-Free Implied Volatility Surface
Authors: Kun Huang
Abstract:
This paper investigates whether the combination of local and stochastic volatility models can be calibrated exactly to any arbitrage-free implied volatility surface of European options. The risk-neutral Brownian bridge density is applied for calibration of the leverage function of our hybrid model. Furthermore, the tails of the marginal risk-neutral density are generated by the Generalized Extreme Value distribution in order to capture the properties of asset returns. The local volatility is generated from the arbitrage-free implied volatility surface using the stochastic volatility inspired (SVI) parameterization.
Keywords: arbitrage-free implied volatility, calibration, extreme value distribution, hybrid model, local volatility, risk-neutral density, stochastic volatility
Procedia PDF Downloads 267
21092 A Filtering Algorithm for a Nonlinear State-Space Model
Authors: Abdullah Eqal Al Mazrooei
Abstract:
The Kalman filter is a famous algorithm used to estimate the state in linear systems. It has numerous applications in technology and science. However, most real-life applications are described by nonlinear systems, for which the Kalman filter does not work since it is suitable for linear systems only. In this work, a nonlinear filtering algorithm is presented which is suitable for use with special kinds of nonlinear systems. This filter generalizes the Kalman filter, which means it can also be used for linear systems. Our algorithm depends on a special linearization of the second degree. We introduce the nonlinear algorithm with a bilinear state-space model. A simulation example is presented to illustrate the efficiency of the algorithm.
Keywords: Kalman filter, filtering algorithm, nonlinear systems, state-space model
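For reference, a minimal sketch of the standard linear Kalman filter predict/update step that the proposed filter generalizes is shown below; the matrix names and shapes are generic assumptions, and the paper's second-degree linearization for bilinear models is not reproduced here.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of the standard (linear) Kalman filter.
    x, P : prior state estimate and covariance
    z    : new measurement
    F, H : state-transition and observation matrices
    Q, R : process and measurement noise covariances
    """
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```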
Procedia PDF Downloads 376
21091 Optimized and Secured Digital Watermarking Using Fuzzy Entropy, Bezier Curve and Visual Cryptography
Authors: R. Rama Kishore, Sunesh
Abstract:
Recent developments in the usage of the internet for different purposes create a great threat to the copyright protection of digital images. Digital watermarking can be used to address the problem. This paper presents a detailed review of the different watermarking techniques and the latest trends in the field of secured, robust, and imperceptible watermarking. It also discusses the different optimization techniques used in the field of watermarking in order to improve the robustness and imperceptibility of the method. Different measures are discussed to evaluate the performance of the watermarking algorithm. At the end, this paper proposes a watermarking algorithm using (2, 2) share visual cryptography and a Bezier curve based algorithm to improve the security of the watermark. The proposed method uses a fractional transform to improve the robustness of its copyright protection. The algorithm is optimized using fuzzy entropy for better results.
Keywords: digital watermarking, fractional transform, visual cryptography, Bezier curve, fuzzy entropy
Procedia PDF Downloads 366
21090 When You Change The Business Model ~ You Change The World
Authors: H. E. Amb. Terry Earthwind Nichols
Abstract:
Over the years, Ambassador Nichols observed that successful companies all have one thing in common - belief in people. His observations of people in many companies, industries, and countries have also led to one conclusion - groups of achievers far exceed the expectations and timelines of their superiors. His experience with achieving this has brought forth a model for the 21st century that will not only exceed the expectations of companies, but will also set visions for the future of business globally. It is time for real discussion around the future of work and the business model that will set the example for the world. Methodologies: in-person observations over 40 years, with Ambassador Nichols present during the observations; audio-visual observations from TV, cinema, social media (YouTube, etc.), and various news outlets; and reading the autobiographies of successful leaders over the last 75 years who led their companies from a distinct perspective - your people are your commodity. Major findings: people who believe in the leader's vision for the company remain excited about the future of the company and want to do anything in their power to ethically achieve that vision. People who are achieving regularly in groups, divisions, companies, etcetera, live more healthfully, lowering both sick time off and on-the-job accidents; cannot wait to physically get to work to feed off the high energy present in these companies; and are fully respected and supported, resulting in near-zero attrition. Simply put - they do not "burn out". Conclusion: To the author's best knowledge, 20th-century practices in business are no longer valid, and people are not going to work in those environments any longer. The average worker in the post-COVID world is better educated than 50 years ago and, most importantly, has real-time information about any subject and can stream injustices as they happen. The Consortium Model is just the model for the evolution of both humankind and business in the 21st century.
Keywords: business model, future of work, people, paradigm shift, business management
Procedia PDF Downloads 78
21089 Impact of Green Bonds Issuance on Stock Prices: An Event Study on Respective Indian Companies
Authors: S. L. Tulasi Devi, Shivam Azad
Abstract:
The primary objective of this study is to analyze the impact of green bond issuance on the stock prices of the respective Indian companies. An event study methodology has been employed to study the effect of green bond issuance. For in-depth study and analysis, this paper used different window frames, including 15-15 days, 10-10 days, 7-7 days, 6-6 days, and 5-5 days. Further, for better clarity, this paper also used an uneven window period of 7-5 days. The period of study covered all the companies which issued green bonds during 2017-2022: Adani Green Energy, State Bank of India, Power Finance Corporation, Jain Irrigation, and Rural Electrification Corporation, excluding Indian Renewable Energy Development Agency and Indian Railway Finance Corporation because of data unavailability. The paper used all three event study methods discussed in the earlier literature: 1) the constant return model, 2) the market-adjusted model, and 3) the capital asset pricing model. For a fruitful comparison between results, the study considered the cumulative average return (CAR) and buy-and-hold average return (BHAR) methodologies. For checking statistical significance, a two-tailed t-statistic has been used. All the statistical calculations have been performed in Microsoft Excel 2016. The study found that all companies showed positive returns on the event day except for the State Bank of India. The results demonstrated that the constant return model outperformed the market-adjusted model and the CAPM. The p-values derived from all the methods showed an almost insignificant impact of the issuance of green bonds on the stock prices of the respective companies. The overall analysis states that there is not much improvement in the market efficiency of the Indian stock markets.
Keywords: green bonds, event study methodology, constant return model, market-adjusted model, CAPM
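As a rough illustration of the constant return model and CAR computation named in the abstract, the sketch below estimates abnormal returns over an event window and a simple two-tailed test statistic; the estimation-window length, event-window sizes, and return series are assumptions, not the study's actual figures.

```python
import numpy as np

def constant_return_car(returns, event_idx, est_len=120, pre=7, post=5):
    """Cumulative abnormal return under the constant (mean) return model.
    returns   : 1-D array of daily stock returns
    event_idx : index of the green bond issuance day
    est_len   : length of the estimation window preceding the event window
    pre, post : event window, e.g. 7 days before to 5 days after the event
    """
    est = returns[event_idx - pre - est_len: event_idx - pre]
    mu, sigma = est.mean(), est.std(ddof=1)
    window = returns[event_idx - pre: event_idx + post + 1]
    ar = window - mu                            # abnormal returns
    car = ar.cumsum()[-1]                       # cumulative abnormal return
    t_stat = car / (sigma * np.sqrt(len(ar)))   # simple two-tailed test statistic
    return car, t_stat
```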
Procedia PDF Downloads 97
21088 Ophthalmic Hashing Based Supervision of Glaucoma and Corneal Disorders Imposed on Deep Graphical Model
Authors: P. S. Jagadeesh Kumar, Yang Yung, Mingmin Pan, Xianpei Li, Wenli Hu
Abstract:
Glaucoma is driven by optic nerve damage, habitually represented as cupping and visual field injury, frequently with an arcuate pattern of mid-peripheral loss, secondary to retinal ganglion cell damage and death. Glaucoma is the second foremost cause of blindness and the chief cause of permanent blindness worldwide. Consequently, extensive study into the analysis and understanding of glaucoma is underway, prompting deep learning based neural network approaches to address this substantial optic neuropathy. This paper advances an ophthalmic hashing based supervision of glaucoma and corneal disorders built on a deep graphical model. Ophthalmic hashing is a newly proposed method extending the efficacy of visual hash-coding to predict glaucoma and corneal disorder matching, and it is faster than existing methods. The deep graphical model is capable of learning interior representations of corneal disorders in satisfactory time, solving hard combinatorial problems using deep Boltzmann machines.
Keywords: corneal disorders, deep Boltzmann machines, deep graphical model, glaucoma, neural networks, ophthalmic hashing
Procedia PDF Downloads 251
21087 Cultural Collisions, Ethics and HIV: On Local Values in a Globalized Medical World
Authors: Norbert W. Paul
Abstract:
In 1988, parts of the scientific community still heralded findings to support that AIDS was likely to remain largely a ‘gay disease’. The value-laden terminology of some of the articles suggested that the rectum and fragile urethra are not sufficiently robust to provide a barrier against infectious fluids, especially body fluids contaminated with HIV, while the female vagina would provide natural protection against injuries and trauma facilitating HIV infection. Anal sexual intercourse was constituted not only as dangerous but also as unnatural practice, while penile-vaginal intercourse would follow natural design and thus be a relatively safe practice minimizing the risk of HIV. Statements like the latter were not uncommon in the early times of HIV/AIDS and contributed to captious certainties and an underestimation of heterosexual risks. Pseudo-scientific discourses on the origin of HIV were linked to local and global health politics in the 1980s. The pathways of infection were related to normative concepts like deviant, subcultural behavior, cultural otherness, and guilt, used to target, tag, and separate specific groups at risk from the ‘normal’ population. Controlling populations at risk became the top item on the agenda rather than controlling modes of transmission and the virus. Hence, the Thai strategy of coping with HIV/AIDS by acknowledging social and sexual practices as they were – not as they were imagined – has become a role model for successful prevention in the highly scandalized realm of sexually transmitted disease. By accepting the globalized character of local HIV risk and projecting the risk onto populations which are neither particularly vocal groups nor vested with the means to strive for health and justice, Thailand managed to culturally implement knowledge-based tools of prevention. This paper argues that pertinent cultural collisions regarding our strategies to cope with HIV/AIDS are deeply rooted in misconceptions, misreadings, and scandalizations brought about in the early history of HIV in the 1980s. The Thai strategy is used to demonstrate how local values can be balanced against globalized health risk and used to effectuate prevention by which knowledge and norms are translated into local practices. Issues of global health and injustice are addressed in the final part of the paper, dealing with the achievability of health as a human right.
Keywords: bioethics, HIV, global health, justice
Procedia PDF Downloads 261
21086 E-Commerce in Jordan: Conceptual Model
Authors: Muneer Abbad
Abstract:
This study presents a comprehensive analysis of specific factors affecting the adoption of e-commerce in Jordan. From the theoretical perspective, this study will make a contribution to e-commerce research by providing insights on the factors that seem to affect e-commerce adoption. The current study will provide managers with information about planning and formulating appropriate strategies to ensure rapid adoption of e-commerce in Jordan. It will offer marketing implications, conclusions, and suggestions for future research.
Keywords: e-commerce, Jordan, adoption, conceptual model
Procedia PDF Downloads 456
21085 Crack Width Analysis of Reinforced Concrete Members under Shrinkage Effect by Pseudo-Discrete Crack Model
Authors: F. J. Ma, A. K. H. Kwan
Abstract:
Cracking caused by shrinkage movement of concrete is a serious problem, especially when restraint is provided. It may cause severe serviceability and durability problems. The existing prediction methods for crack width of concrete due to shrinkage movement are mainly numerical methods under simplified circumstances, which do not agree with each other. To obtain a more unified prediction method applicable to more sophisticated circumstances, finite element crack width analysis for the shrinkage effect should be developed. However, no existing finite element analysis can be carried out to predict the crack width of concrete due to shrinkage movement because of unresolved limitations of conventional finite element analysis. In this paper, crack width analysis implemented by finite element analysis is presented with a pseudo-discrete crack model, which combines the traditional smeared crack model and a newly proposed crack queuing algorithm. The proposed pseudo-discrete crack model is capable of simulating separate, individual cracks without adopting discrete crack elements. The improved finite element analysis can successfully simulate the stress redistribution when concrete is cracked, which is crucial for predicting crack width, crack spacing, and crack number.
Keywords: crack queuing algorithm, crack width analysis, finite element analysis, shrinkage effect
Procedia PDF Downloads 419
21084 Pharmacovigilance: An Empowerment in Safe Utilization of Pharmaceuticals
Authors: Pankaj Prashar, Bimlesh Kumar, Ankita Sood, Anamika Gautam
Abstract:
Pharmacovigilance (PV) is a rapidly growing discipline in pharmaceutical industries as an integral part of clinical research and drug development over the past few decades. PV carries a breadth of scope, from drug manufacturing to regulation and safer utilization. The fundamental steps of PV not only include data collection and verification, coding of drugs with adverse drug reactions, causality assessment, and timely reporting to the authorities, but also monitoring of drug manufacturing, safety issues, and product quality, and conduction of due diligence. Standardization of adverse event information, collaboration of multiple departments in different companies, and preparation of documents in accordance with both governmental and non-governmental organizations (FDA, EMA, GVP, ICH) are the advancements in the discipline of PV. De-harmonization, lack of predictive drug safety models, improper funding by government, non-reporting and non-acceptability of ADRs by developing countries, and reports submitted directly from patients to the monitoring centres are the major roadblocks of PV. Mandatory pharmacovigilance reporting, frequent inspections, funding by government, and educating and training medical students, pharmacists, and nurses in this segment can bring about empowerment in PV. This area needs to be addressed with a sense of urgency for the safe utilization of pharmaceuticals.
Keywords: pharmacovigilance, regulatory, adverse event, drug safety
Procedia PDF Downloads 124
21083 Impact of an Onboard Fire for the Evacuation of a Rolling Stock
Authors: Guillaume Craveur
Abstract:
This study highlights the impact of an onboard fire on the evacuation of a rolling stock. Two fire models are developed. The first one is a zone model realized with the CFAST software. This fire is then imported into a building EXODUS model in order to determine the evacuation time with the effects of fire effluents (temperature, smoke opacity, smoke toxicity) on passengers. The second fire is achieved with the Fire Dynamics Simulator software. The fire defined is directly imported into the FDS+Evac model, which permits determining the evacuation time and the effects of fire effluents on passengers. These effects are compared with tenability criteria defined in standards in order to see if the situation is acceptable. Different fire powers are examined to see from what power the hazard becomes unacceptable.
Keywords: fire safety engineering, numerical tools, rolling stock, evacuation
Procedia PDF Downloads 201
21082 The Relationship Between Hourly Compensation and Unemployment Rate Using the Panel Data Regression Analysis
Authors: S. K. Ashiquer Rahman
Abstract:
The paper concentrates on the importance of hourly compensation, emphasizing the significance of the unemployment rate. Two of the most important factors for a nation are its unemployment rate and hourly compensation. These are not merely statistics; they have profound effects on individuals, families, and the economy, and they are inversely related to one another. The unemployment rate will probably decline as hourly compensation in manufacturing rises, since reduced unemployment rates and increased job prospects can result from higher compensation. That is why increased hourly compensation in the manufacturing sector could have a favorable effect on job-changing issues. Moreover, the relationship between hourly compensation and unemployment is complex and influenced by broader economic factors. In this paper, we use panel data regression models to evaluate the expected link between hourly compensation and the unemployment rate in order to determine the effect of hourly compensation on the unemployment rate. We estimate the fixed effects model, evaluate the error components, and determine which model (the FEM or ECM) is better by pooling all 60 observations. We then analyze and review the data by comparing three countries (the United States, Canada, and the United Kingdom) using panel data regression models. Finally, we provide results, analysis, and a summary of the extensive research on how hourly compensation affects the unemployment rate. Additionally, this paper offers relevant and useful information to help the government and academic community use an econometric and social approach to lessen the effect of hourly compensation on the unemployment rate and eliminate the problem.
Keywords: hourly compensation, unemployment rate, panel data regression models, dummy variables, random effects model, fixed effects model, the linear regression model
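A minimal sketch of a fixed-effects (least-squares dummy variable) panel regression of the unemployment rate on hourly compensation is given below; the file name and column names are assumptions, not the authors' dataset.

```python
import pandas as pd
import statsmodels.api as sm

# Illustrative fixed-effects (LSDV) estimation; "compensation_panel.csv" and its
# column names (country, year, unemployment, hourly_comp) are assumed placeholders.
df = pd.read_csv("compensation_panel.csv")

dummies = pd.get_dummies(df["country"], drop_first=True, dtype=float)   # entity (country) dummies
X = sm.add_constant(pd.concat([df[["hourly_comp"]], dummies], axis=1))
fe_model = sm.OLS(df["unemployment"], X).fit()
print(fe_model.summary())   # the coefficient on hourly_comp is the fixed-effects estimate
```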
Procedia PDF Downloads 81
21081 Social Enterprise Concept in Sustaining Agro-Industry Development in Indonesia: Case Study of Yourgood Social Business
Authors: Koko Iwan Agus Kurniawan, Dwi Purnomo, Anas Bunyamin, Arif Rahman Jaya
Abstract:
The Fruters model is a concept of technopreneurship based on empowerment, in which technology research results were designed to create high value-added products and implemented as a locomotive of collaborative empowerment, so that the impact was widely spread. This model still needs to be inventoried and validated with respect to the variables that influence the business growth process. Model validation accompanied by mapping was required to make it applicable to small and medium enterprise (SME) agro-industry based on sustainable social business and existing real cases. This research explains the empowerment model of Yourgood, an SME that emphasizes empowering farmers/breeders in rural areas (Cipageran, Cimahi) through to housewives in urban areas (Bandung), West Java, Indonesia. This research reviewed works of literature discussing agro-industrial development associated with the empowerment and social business process and obtained a unique business model picture with the social business platform as well. Through the mapped business model, there were several advantages, such as technology acquisition, independence, capital generation, good investment growth, strengthening of collaboration, and improvement of social impacts that can be replicated in other businesses. This research used an analytical-descriptive research method consisting of qualitative analysis with a design thinking approach and quantitative analysis with the AHP (Analytic Hierarchy Process). Based on the results, the development of the enterprise's process was most highly affected by supplying farmers, with a score of 0.248 out of 1, being the most valuable for the existence of the enterprise. It was followed by university (0.178), supplying farmers (0.153), business actors (0.128), government (0.100), distributor (0.092), techno-preneurship laboratory (0.069), banking (0.033), and Non-Government Organization (NGO) (0.031).
Keywords: agro-industry, small medium enterprises, empowerment, design thinking, AHP, business model canvas, social business
Procedia PDF Downloads 169
21080 Endocardial Ultrasound Segmentation using Level Set method
Authors: Daoudi Abdelaziz, Mahmoudi Saïd, Chikh Mohamed Amine
Abstract:
This paper presents a fully automatic segmentation method for the left ventricle at end systole (ES) and end diastole (ED) in ultrasound images by means of an implicit deformable model (level set) based on the Geodesic Active Contour model. A pre-processing Gaussian smoothing stage is applied to the image, which is essential for a good segmentation. Before the segmentation phase, we automatically locate the area of the left ventricle by using a detection approach based on the Hough transform method. The result obtained is then used to automate the initialization of the level set model. This initial curve (zero level set) deforms to search for the endocardial border in the image. Quantitative evaluation was performed on a data set composed of 15 subjects with a comparison to ground truth (manual segmentation).
Keywords: level set method, Hough transform, Gaussian smoothing, left ventricle, ultrasound images
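For reference, the level-set evolution of the Geodesic Active Contour model used for the deformation step is commonly written as follows (standard textbook form, not taken verbatim from the paper):

```latex
\frac{\partial \phi}{\partial t}
  = g(I)\,|\nabla\phi|\left(\operatorname{div}\!\left(\frac{\nabla\phi}{|\nabla\phi|}\right) + c\right)
    + \nabla g(I)\cdot\nabla\phi,
\qquad
g(I) = \frac{1}{1 + |\nabla (G_\sigma * I)|^{2}},
```

where φ is the level-set function whose zero level set is the evolving contour, g(I) is an edge-stopping function computed on the Gaussian-smoothed image, and c is a constant balloon force pushing the contour toward the endocardial border.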
Procedia PDF Downloads 465
21079 Finite Element Simulation of an Offshore Monopile Subjected to Cyclic Loading Using Hypoplasticity with Intergranular Strain Anisotropy (ISA) for the Soil
Authors: William Fuentes, Melany Gil
Abstract:
Numerical simulations of offshore wind turbines (OWTs) in shallow waters demand sophisticated models considering the cyclic nature of the environmental loads. For the case of an OWT founded on sands, rapid loading may cause a reduction of the effective stress of the soil surrounding the structure. This eventually leads to its settlement, tilting, or other issues affecting its serviceability. In this work, a 3D FE model of an OWT founded on sand is constructed and analyzed. Cyclic loading with different histories is applied at certain points of the tower to simulate some environmental forces. The mechanical behavior of the soil is simulated through the recently proposed ISA-hypoplastic model for sands. The Intergranular Strain Anisotropy (ISA) can be interpreted as an enhancement of the intergranular strain theory, often used to extend hypoplastic formulations for the simulation of cyclic loading. In contrast to previous formulations, the proposed constitutive model introduces an elastic range for small strain amplitudes, includes the cyclic mobility effect, and is able to capture the cyclic behavior of sands under a larger number of cycles. The model performance is carefully evaluated in the FE dynamic analysis of the OWT.
Keywords: offshore wind turbine, monopile, ISA, hypoplasticity
Procedia PDF Downloads 246
21078 A Comparative Analysis of Machine Learning Techniques for PM10 Forecasting in Vilnius
Authors: Mina Adel Shokry Fahim, Jūratė Sužiedelytė Visockienė
Abstract:
With the growing concern over air pollution (AP), it is clear that the issue has gained more prominence than ever before. Awareness has increased, and knowledge now has to be passed on as a duty by those informed enough to disseminate it to others. This realisation often comes after an understanding of how poor air quality indices (AQI) damage human health. The study focuses on assessing air pollution prediction models specifically for Lithuania, addressing a substantial need for empirical research within the region. Concentrating on Vilnius, it specifically examines concentrations of particulate matter 10 micrometers or less in diameter (PM10). Utilizing Gaussian Process Regression (GPR), Regression Tree Ensemble, and Regression Tree methodologies, predictive forecasting models are validated and tested using hourly data from January 2020 to December 2022. The study explores the classification of AP data into anthropogenic and natural sources, the impact of AP on human health, and its connection to cardiovascular diseases. The study revealed varying levels of accuracy among the models, with GPR achieving the highest accuracy, indicated by an RMSE of 4.14 in validation and 3.89 in testing.
Keywords: air pollution, anthropogenic and natural sources, machine learning, Gaussian process regression, tree ensemble, forecasting models, particulate matter
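As an illustration of the GPR approach reported as most accurate, a minimal scikit-learn sketch is given below; the features, kernel choice, and placeholder data are assumptions rather than the study's configuration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import train_test_split

# Placeholder data standing in for hourly meteorological features and PM10 targets.
X = np.random.rand(500, 4)        # e.g. wind speed, temperature, humidity, previous-hour PM10 (assumed features)
y = np.random.rand(500)           # hourly PM10 concentration

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

rmse = np.sqrt(np.mean((gpr.predict(X_test) - y_test) ** 2))   # RMSE, the metric quoted in the abstract
print(f"test RMSE: {rmse:.2f}")
```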
Procedia PDF Downloads 53
21077 Unveiling Special Policy Regime, Judgment, and Taylor Rules in Tunisia
Authors: Yosra Baaziz, Moez Labidi
Abstract:
Given limited research on monetary policy rules in revolutionary countries, this paper challenges the suitability of the Taylor rule in characterizing the monetary policy behavior of the Tunisian Central Bank (BCT), especially in turbulent times. More specifically, we investigate the possibility that the Taylor rule should be formulated as a threshold process and examine the validity of such a nonlinear Taylor rule as a robust rule for conducting monetary policy in Tunisia. Using quarterly data from 1998:Q4 to 2013:Q4 to analyze the movement of the nominal short-term interest rate of the BCT, we find that the nonlinear Taylor rule improves its performance with the advent of special events, thus providing a better description of the Tunisian interest rate setting. In particular, our results show that the adoption of an appropriate nonlinear approach leads to a reduction in the errors of 150 basis points in 1999 and 2009, and 60 basis points in 2011, relative to the linear approach.
Keywords: policy rule, central bank, exchange rate, Taylor rule, nonlinearity
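For reference, the baseline (linear) Taylor rule that the paper extends to a threshold process is usually written as:

```latex
i_t = r^{*} + \pi_t + \alpha\,(\pi_t - \pi^{*}) + \beta\,(y_t - y_t^{*}),
```

where i_t is the policy rate, r* the equilibrium real rate, π_t inflation, π* the inflation target, and (y_t − y_t*) the output gap; in the threshold (nonlinear) variant the response coefficients are allowed to switch between regimes when a threshold variable crosses an estimated value.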
Procedia PDF Downloads 296
21076 A Support Vector Machine Learning Prediction Model of Evapotranspiration Using Real-Time Sensor Node Data
Authors: Waqas Ahmed Khan Afridi, Subhas Chandra Mukhopadhyay, Bandita Mainali
Abstract:
The research paper presents a unique approach to evapotranspiration (ET) prediction using a Support Vector Machine (SVM) learning algorithm. The study leverages real-time sensor node data to develop an accurate and adaptable prediction model, addressing the inherent challenges of traditional ET estimation methods. The integration of the SVM algorithm with real-time sensor node data offers great potential to improve spatial and temporal resolution in ET predictions. In the model development, key input features are measured and computed using mathematical equations such as Penman-Monteith (FAO56) and the soil water balance (SWB), which include soil-environmental parameters such as solar radiation (Rs), air temperature (T), atmospheric pressure (P), relative humidity (RH), wind speed (u2), rain (R), deep percolation (DP), soil temperature (ST), and change in soil moisture (∆SM). The one-year field data are split into train, test, and validation sets in combinations of three proportions, while kernel functions with tuned hyperparameters are used to train and improve the accuracy of the prediction model over multiple iterations. This paper also outlines the existing methods and machine learning techniques to determine evapotranspiration, data collection and preprocessing, model construction, and evaluation metrics, highlighting the significance of SVM in advancing the field of ET prediction. The results demonstrate the robustness and high predictability of the developed model on the basis of performance evaluation metrics (R2, RMSE, MAE). The effectiveness of the proposed model in capturing complex relationships within soil and environmental parameters provides insights into its potential applications for water resource management and hydrological ecosystems.
Keywords: evapotranspiration, FAO56, KNIME, machine learning, RStudio, SVM, sensors
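A minimal sketch of an SVM regression (SVR) pipeline with kernel and hyperparameter tuning, evaluated by RMSE and MAE, is shown below; the feature ordering, grid values, and placeholder data are assumptions, not the study's actual setup.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import GridSearchCV, train_test_split

# Placeholder data standing in for the nine sensor-derived features listed in the abstract.
X = np.random.rand(1000, 9)   # Rs, T, P, RH, u2, R, DP, ST, delta-SM (assumed ordering)
y = np.random.rand(1000)      # daily ET (placeholder target)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

pipe = make_pipeline(StandardScaler(), SVR())
grid = GridSearchCV(pipe, {"svr__kernel": ["rbf", "poly"],
                           "svr__C": [1, 10, 100],
                           "svr__epsilon": [0.01, 0.1]}, cv=5)
grid.fit(X_train, y_train)

pred = grid.predict(X_test)
rmse = np.sqrt(np.mean((pred - y_test) ** 2))
mae = np.mean(np.abs(pred - y_test))
print(grid.best_params_, f"RMSE={rmse:.3f}", f"MAE={mae:.3f}")
```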
Procedia PDF Downloads 69
21075 Adsorption of Malachite Green Dye on Graphene Oxide Nanosheets from Aqueous Solution: Kinetics and Thermodynamics Studies
Authors: Abeer S. Elsherbiny, Ali H. Gemeay
Abstract:
In this study, graphene oxide (GO) nanosheets have been synthesized and characterized using different techniques such as X-ray diffraction spectroscopy, Fourier transform infrared (FT-IR) spectroscopy, BET specific surface area analysis, and transmission electron microscopy (TEM). The prepared GO was investigated for the removal of malachite green, a cationic dye, from aqueous solution. The removal of malachite green proceeded via an adsorption process. GO nanosheets are expected to be a good adsorbent material for the adsorption of cationic species. The adsorption of malachite green onto the GO nanosheets has been carried out under different experimental conditions, varying adsorbate concentration, pH, and temperature, and the adsorption kinetics were followed. The kinetics of the adsorption data were analyzed using four kinetic models, the pseudo-first-order model, the pseudo-second-order model, intraparticle diffusion, and the Boyd model, to understand the adsorption behavior of malachite green onto the GO nanosheets and the mechanism of adsorption. The adsorption isotherm of malachite green onto the GO nanosheets has been investigated at 25, 35, and 45 °C. The equilibrium data were fitted well by the Langmuir model. Various thermodynamic parameters such as the Gibbs free energy (ΔG°), enthalpy (ΔH°), and entropy (ΔS°) changes were also evaluated. The interaction of malachite green with the GO nanosheets has been investigated by Fourier transform infrared (FT-IR) spectroscopy.
Keywords: adsorption, graphene oxide, kinetics, malachite green
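As an illustration of fitting the pseudo-first-order and pseudo-second-order kinetic models mentioned above, a short curve-fitting sketch is given below; the time points and uptake values are placeholders, not the measured data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Placeholder kinetic data (contact time in minutes, uptake qt in mg/g); assumed values.
t = np.array([5, 10, 20, 40, 60, 90, 120], dtype=float)
qt = np.array([18, 30, 44, 55, 60, 63, 64], dtype=float)

def pfo(t, qe, k1):   # pseudo-first-order: qt = qe * (1 - exp(-k1 t))
    return qe * (1 - np.exp(-k1 * t))

def pso(t, qe, k2):   # pseudo-second-order: qt = qe^2 k2 t / (1 + qe k2 t)
    return (qe**2 * k2 * t) / (1 + qe * k2 * t)

for name, model, p0 in [("PFO", pfo, (65, 0.05)), ("PSO", pso, (70, 0.001))]:
    params, _ = curve_fit(model, t, qt, p0=p0)
    ss_res = np.sum((qt - model(t, *params)) ** 2)
    r2 = 1 - ss_res / np.sum((qt - qt.mean()) ** 2)
    print(name, params, f"R2={r2:.3f}")   # qe and rate constant, plus goodness of fit
```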
Procedia PDF Downloads 411
21074 Mango (Mangifera indica L.) Lyophilization Using Vacuum-Induced Freezing
Authors: Natalia A. Salazar, Erika K. Méndez, Catalina Álvarez, Carlos E. Orrego
Abstract:
Lyophilization, also called freeze-drying, is an important dehydration technique mainly used for pharmaceuticals. The food industry also uses lyophilization when it is important to retain most of the nutritional quality, taste, shape, and size of dried products and to extend their shelf life. Vacuum induction during the freezing cycle (VI) has been used in order to control ice nucleation and, consequently, to reduce the time of the primary drying cycle of pharmaceuticals while preserving the quality properties of the final product. This procedure has not been applied in the freeze-drying of foods. The present work aims to investigate the effect of VI on the lyophilization drying time, final moisture content, density, and reconstitutional properties of mango (Mangifera indica L.) slices (MS) and mango pulp-maltodextrin dispersions (MPM) (30% concentration of total solids). Control samples were run at each freezing rate without using induced vacuum. The lyophilization endpoint was the same for all treatments (constant difference between capacitance and Pirani vacuum gauges). From the experimental results, it can be concluded that VI at the high freezing rate (0.4°C/min) reduced the overall process time by up to 30% compared with the process time required for the control and for VI at the lower freezing rate (0.1°C/min), without affecting the quality characteristics of the dried product, which yields a reduction in costs and energy consumption for MS and MPM freeze-drying. Controls and samples treated with VI at a freezing rate of 0.4°C/min in MS showed similar results in moisture and density parameters. Furthermore, results from the MPM dispersion showed favorable values when VI was applied, because a dried product with low moisture content and low density was obtained in a shorter process time compared with the control. No significant differences were found between the reconstitutional properties (rehydration for MS and solubility for MPM) of freeze-dried mango resulting from the controls and the VI treatments.
Keywords: drying time, lyophilization, mango, vacuum induced freezing
Procedia PDF Downloads 410
21073 A Sustainable Supplier Selection and Order Allocation Based on Manufacturing Processes and Product Tolerances: A Multi-Criteria Decision Making and Multi-Objective Optimization Approach
Authors: Ravi Patel, Krishna K. Krishnan
Abstract:
In global supply chains, appropriate and sustainable suppliers play a vital role in supply chain development and feasibility. In a large organization with a huge number of suppliers, it is necessary to divide suppliers based on their past history of quality and delivery for each product category. Since the performance of any organization widely depends on its suppliers, well-evaluated selection criteria and decision-making models lead to improved supplier assessment and development. In this paper, the SCOR® performance evaluation approach and ISO standards are used to determine selection criteria for better utilization of supplier assessment by using a hybrid model of the Analytic Hierarchy Process (AHP) and the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS). AHP is used to determine the global weights of the criteria, which help TOPSIS obtain supplier scores by using triangular fuzzy set theory. Both qualitative and quantitative criteria are taken into consideration for the proposed model. In addition, a multi-product and multi-time-period model is selected for order allocation. The optimization model integrates multi-objective integer linear programming (MOILP) for order allocation and a hybrid approach for supplier selection. The proposed MOILP model optimizes order allocation based on manufacturing processes and product tolerances as per the manufacturer's requirements for a quality product. The integrated model and solution approach are tested to find optimized solutions for different scenarios. The detailed analysis shows the superiority of the proposed model over other solutions which considered individual decision-making models.
Keywords: AHP, fuzzy set theory, multi-criteria decision making, multi-objective integer linear programming, TOPSIS
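A minimal sketch of the AHP step, deriving criteria weights from a pairwise comparison matrix via the principal eigenvector and checking consistency, is shown below; the example criteria and comparison values are assumptions, not the paper's data.

```python
import numpy as np

def ahp_weights(pairwise):
    """Criteria weights from an AHP pairwise comparison matrix
    (principal eigenvector method), plus the consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                            # normalized global weights
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)                    # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}.get(n, 1.45)  # Saaty random index (partial table)
    return w, ci / ri                                       # weights, consistency ratio (< 0.1 acceptable)

# Illustrative 3-criterion comparison (e.g. quality vs. delivery vs. cost); values are assumed.
A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
```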
Procedia PDF Downloads 170
21072 3D Numerical Investigation of Asphalt Pavements Behaviour Using Infinite Elements
Authors: K. Sandjak, B. Tiliouine
Abstract:
This article presents the main results of a three-dimensional (3-D) numerical investigation of asphalt pavement structure behaviour using a coupled Finite Element-Mapped Infinite Element (FE-MIE) model. The validation and numerical performance of this model are assessed by confronting critical pavement responses with Burmister's solution and FEM simulation results for multi-layered elastic structures. The coupled model is then efficiently utilised to perform 3-D simulations of a typical asphalt pavement structure in order to investigate the impact of two tire configurations (conventional dual and new-generation wide-base tires) on critical pavement response parameters. The numerical results obtained show the effectiveness and accuracy of the coupled (FE-MIE) model. In addition, the simulation results indicate that, compared with the conventional dual tire assembly, the single wide-base tire caused slightly greater fatigue asphalt cracking and subgrade rutting potentials and can thus be utilised in view of its potential to provide numerous mechanical, economic, and environmental benefits.
Keywords: 3-D numerical investigation, asphalt pavements, dual and wide base tires, infinite elements
Procedia PDF Downloads 215
21071 Understanding Evidence Dispersal Caused by the Effects of Using Unmanned Aerial Vehicles in Active Indoor Crime Scenes
Authors: Elizabeth Parrott, Harry Pointon, Frederic Bezombes, Heather Panter
Abstract:
Unmanned aerial vehicles (UAVs) are having a profound effect on policing, forensic, and fire service procedures worldwide. These intelligent devices have already proven useful in photographing and recording large-scale outdoor and indoor sites using orthomosaic and three-dimensional (3D) modelling techniques, for the purpose of capturing and recording sites during and post-incident. UAVs are becoming an established tool as they extend the reach of the photographer and offer new perspectives without the expense and restrictions of deploying full-scale aircraft. 3D reconstruction quality is directly linked to the resolution of captured images; therefore, close-proximity flights are required for more detailed models. As technology advances, deployment of UAVs in confined spaces is becoming more common. With this in mind, this study investigates the effects of UAV operation within active crime scenes with regard to the dispersal of particulate evidence. To date, little consideration has been given to the potential effects of using UAVs within active crime scenes aside from a legislative point of view, although the technology can potentially reduce the likelihood of contamination by replacing some of the roles of investigating practitioners. There is, however, a risk of evidence dispersal caused by the strong airflow beneath the UAV from the downwash of the propellers. The initial results of this study are therefore presented to determine the height of least effect at which to fly and the commercial propeller type to choose to generate the smallest amount of disturbance from the dataset tested. In this study, a range of commercially available 4-inch propellers was chosen as a starting point due to their common availability, and their small size makes them well suited for operation within confined spaces. To perform the testing, a rig was configured to support a single motor and propeller powered by a standalone mains power supply and controlled via a microcontroller, mimicking a complete throttle cycle and controlling the device to ensure repeatability. Removing the variances of battery packs and complex UAV structures allowed for a more robust setup; therefore, the only changing factors were the propeller and operating height. The results were calculated via computer vision analysis of the recorded dispersal of the sample particles placed below the arm-mounted propeller. The aim of this initial study is to give practitioners an insight into the technology to use when operating within confined spaces as well as to recognize some of the issues caused by UAVs within active crime scenes.
Keywords: dispersal, evidence, propeller, UAV
Procedia PDF Downloads 163
21070 Novel Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification
Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park
Abstract:
In this paper, we propose a novel inference algorithm for the multi-class Gaussian process classification model that can be used in the field of human behavior recognition. This algorithm can derive simultaneously both a posterior distribution of the latent function and estimators of the hyper-parameters in a multi-class Gaussian process classification model. Our algorithm is based on the Laplace approximation (LA) technique and the variational EM framework. It is performed in two steps, called the expectation and maximization steps. First, in the expectation step, using the Bayesian formula and the LA technique, we derive an approximation of the posterior distribution of the latent function, indicating the possibility that each observation belongs to a certain class in the Gaussian process classification model. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimator for the hyper-parameters of the covariance matrix necessary to define the prior distribution of the latent function. These two steps are repeated iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to the human action classification problem using a public database, namely, the KTH human action data set. Experimental results reveal that the proposed algorithm shows good performance on this data set.
Keywords: Bayesian rule, Gaussian process classification model with multiclass, Gaussian process prior, human action classification, Laplace approximation, variational EM algorithm
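For reference, the Laplace approximation at the core of the expectation step replaces the non-Gaussian posterior over the latent function with a Gaussian centred at its mode (standard Gaussian-process-classification form, written here for reference rather than taken verbatim from the paper):

```latex
p(\mathbf{f}\mid X,\mathbf{y}) \;\approx\; q(\mathbf{f}) = \mathcal{N}\!\left(\hat{\mathbf{f}},\, (K^{-1} + W)^{-1}\right),
\qquad
\hat{\mathbf{f}} = \arg\max_{\mathbf{f}}\; p(\mathbf{y}\mid\mathbf{f})\, p(\mathbf{f}\mid X),
\qquad
W = -\nabla\nabla \log p(\mathbf{y}\mid \hat{\mathbf{f}}),
```

where K is the covariance (kernel) matrix whose hyper-parameters are then re-estimated in the maximization step by maximizing the resulting approximate marginal likelihood.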
Procedia PDF Downloads 334