Search results for: supply chain delivery models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11603

7793 Making Meaning, Authenticity, and Redefining a Future in Former Refugees and Asylum Seekers Detained in Australia

Authors: Lynne McCormack, Andrew Digges

Abstract:

Since 2013, the Australian government has enforced mandatory detention of anyone arriving in Australia without a valid visa, including those subsequently identified as refugees or seeking asylum. While consistent with the increased use of immigration detention internationally, Australia's use of offshore processing facilities both during and after refugee status determination has until recently remained a unique feature of its program of deterrence. The commonplace detention of refugees and asylum seekers following displacement is a significant and independent source of trauma and a contributory factor in adverse psychological outcomes. Officially, these individuals have no prospect of resettlement in Australia, are barred from applying for substantive visas, and are frequently and indefinitely detained in closed facilities such as immigration detention centres or alternative places of detention, including hotels. The limited access to Australia's immigration detention population granted to researchers also means that data available for secondary analysis may be incomplete or delayed in release. Further, studies into the lived experience of refugees and asylum seekers are typically cross-sectional and convenience sampled, employ a variety of designs and research methodologies that limit comparability, and focus on the immediacy of the individual's experience. Consequently, how former detainees make sense of their experience, redefine their future trajectory upon release, and recover a sense of authenticity and purpose is unknown. The present study therefore sought the positive and negative subjective interpretations of six participants in Australia regarding their lived experiences as refugees and asylum seekers within Australia's immigration detention system and its impact on their future sense of self. It used interpretative phenomenological analysis (IPA), a qualitative research methodology concerned with how individuals make sense of, and ascribe meaning to, their unique lived experiences of phenomena. Underpinned by phenomenology, hermeneutics, and critical realism, this idiographic study explored how participants make sense of their experiences, how detention has affected their overall journey as displaced persons, and how they have moved forward in the aftermath of protracted detention in Australia. Examining the unique lived experiences of previously detained refugees and asylum seekers may inform the future development of theoretical models of posttraumatic growth among this vulnerable population, thereby informing the delivery of future mental health and resettlement services.

Keywords: mandatory detention, refugee, asylum seeker, authenticity, interpretative phenomenological analysis

Procedia PDF Downloads 92
7792 Digital Transformation of Lean Production: Systematic Approach for the Determination of Digitally Pervasive Value Chains

Authors: Peter Burggräf, Matthias Dannapfel, Hanno Voet, Patrick-Benjamin Bök, Jérôme Uelpenich, Julian Hoppe

Abstract:

The increasing digitalization of value chains can help companies handle rising process complexity and thereby reduce the steadily increasing planning and control effort needed to raise performance limits. Due to technological advances, companies face the challenge of creating smart value chains to improve productivity, cope with increasing time and cost pressure, and meet the need for individualized production. Companies therefore need quick and flexible decisions to create self-optimizing processes and, consequently, to make their production more efficient. Lean production, the most commonly used paradigm for complexity reduction, reaches its limits when it comes to variant-flexible production and constantly changing market and environmental conditions. To lift the performance limits inbuilt in current value chains, new methods and tools must be applied, and digitalization provides the potential to derive them. However, companies lack the experience to harmonize different digital technologies, and there is no practicable framework for transforming current value chains into digitally pervasive value chains. Current research shows that a connection between lean production and digitalization exists, based on factors such as people, technology, and organization. The method introduced in this paper for determining digitally pervasive value chains takes these factors into account and extends existing approaches by a new dimension. It is the first systematic approach for the digital transformation of lean production and consists of four steps. The first step, 'target definition', describes the target situation and defines the depth of the analysis with regard to the inspection area and the level of detail. The second step, 'analysis of the value chain', verifies the lean-ability of processes and places special focus on the integration capacity of digital technologies in order to raise the limits of lean production. The 'digital evaluation process' then assesses the usefulness of digital adaptations in terms of their practicability and their integrability into the existing production system. Finally, the method defines actions to be performed based on the evaluation process and in accordance with the target situation. Validation and optimization of the proposed method in a German company from the electronics industry show that the digital transformation of current value chains based on lean production raises their inbuilt performance limits.

Keywords: digitalization, digital transformation, Industrie 4.0, lean production, value chain

Procedia PDF Downloads 308
7791 Systematic Formulation Development and Evaluation of Self-Nanoemulsifying Systems of Rosuvastatin Employing QbD Approach and Chemometric Techniques

Authors: Sarwar Beg, Gajanand Sharma, O. P. Katare, Bhupinder Singh

Abstract:

The current studies entail the development of self-nanoemulsifying drug delivery systems (SNEDDS) of rosuvastatin, employing a rational QbD-based approach to enhance its oral bioavailability. SNEDDS were prepared using a blend of lipidic and emulsifying excipients, i.e., Peceol, Tween 80, and Transcutol HP. The prepared formulations were evaluated by in vitro drug release, ex vivo permeation, in situ perfusion, and in vivo pharmacokinetic studies in rats, which demonstrated a 3-4 fold improvement in the biopharmaceutical performance of the developed formulations. Cytotoxicity studies using the MTT assay and histopathological studies in intestinal cells revealed a lack of cytotoxicity, and thereby the safety and efficacy of the developed formulations.

Keywords: SNEDDS, bioavailability, solubility, Quality by Design (QbD)

Procedia PDF Downloads 501
7790 An Insight into the Conformational Dynamics of Glycan through Molecular Dynamics Simulation

Authors: K. Veluraja

Abstract:

Glycans of glycolipids and glycoproteins play a significant role in living systems, particularly in molecular recognition processes. These processes are attributed to their occurrence on the cell surface, to the sequential arrangement and type of sugar residues present in the oligosaccharide structure and the diversity of glycosidic linkages (glycoinformatics), and to conformational diversity (glycoconformatics). Molecular dynamics simulation is a theoretical-cum-computational tool successfully utilized to establish the glycoconformatics of glycans. Studies on various oligosaccharides clearly indicate that oligosaccharides exist in multiple conformational states, which arise from the flexibility associated with the glycosidic torsional angles (φ, ψ). As an example, the single disaccharide structure NeuNAcα(2-3)Gal exists in three different conformational states owing to differences in the preferred values of the glycosidic torsional angles (φ, ψ). Establishing three-dimensional structural and conformational models of glycans (Cartesian coordinates of every individual atom of an oligosaccharide structure in a preferred conformation) is therefore crucial to understanding molecular recognition processes such as glycan-toxin and glycan-virus interactions. The glycoconformatics models obtained for various glycans through molecular dynamics simulation are stored in 3DSDSCAR (3DSDSCAR.ORG), a public-domain database, and their utility in understanding molecular recognition processes and in drug design ventures will be discussed.
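The conformational states discussed above are distinguished by the glycosidic torsion angles (φ, ψ). As a minimal illustration of how such angles are extracted from simulation snapshots, the sketch below computes a torsion angle from the Cartesian coordinates of four consecutive atoms; the coordinates here are hypothetical placeholders, not data from the study.

```python
import numpy as np

def torsion_angle(p0, p1, p2, p3):
    """Dihedral angle (degrees) defined by four atom positions.

    The angle is measured between the plane (p0, p1, p2) and the
    plane (p1, p2, p3), the standard definition used for the
    glycosidic torsions phi and psi.
    """
    b0 = p0 - p1
    b1 = p2 - p1
    b2 = p3 - p2
    b1 = b1 / np.linalg.norm(b1)
    # Projections of b0 and b2 onto the plane perpendicular to b1.
    v = b0 - np.dot(b0, b1) * b1
    w = b2 - np.dot(b2, b1) * b1
    x = np.dot(v, w)
    y = np.dot(np.cross(b1, v), w)
    return np.degrees(np.arctan2(y, x))

# Hypothetical coordinates of the four atoms defining one glycosidic torsion.
atoms = [np.array(c, dtype=float) for c in
         [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (2.0, 1.4, 0.0), (3.4, 1.7, 0.9)]]
print(f"torsion = {torsion_angle(*atoms):.1f} deg")
```

Applying such a function frame by frame to an MD trajectory yields the (φ, ψ) time series from which distinct conformational states can be identified.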

Keywords: glycan, glycoconformatics, molecular dynamics simulation, oligosaccharide

Procedia PDF Downloads 132
7789 Computational Linguistic Implications of Gender Bias: Machines Reflect Misogyny in Society

Authors: Irene Yi

Abstract:

Machine learning, natural language processing, and neural network models of language are becoming more and more prevalent in the fields of technology and linguistics today. Training data for machines are, at best, large corpora of human literature and, at worst, a reflection of the ugliness in society. Computational linguistics is a growing field dealing with such issues of data collection for technological development. Machines have been trained on millions of human books, only to find that throughout human history, derogatory and sexist adjectives are used significantly more frequently when describing females in history and literature than when describing males. This is extremely problematic, both as training data and as an outcome of natural language processing. As machines take on more responsibilities, it is crucial to ensure that they do not carry forward historical sexist and misogynistic notions. This paper gathers data and algorithms from neural network models of language dealing with syntax, semantics, sociolinguistics, and text classification. Computational analysis of such linguistic data is used to find patterns of misogyny. The results are significant in showing the existing intentional and unintentional misogynistic notions used to train machines, as well as in developing better technologies that take into account the semantics and syntax of text to be more mindful and to reflect gender equality. Further, this paper deals with non-binary gender pronouns and how machines can process these pronouns correctly, given their semantic and syntactic context. It also delves into the implications of gendered grammar and its effect, cross-linguistically, on natural language processing. Languages such as French or Spanish not only have rigid gendered grammar rules but also historically patriarchal societies. The progression of society goes hand in hand not only with its language but also with how machines process those natural languages. These ideas are all vital to the development of natural language models in technology, and they must be taken into account immediately.
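One simple way to surface the kind of bias described above is to count which adjectives co-occur with gendered words in a corpus. The sketch below is a minimal, hypothetical illustration of that idea using plain window-based co-occurrence counts; it is not the paper's actual pipeline, and the tiny corpus and word lists are placeholders.

```python
from collections import Counter

FEMALE = {"she", "her", "woman", "women", "girl"}
MALE = {"he", "his", "him", "man", "men", "boy"}
ADJECTIVES = {"hysterical", "shrill", "brilliant", "bossy", "logical"}

def gendered_adjective_counts(tokens, window=5):
    """Count adjectives appearing within `window` tokens of a gendered word."""
    female, male = Counter(), Counter()
    for i, tok in enumerate(tokens):
        if tok not in ADJECTIVES:
            continue
        context = tokens[max(0, i - window): i + window + 1]
        if any(w in FEMALE for w in context):
            female[tok] += 1
        if any(w in MALE for w in context):
            male[tok] += 1
    return female, male

corpus = ("she was hysterical and bossy while he was logical "
          "and brilliant the woman seemed shrill").split()
f, m = gendered_adjective_counts(corpus)
print("female-context adjectives:", dict(f))
print("male-context adjectives:", dict(m))
```

On a real corpus, the ratios of these counts (or embedding-based association tests) quantify how skewed the descriptions of each gender are.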

Keywords: computational analysis, gendered grammar, misogynistic language, neural networks

Procedia PDF Downloads 115
7788 Continuous Differential Evolution Based Parameter Estimation Framework for Signal Models

Authors: Ammara Mehmood, Aneela Zameer, Muhammad Asif Zahoor Raja, Muhammad Faisal Fateh

Abstract:

In this work, the strength of a bio-inspired computational intelligence technique is exploited for parameter estimation of periodic signals using Continuous Differential Evolution (CDE), with an error function defined in the mean-square sense. The multidimensional and nonlinear nature of the problem arising in sinusoidal signal models corrupted by noise makes it a challenging optimization task, which is addressed through the robustness and effectiveness of CDE to ensure convergence and avoid trapping in local minima. In the proposed Continuous Differential Evolution based Signal Parameter Estimation (CDESPE) scheme, the unknown adjustable weights of the signal system identification model are optimized using the CDE algorithm. The performance of the CDESPE model is validated through various statistics-based performance indices over a sufficiently large number of runs, in terms of estimation error, mean squared error, and Theil's inequality coefficient. The efficacy of CDESPE is examined by comparison with the actual parameters of the system, with Genetic Algorithm based outcomes, and with various deterministic approaches at different signal-to-noise ratio (SNR) levels.
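As a rough sketch of the underlying idea (not the authors' CDE implementation), the snippet below fits the parameters of a noisy sinusoid by minimizing a mean-square error function with SciPy's differential evolution optimizer; the signal model, noise level, and bounds are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 400)
true_amp, true_freq, true_phase = 2.0, 7.0, 0.6          # ground truth
y = true_amp * np.sin(2 * np.pi * true_freq * t + true_phase)
y += rng.normal(scale=0.3, size=t.size)                   # additive noise

def mse(params):
    """Mean-square error between the sinusoidal model and the observed signal."""
    amp, freq, phase = params
    model = amp * np.sin(2 * np.pi * freq * t + phase)
    return np.mean((y - model) ** 2)

bounds = [(0.1, 5.0), (1.0, 20.0), (-np.pi, np.pi)]       # amp, freq, phase
result = differential_evolution(mse, bounds, seed=1)
print("estimated amp/freq/phase:", np.round(result.x, 3))
```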

Keywords: parameter estimation, bio-inspired computing, continuous differential evolution (CDE), periodic signals

Procedia PDF Downloads 298
7787 Checking Energy Efficiency by Simulation Tools: The Case of Algerian Ksourian Models

Authors: Khadidja Rahmani, Nahla Bouaziz

Abstract:

Algeria is known for its rich heritage: it owns an immense historical heritage of universal reputation. Unfortunately, this wealth has withered through abandonment. This research focuses on the Ksourian model, which constitutes a large portion of this wealth. The Ksourian model is not just a witness to a great part of history and a vernacular culture; it also embodies a panoply of assets in terms of energy efficiency. In this context, the purpose of our work is to evaluate, using simulation tools, the performance of the old techniques derived from the Ksourian model. The proposed method has two steps. The first consists of isolating each device, reintroducing it into a basic model, and running a series of simulations on the resulting models in order to test the contribution of each of these dialectal processes. The second step consists of aggregating all these processes in a single vernacular model and rerunning the simulation to see what this mosaic yields in environmental and energy terms. The model chosen for this study is one of the ksar units of Knadsa, in the Bechar region of Algeria. This study not only shows the ingenuity of our ancestors in their know-how and their capacity to adapt to the aridity of the climate, but also proves that their designs align with current concerns of energy efficiency and respond to the requirements of sustainable development.

Keywords: dialectal processes, energy efficiency, evaluation, Ksourian model, simulation tools

Procedia PDF Downloads 190
7786 Disaggregation of the Daily Rainfall Dataset into Sub-Daily Resolution in the Temperate Oceanic Climate Region

Authors: Mohammad Bakhshi, Firas Al Janabi

Abstract:

High-resolution rainfall data are very important inputs for hydrological models. Among methods for generating high-resolution rainfall data, temporal disaggregation was chosen for this study. The paper attempts to generate rainfall at three different resolutions (4-hourly, hourly, and 10-minute) from daily records covering a period of around 20 years. The process was carried out with the DiMoN tool, which is based on a random cascade model and the method of fragments. Differences between the observed and simulated rainfall datasets were evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov test (K-S), usual statistics, and exceedance probability. The tool preserved the daily rainfall values on wet days well; however, the generated rainfall is concentrated in shorter time periods, producing stronger storms. It is demonstrated that the difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passes the K-S test criteria, while for the hourly and 10-minute datasets the p-value has to be invoked to show that their differences are reasonable. The results are encouraging, considering the overestimation of the generated high-resolution rainfall data.
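To illustrate the kind of check described above (on stand-in data, not the study's datasets), the sketch below compares an "observed" and a "simulated" rainfall sample with the two-sample Kolmogorov-Smirnov test from SciPy; the synthetic exponential samples are a rough placeholder for wet-day rainfall depths.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Placeholder wet-day rainfall depths (mm); exponential shapes are a
# common rough model for rainfall amounts, used here only for the demo.
observed = rng.exponential(scale=4.0, size=500)
simulated = rng.exponential(scale=4.5, size=500)   # slightly "stormier"

statistic, p_value = stats.ks_2samp(observed, simulated)
print(f"K-S statistic = {statistic:.3f}, p-value = {p_value:.3f}")
if p_value > 0.05:
    print("No significant difference between the CDFs at the 5% level.")
else:
    print("The simulated distribution differs significantly from the observed one.")
```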

Keywords: DiMoN Tool, disaggregation, exceedance probability, Kolmogorov-Smirnov test, rainfall

Procedia PDF Downloads 197
7785 Listening to the Voices of Syrian Refugee Women in Canada: An Ethnographic Insight into the Journey from Trauma to Adaptation

Authors: Areej Al-Hamad, Cheryl Forchuk, Abe Oudshoorn, Gerald Patrick Mckinley

Abstract:

Syrian refugee women face many obstacles when accessing health services in host countries, obstacles influenced by various cultural, structural, and practical factors. This paper is based on critical ethnographic research undertaken in Canada to explore Syrian refugee women's migration experiences. We also critically examine how the intersection of gender, trauma, violence, and the political and economic conditions of Syrian refugee women shapes their everyday lives and health. The study further investigates the strategies and practices by which Syrian refugee women currently address their healthcare needs, and the models of care suggested for meeting their physical and mental health needs. Findings show that these women experienced constant worry, hardship, vulnerability, and intrusions on their dignity, and that these experiences and challenges were aggravated by the structure of the Canadian social and health care system. This study offers a better understanding of the impact of migration and trauma on Syrian refugee women's roles, responsibilities, gender dynamics, and interaction with Ontario's healthcare system, with the goal of improving interaction and outcomes. Health care models should address these challenges among Syrian refugee families in Canada.

Keywords: Syrian refugee women, intersectionality, critical ethnography, migration

Procedia PDF Downloads 87
7784 Predisposition of Small Scale Businesses in Fagge, Kano State, Nigeria, Towards Profit and Loss Sharing Mode of Finance

Authors: Farida, M. Shehu, Shehu U. R. Aliyu

Abstract:

Access to finance has been recognized in the literature as one of the major impediments confronting small scale businesses (SSBs). This largely arises from high lending rates, religious inclinations, collateral requirements, etc. Islamic modes of finance operate under a Profit and Loss Sharing (PLS) arrangement between a borrower (business owner) and a lender (Islamic bank). This paper empirically assesses the determinants of the predisposition of small scale business operators in Fagge local government area, Kano State, Nigeria, towards PLS. Cross-sectional data from a sample of 291 small scale business operators were analyzed using logit and probit regression models. Empirical results reveal that while awareness and religious inclination positively drive interest in PLS, lending rates and collateral requirements work against it. The paper, therefore, strongly recommends more advocacy campaigns and the setting up of more Islamic banks in the country to cater for the financing and religious needs of SSBs in the study area.
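For readers unfamiliar with the estimation step, the sketch below fits logit and probit models of adoption interest using statsmodels; the variable names mirror the abstract's determinants, but the data are fabricated for illustration only.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 291  # sample size matching the study; the data themselves are synthetic
awareness = rng.integers(0, 2, n)          # 1 = aware of PLS products
religiosity = rng.integers(0, 2, n)        # 1 = strong religious inclination
lending_rate = rng.normal(20, 5, n)        # conventional rate faced (%)
collateral = rng.integers(0, 2, n)         # 1 = collateral demanded

# Latent propensity consistent with the reported signs of the determinants.
latent = 0.9*awareness + 0.8*religiosity - 0.05*lending_rate - 0.6*collateral
interest = (latent + rng.logistic(size=n) > 0).astype(int)

X = sm.add_constant(np.column_stack([awareness, religiosity,
                                     lending_rate, collateral]))
names = ["const", "awareness", "religiosity", "lending_rate", "collateral"]

logit_res = sm.Logit(interest, X).fit(disp=0)
probit_res = sm.Probit(interest, X).fit(disp=0)
for name, b_l, b_p in zip(names, logit_res.params, probit_res.params):
    print(f"{name:>13}: logit={b_l:+.3f}  probit={b_p:+.3f}")
```

In such output, positive coefficients on awareness and religiosity and negative coefficients on lending rate and collateral would reproduce the pattern the abstract reports.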

Keywords: Islamic finance, logit and probit models, profit and loss sharing, small scale businesses, finance, commerce

Procedia PDF Downloads 366
7783 Identification of Switched Reluctance Motor Parameters Using Exponential Swept-Sine Signal

Authors: Abdelmalek Ouannou, Adil Brouri, Laila Kadi, Tarik

Abstract:

The switched reluctance motor (SRM) is of major interest in many domains, such as electric vehicle drives, because of its wide speed range, high performance, low cost, and robustness under degraded conditions. The purpose of this paper is to develop a new analytical approach for modeling SRM parameters, together with an identification scheme to obtain them. Since the SRM exhibits highly nonlinear behavior, modeling these devices is difficult, and an accurate descriptive model is needed; furthermore, the SRM is always operated in the magnetically saturated mode to maximize energy transfer. Accordingly, it is shown that the SRM can be accurately described by a generalized polynomial Hammerstein model, i.e., the parallel connection of several Hammerstein models having polynomial nonlinearities. An analytical identification method is then developed using a chirp excitation signal. Afterward, the parameters of the obtained model were determined using Finite Element Method analysis. Finally, to show the effectiveness of the proposed method, a comparison between the true and estimated models was performed; the obtained results show that the output responses are very close.
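The exponential swept-sine (chirp) excitation named in the title can be generated directly from its closed form x(t) = sin(2π f1 L (e^(t/L) - 1)) with L = T / ln(f2/f1), so that the instantaneous frequency rises exponentially from f1 to f2 over T seconds. The sketch below is a generic generator under assumed start/stop frequencies, not the authors' test signal.

```python
import numpy as np

def exponential_swept_sine(f1, f2, duration, fs):
    """Exponential (logarithmic) swept-sine from f1 to f2 Hz.

    Uses the standard closed form x(t) = sin(2*pi*f1*L*(exp(t/L) - 1)),
    where L = duration / ln(f2 / f1); the instantaneous frequency is
    f1 * exp(t / L), which equals f2 at t = duration.
    """
    t = np.arange(0.0, duration, 1.0 / fs)
    L = duration / np.log(f2 / f1)
    return t, np.sin(2.0 * np.pi * f1 * L * (np.exp(t / L) - 1.0))

# Assumed excitation parameters for illustration: 1 Hz -> 500 Hz in 2 s.
t, x = exponential_swept_sine(f1=1.0, f2=500.0, duration=2.0, fs=5000.0)
print(x[:5])
```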

Keywords: switched reluctance motor, swept-sine signal, generalized Hammerstein model, nonlinear system

Procedia PDF Downloads 232
7782 Convergence and Stability in Federated Learning with Adaptive Differential Privacy Preservation

Authors: Rizwan Rizwan

Abstract:

This paper provides an overview of Federated Learning (FL) and its application in enhancing data security, privacy, and efficiency. FL utilizes three distinct architectures to ensure privacy is never compromised: individual edge devices are trained and their models aggregated on a server without sharing raw data. This approach not only yields secure models without data sharing but also offers a highly efficient privacy-preserving solution with improved security and data access. We also discuss various frameworks used in FL and its integration with machine learning, deep learning, and data mining. To address the challenges of multi-party collaborative modeling scenarios, we briefly review an FL scheme combining an adaptive gradient descent strategy with a differential privacy mechanism. The adaptive learning rate algorithm adjusts the gradient descent process to avoid issues such as model overfitting and fluctuations, thereby enhancing modeling efficiency and performance in multi-party computation scenarios. Additionally, to cater to ultra-large-scale distributed secure computing, the research introduces a differential privacy mechanism that defends against various background knowledge attacks.
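As a toy illustration of the federated averaging plus noise idea (a generic sketch, not the paper's exact scheme), the snippet below has each simulated client take a clipped local gradient step on a shared linear model, add Gaussian noise to its update for differential privacy, and send it for server-side averaging. The clipping bound, noise scale, and data are all assumptions made for the demo.

```python
import numpy as np

rng = np.random.default_rng(3)
D, CLIENTS, ROUNDS = 5, 8, 30
LR, CLIP, SIGMA = 0.1, 1.0, 0.2   # assumed learning rate, clip norm, noise std

w_true = rng.normal(size=D)
# Each client holds its own local linear-regression data (never shared).
data = [(X := rng.normal(size=(40, D)), X @ w_true + 0.1*rng.normal(size=40))
        for _ in range(CLIENTS)]

w = np.zeros(D)  # global model on the server
for _ in range(ROUNDS):
    updates = []
    for X, y in data:
        grad = 2.0 * X.T @ (X @ w - y) / len(y)   # local MSE gradient
        update = -LR * grad
        norm = np.linalg.norm(update)
        if norm > CLIP:                            # clip to bound sensitivity
            update *= CLIP / norm
        update += rng.normal(scale=SIGMA * CLIP, size=D)  # Gaussian DP noise
        updates.append(update)
    w += np.mean(updates, axis=0)                  # server-side averaging

print("error vs. true weights:", np.round(np.linalg.norm(w - w_true), 3))
```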

Keywords: federated learning, differential privacy, gradient descent strategy, convergence, stability, threats

Procedia PDF Downloads 22
7781 Determinants of International Volatility Passthroughs of Agricultural Commodities: A Panel Analysis of Developing Countries

Authors: Tetsuji Tanaka, Jin Guo

Abstract:

The extant literature has not succeeded in uncovering the common determinants of price volatility transmission of agricultural commodities from international to local markets, and it has rarely investigated the role of self-sufficiency measures in the context of national food security. We analyzed various factors determining the degree of price volatility transmission of wheat, rice, and maize between world and domestic markets, using GARCH models with dynamic conditional correlation (DCC) specifications and panel feasible generalized least squares models. We found that a grain autarky system has the potential to diminish volatility pass-throughs for the three grain commodities. Furthermore, it was discovered that substitutive consumption behavior between maize and wheat buffers the volatility transmission of both, whereas rice does not function as a transmission-relieving element for the volatilities of either wheat or maize. The effectiveness of grain consumption substitution in insulating domestic markets from global pass-throughs is greater than that of cereal self-sufficiency. These implications are extremely beneficial for governments of developing countries seeking to protect their domestic food markets from uncertainty in foreign markets and, as such, to improve food security.
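For intuition about the volatility model underlying the analysis, the sketch below fits a univariate GARCH(1,1) to synthetic returns by maximizing the Gaussian likelihood; it is a bare-bones illustration, and the study's DCC extension and panel FGLS stage are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(11)

# Simulate returns from a GARCH(1,1): s2_t = w + a*r2_{t-1} + b*s2_{t-1}.
T, w0, a0, b0 = 2000, 0.05, 0.08, 0.90
r = np.empty(T)
s2 = w0 / (1 - a0 - b0)                    # unconditional variance
for t in range(T):
    r[t] = np.sqrt(s2) * rng.normal()
    s2 = w0 + a0 * r[t] ** 2 + b0 * s2

def neg_loglik(params):
    w, a, b = params
    if w <= 0 or a < 0 or b < 0 or a + b >= 1:
        return np.inf                      # enforce positivity/stationarity
    s2 = np.empty(T)
    s2[0] = r.var()
    for t in range(1, T):
        s2[t] = w + a * r[t - 1] ** 2 + b * s2[t - 1]
    return 0.5 * np.sum(np.log(s2) + r ** 2 / s2)

res = minimize(neg_loglik, x0=[0.1, 0.1, 0.8], method="Nelder-Mead")
print("estimated (omega, alpha, beta):", np.round(res.x, 3))
```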

Keywords: food security, GARCH, grain self-sufficiency, volatility transmission

Procedia PDF Downloads 153
7780 Evaluation of UI for 3D Visualization-Based Building Information Applications

Authors: Monisha Pattanaik

Abstract:

In scenarios where users work with large hierarchical data structures combined with visualizations (for example, construction 3D models, manufacturing equipment models, Gantt charts, and building plans), the data structures are dense, containing multiple parent nodes nested up to 50 levels deep together with their siblings and descendants, and therefore convey an immediate feeling of complexity. With customers moving to consumer-grade enterprise software, it is crucial to make sophisticated features available on touch devices and smaller screens. This paper evaluates a UI component that allows users to scroll through all of these deep density levels using a slider overlay on top of the hierarchy table, performing several actions to focus on one set of objects at any point in time. The overlay component also solves the problem of excessive horizontal scrolling of the entire table on a fixed pane for a hierarchical table, and it can be customized to navigate through parents only, siblings only, or a specific part of the hierarchy. The UI component was evaluated by end users of the application and by Human-Computer Interaction (HCI) experts to test its usability, yielding statistical results and recommendations for handling complex hierarchical data visualizations.

Keywords: building information modeling, digital twin, navigation, UI component, user interface, usability, visualization

Procedia PDF Downloads 132
7779 Study of the Influence of Eccentricity Due to Configuration and Materials on Seismic Response of a Typical Building

Authors: A. Latif Karimi, M. K. Shrimali

Abstract:

Seismic design is a critical stage in the design and construction of a building. It includes strategies for designing earthquake-resistant buildings to ensure the health, safety, and security of the building occupants and assets. It is therefore very important to understand the behavior of structural members precisely in order to construct buildings that respond better to seismic forces. This paper investigates the behavior of a typical structure subjected to ground motion. The corresponding mode shapes and modal frequencies are studied to interpret the response of an actual structure using fabricated models and 3D visual models. In this study, three different structural configurations are subjected to horizontal ground motion, and the effects of stiffness eccentricity and of the placement of infill walls are checked to determine how each parameter contributes to a building's response to dynamic forces. Deformation data from lab experiments and analysis in the SAP2000 software are reviewed to obtain the results. This study revealed that the seismic response of a building can be improved by introducing higher deformation capacity into the building; proper design of infill walls and a symmetrical configuration are also key factors in building stability during an earthquake.

Keywords: eccentricity, seismic response, mode shape, building configuration, building dynamics

Procedia PDF Downloads 195
7778 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment

Authors: Seun Mayowa Sunday

Abstract:

Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some existing prediction systems are built as static software; however, because static software operates only under strictly regulated rules, it cannot help customers beyond those limitations. Machine learning (ML) techniques are therefore required for loan prediction. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using the Anaconda Navigator and the required ML libraries, the models are created and evaluated using appropriate measuring metrics. The random forest achieved the highest accuracy, 80.17%, and was subsequently integrated into the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, which is among the four biggest cloud computing providers. Hence, to the best of our knowledge, this research serves as the first academic paper combining model development with the Django framework and deployment to the Alibaba cloud computing platform.
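A minimal sketch of the modeling-and-persistence step might look like the following; the synthetic features and their meanings are assumptions (the abstract does not describe the dataset), and the saved model file is the kind of artifact a Django view would typically load to serve predictions.

```python
import numpy as np
import joblib
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1000
# Hypothetical applicant features: income, loan amount, credit-history flag.
X = np.column_stack([rng.normal(5000, 1500, n),
                     rng.normal(150, 60, n),
                     rng.integers(0, 2, n)])
# Synthetic repayment label loosely tied to the features.
y = ((X[:, 0] / 50 - X[:, 1] + 40 * X[:, 2]
      + rng.normal(0, 30, n)) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)
model = RandomForestClassifier(n_estimators=200, random_state=1)
model.fit(X_tr, y_tr)
print("test accuracy:", round(accuracy_score(y_te, model.predict(X_te)), 3))

joblib.dump(model, "loan_model.joblib")  # artifact a Django view can load
```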

Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud

Procedia PDF Downloads 125
7777 Development and Characterization of Topical 5-Fluorouracil Solid Lipid Nanoparticles for the Effective Treatment of Non-Melanoma Skin Cancer

Authors: Sudhir Kumar, V. R. Sinha

Abstract:

Background: The topical and systemic toxicity associated with present nonmelanoma skin cancer (NMSC) treatment using 5-Fluorouracil (5-FU) makes it necessary to develop a novel delivery system with lower toxicity and better control over drug release. Solid lipid nanoparticles offer many advantages, such as controlled and localized release of the entrapped actives, nontoxicity, and better tolerance. Aim: To investigate the safety and efficacy of 5-FU loaded solid lipid nanoparticles as a topical delivery system for the treatment of nonmelanoma skin cancer. Method: Topical solid lipid nanoparticles of 5-FU were prepared using Compritol 888 ATO (glyceryl behenate) as the lipid component and Pluronic F68 (Poloxamer 188), Tween 80 (Polysorbate 80), and Tyloxapol (4-(1,1,3,3-tetramethylbutyl)phenol polymer with formaldehyde and oxirane) as surfactants. The SLNs were prepared by the emulsification method. Different formulation parameters, viz. type and ratio of surfactant, ratio of lipid, and surfactant:lipid ratio, were investigated for their effect on particle size and drug entrapment efficiency. Results: The SLNs were characterized by transmission electron microscopy (TEM), differential scanning calorimetry (DSC), Fourier transform infrared spectroscopy (FTIR), particle size determination, polydispersity index, entrapment efficiency, drug loading, ex vivo skin permeation and skin retention studies, and skin irritation and histopathology studies. TEM results showed that the SLNs were spherical, with sizes in the range 200-500 nm. Higher encapsulation efficiency was obtained for batches with higher concentrations of surfactant and lipid, reaching a maximum of 64.3% for the SLN-6 batch, with a size of 400.1±9.22 nm and a PDI of 0.221±0.031. Optimized SLN batches and a marketed 5-FU cream were compared for flux across rat skin and skin drug retention. Lower flux and higher skin retention were obtained for the SLN formulation in comparison with the topical 5-FU cream, which promises less systemic toxicity and better control of drug release across the skin. Chronic skin irritation studies showed no serious erythema or inflammation, and histopathology studies showed no significant change in the physiology of the epidermal layers of rat skin. These studies suggest that the optimized SLN formulation is more efficient than the marketed cream and safer for long-term NMSC treatment regimens. Conclusion: The topical and systemic toxicity associated with long-term use of 5-FU in the treatment of NMSC can be minimized through controlled release, with significant drug retention in the skin and minimal flux across it. The study may provide a better alternative for effective NMSC treatment.
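The entrapment efficiency reported above is conventionally computed by the indirect method, EE% = (total drug - free drug) / total drug x 100, with drug loading expressed relative to the lipid mass; whether this study used the indirect method is my assumption. The sketch below applies these textbook formulas to made-up numbers, purely to make the quantities concrete.

```python
def entrapment_efficiency(total_drug_mg, free_drug_mg):
    """EE% via the indirect method: entrapped fraction of the drug added."""
    return 100.0 * (total_drug_mg - free_drug_mg) / total_drug_mg

def drug_loading(total_drug_mg, free_drug_mg, lipid_mg):
    """DL%: entrapped drug relative to the lipid carrier mass."""
    return 100.0 * (total_drug_mg - free_drug_mg) / lipid_mg

# Hypothetical batch: 50 mg 5-FU added, 17.85 mg found unentrapped, 500 mg lipid.
ee = entrapment_efficiency(50.0, 17.85)
dl = drug_loading(50.0, 17.85, 500.0)
print(f"EE = {ee:.1f}%  DL = {dl:.2f}%")   # EE works out to 64.3% here
```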

Keywords: 5-FU, topical formulation, solid lipid nanoparticles, non melanoma skin cancer

Procedia PDF Downloads 511
7776 Price Prediction Line, Investment Signals and Limit Conditions Applied for the German Financial Market

Authors: Cristian Păuna

Abstract:

In the first decades of the 21st century, in the electronic trading environment, algorithmic capital investment became the primary tool for profiting from speculation in financial markets. A significant number of traders, private and institutional investors participate in the capital markets every day using automated algorithms, and autonomous trading software is today a considerable part of the business intelligence system of any modern financial activity. Trading decisions and orders are made automatically by computers using different mathematical models. This paper presents one of these models, called the Price Prediction Line. A mathematical algorithm is revealed for building a reliable trend line, which is the basis for limit conditions and automated investment signals, the core of a computerized investment system. The paper shows how to apply these tools to generate entry and exit investment signals and limit conditions that form a mathematical filter for investment opportunities, and presents the methodology for integrating all of these into automated investment software. It also presents trading results obtained for the leading German financial market index with the presented methods, in order to analyze and compare different automated investment algorithms. It was found that a specific mathematical algorithm can be optimized and integrated into an automated trading system with good and sustained results for the leading German market. Investment results are compared in order to qualify the presented model: a risk-to-reward ratio of 1:6.12 was obtained by applying the trigonometric method to the DAX Deutscher Aktienindex over a 24-month investment period. These results are superior to those obtained with other similar models, as this paper reveals. The general idea sustained by this paper is that the Price Prediction Line model is a reliable capital investment methodology that can be successfully applied to build an automated investment system with excellent results.
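The paper's exact Price Prediction Line algorithm is not given in the abstract, so the sketch below shows only the generic pattern it describes: fit a trend line to recent closes, derive a band as a limit condition, and emit entry/exit signals when price crosses the band. The lookback window, band width, and synthetic price series are all illustrative assumptions, not the authors' parameters.

```python
import numpy as np

rng = np.random.default_rng(5)
closes = 13000 + np.cumsum(rng.normal(2.0, 40.0, 250))  # synthetic index series

WINDOW, K = 60, 1.5   # assumed lookback and band width (residual std units)

def trend_signals(prices, window=WINDOW, k=K):
    """Least-squares trend line over `window` bars plus a +/- k*sigma band.

    Returns +1 (enter long) when price dips below the lower band while the
    trend slope is positive, and -1 (exit) when price rises above the upper
    band; 0 otherwise. A deliberately simple stand-in for a signal filter.
    """
    signals = np.zeros(len(prices), dtype=int)
    t = np.arange(window)
    for i in range(window, len(prices)):
        seg = prices[i - window:i]
        slope, intercept = np.polyfit(t, seg, 1)
        line_now = intercept + slope * window        # trend value at bar i
        sigma = np.std(seg - (intercept + slope * t))
        if slope > 0 and prices[i] < line_now - k * sigma:
            signals[i] = +1
        elif prices[i] > line_now + k * sigma:
            signals[i] = -1
    return signals

sig = trend_signals(closes)
print("entries:", int((sig == 1).sum()), "exits:", int((sig == -1).sum()))
```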

Keywords: algorithmic trading, automated trading systems, high-frequency trading, DAX Deutscher Aktienindex

Procedia PDF Downloads 126
7775 Amino Acid Based Biodegradable Poly(Ester-Amide)s and Their Potential Biomedical Applications as Drug Delivery Containers and Antibacterial Coatings

Authors: Nino Kupatadze, Tamar Memanishvili, Natia Ochkhikidze, David Tugushi, Zaal Kokaia, Ramaz Katsarava

Abstract:

Amino acid based biodegradable poly(ester-amide)s (PEAs) have gained considerable interest as promising materials for numerous biomedical applications. These polymers show high biocompatibility and readily form both small particles suitable for the delivery of various biologicals and elastic bio-erodible films serving as matrices for constructing antibacterial coatings. In the present work we demonstrate the potential of the PEAs for two applications: (1) cell therapy for stroke, as vehicles for the delivery and sustained release of growth factors, and (2) bactericidal coatings that prevent biofilm formation and are applicable in infected wound management. Stroke remains the main cause of adult disability, with limited treatment options. Although stem cell therapy is a promising strategy, it still requires improvements in cell survival, differentiation, and tissue modulation. Recently, microspheres (MPs) made of biodegradable polymers have gained significant attention for providing the necessary support to transplanted cells. To investigate this strategy in the cell therapy of stroke, MPs loaded with the transcription factors Wnt3A/BMP4 were prepared; these proteins have been shown to mediate the maturation of cortical neurons. We suggest that implantation of these materials could create a suitable microenvironment for implanted cells. Particles with spherical shape, porous surface, and 5-40 μm size (monitored by scanning electron microscopy) were made on the basis of an original PEA composed of adipic acid, L-phenylalanine, and 1,4-butanediol. After four months of MP transplantation in rodent brain, no inflammation was observed. Additionally, the factors were successfully released from the MPs and affected neuronal cell differentiation in vitro; the in vivo study using loaded MPs is in progress. Another severe problem in biomedicine is protecting surgical devices from biofilm formation. Antimicrobial polymeric coatings are the most effective 'shields' to protect surfaces and devices from biofilm formation, and among matrices for constructing such coatings, preference should be given to bio-erodible polymers. Coatings of this type play the role of an 'unstable seating' that does not allow bacteria to occupy the surface; in other words, bio-erodible coatings are an uncomfortable shelter for bacteria and, along with releasing bactericides, should prevent the formation of biofilm. For this purpose, we selected an original biodegradable PEA composed of L-leucine, 1,6-hexanediol, and sebacic acid as a bio-erodible matrix, and nanosilver (AgNPs) as a bactericidal agent. Such a nanocomposite material is also promising in the treatment of superficial wounds and ulcers. The solubility of the PEA in ethanol allows AgNO3 to be reduced to NPs directly in solution, where the solvent serves as a reducing agent and the PEA serves as an NP stabilizer. Photochemical reduction was selected as the basic method for forming the NPs. The obtained AgNPs were characterized by UV spectroscopy, transmission electron microscopy (TEM), and dynamic light scattering (DLS). According to the UV and TEM data, the photochemical reduction resulted in spherical AgNPs with a wide particle size distribution and a high contribution of particles below 10 nm, which are known to be responsible for the bactericidal activity of AgNPs. The DLS study showed that the average size of the nanoparticles formed after photoreduction in ethanol solution was around 50 nm.

Keywords: biodegradable polymers, microparticles, nanocomposites, stem cell therapy, stroke

Procedia PDF Downloads 392
7774 Large Scale Method to Assess the Seismic Vulnerability of Heritage Buildings: Modal Updating of Numerical Models and Vulnerability Curves

Authors: Claire Limoge Schraen, Philippe Gueguen, Cedric Giry, Cedric Desprez, Frédéric Ragueneau

Abstract:

The Mediterranean area is characterized by numerous monumental and vernacular masonry structures illustrating old ways of building and living. These precious buildings are often poorly documented, present complex shapes and loadings, and are protected by the States, leading to legal constraints. The area also presents moderate to high seismic activity, and even moderate earthquakes can be magnified by local site effects and cause collapse or significant damage. Moreover, the structural resistance of masonry buildings, especially the less famous ones or those located in rural zones, has generally been lowered by many factors: poor maintenance, unsuitable restoration, ambient pollution, and previous earthquakes. Recent earthquakes prove that any damage to these architectural witnesses of our past is irreversible, hence the necessity of acting preventively. This means providing preventive assessments for hundreds of structures with no or few documents. In this context, we propose a general method, based on hierarchized numerical models, to provide preliminary structural diagnoses at a regional scale, indicating whether more precise investigations and models are necessary for each building. To this aim, we adapt different tools, some under development, such as photogrammetry, and some to be created, such as a preprocessor that builds meshes for FEM software starting from pictures, in order to allow dynamic studies of the buildings in the panel. We made an inventory of 198 baroque chapels and churches situated in the French Alps and determined their structural characteristics through field surveys and the MicMac photogrammetric software. Using structural criteria, we identified eight types of churches and seven types of chapels. We studied their dynamic behavior with CAST3M, using the EC8 spectrum and accelerograms of the studied zone. This allowed us to quantify the effect of the needed simplifications in the most sensitive zones and to choose the most effective ones. We also proposed threshold criteria, relevant in linear models, based on damage observed in the in situ surveys, in old pictures, and in the Italian code. To validate the structural types, we conducted a measurement campaign using ambient vibration noise and velocimeters; it also allowed us to validate this method on old masonry and to identify the modal characteristics of 20 churches. We then proceeded to a dynamic identification between numerical and experimental modes, updating the linear models through material and geometrical parameters that are often unknown because of the complexity of the structures and materials. The numerically optimized values were verified against the measurements we made on the masonry components in situ and in the laboratory. We are now working on nonlinear models redistributing the strains, in order to validate the damage threshold criteria used to compute the vulnerability curves of each defined structural type. Our current results show a good correlation between experimental and numerical data, validating the final modeling simplifications and the global method. We now plan to use nonlinear analysis in the critical zones in order to test reinforcement solutions.
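The pairing of numerical and experimental modes described above is commonly quantified with the Modal Assurance Criterion (MAC); the abstract does not name the criterion used, so this is an assumed but standard choice. A minimal sketch:

```python
import numpy as np

def mac(phi_num, phi_exp):
    """Modal Assurance Criterion matrix between two mode-shape sets.

    Columns are mode shapes; MAC[i, j] is close to 1 when numerical
    mode i and experimental mode j have consistent shapes.
    """
    num = np.abs(phi_num.T @ phi_exp) ** 2
    den = np.outer(np.sum(phi_num**2, axis=0), np.sum(phi_exp**2, axis=0))
    return num / den

# Hypothetical 6-DOF mode shapes: two numerical vs. two measured modes.
rng = np.random.default_rng(2)
phi_n = rng.normal(size=(6, 2))
phi_e = phi_n + 0.05 * rng.normal(size=(6, 2))  # slightly perturbed "measurements"
print(np.round(mac(phi_n, phi_e), 3))            # near-identity => good pairing
```

High diagonal MAC values justify using an experimental mode to update the material and geometrical parameters of the matching numerical mode.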

Keywords: heritage structures, masonry numerical modeling, seismic vulnerability assessment, vibratory measure

Procedia PDF Downloads 489
7773 Solid Particle Transport and Deposition Prediction in a Turbulent Impinging Jet Using the Lattice Boltzmann Method and a Probabilistic Model on GPU

Authors: Ali Abdul Kadhim, Fue Lien

Abstract:

Solid particle distribution on an impingement surface has been simulated utilizing a graphical processing unit (GPU). An in-house computational fluid dynamics (CFD) code has been developed to investigate a 3D turbulent impinging jet using the lattice Boltzmann method (LBM) in conjunction with large eddy simulation (LES) and the multiple relaxation time (MRT) models. This paper proposes an improvement to the LBM-cellular automata (LBM-CA) probabilistic method. In the current model, the fluid flow utilizes the D3Q19 lattice, while the particle model employs the D3Q27 lattice. Particle numbers are defined at the same regular LBM nodes, and the transport of particles from one node to its neighboring nodes is determined in accordance with the particle bulk density and velocity, considering all external forces. Previous models distributed particles at each time step without considering the local velocity and the number of particles at each node; the present model overcomes these deficiencies and can therefore better capture the dynamic interaction between particles and the surrounding turbulent flow field. Despite the increasing popularity of the LBM-MRT-CA model for simulating complex multiphase fluid flows, this approach is still expensive in terms of the memory size and computational time required to perform 3D simulations. To improve the throughput of each simulation, a single GeForce GTX TITAN X GPU is used in the present work; the CUDA parallel programming platform and the cuRAND library are utilized to form an efficient LBM-CA algorithm. The methodology was first validated against a benchmark test case involving particle deposition on a square cylinder confined in a duct. The flow was unsteady and laminar at Re=200 (Re is the Reynolds number), and simulations were conducted for different Stokes numbers; the present LBM solutions agree well with other results available in the open literature. The GPU code was then used to simulate particle transport and deposition in a turbulent impinging jet at Re=10,000. Simulations were conducted for L/D=2, 4, and 6, where L is the nozzle-to-surface distance and D is the jet diameter, and the effect of the Stokes number on the particle deposition profile was studied at different L/D ratios. For comparative studies, another in-house serial CPU code was developed, coupling LBM with the classical Lagrangian particle dispersion model. Agreement between the results obtained with the LBM-CA and LBM-Lagrangian models and the experimental data is generally good. The present GPU approach achieves a speedup ratio of about 350 relative to the serial code running on a single CPU.
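The cellular-automata transport idea, moving particle counts between lattice nodes according to the local velocity, can be sketched in a few lines. The 1D toy below (emphatically not the authors' D3Q27 GPU kernel) splits each node's population between staying and hopping downstream with a probability proportional to the local Courant number u*dt/dx; binomial sampling keeps the counts integer, in the spirit of the probabilistic CA.

```python
import numpy as np

rng = np.random.default_rng(9)
N, DT, DX = 100, 0.5, 1.0
counts = np.zeros(N, dtype=np.int64)
counts[10] = 10_000                       # pulse of particles at one node
u = np.full(N, 0.8)                       # local fluid velocity (toward +x)

def ca_step(counts, u):
    """One probabilistic CA transport step on a periodic 1D lattice."""
    p_move = np.clip(u * DT / DX, 0.0, 1.0)      # move probability per particle
    movers = rng.binomial(counts, p_move)        # sampled node by node
    stay = counts - movers
    return stay + np.roll(movers, 1)             # movers hop to the +x neighbor

for _ in range(50):
    counts = ca_step(counts, u)
print("total conserved:", counts.sum(), "| peak now at node", counts.argmax())
```

Because the number of movers is drawn from the local count and velocity, the redistribution respects both quantities, which is the deficiency of earlier CA schemes that the abstract highlights.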

Keywords: CUDA, GPU parallel programming, LES, lattice Boltzmann method, MRT, multi-phase flow, probabilistic model

Procedia PDF Downloads 203
7772 Improved Elastoplastic Bounding Surface Model for the Mathematical Modeling of Geomaterials

Authors: Andres Nieto-Leal, Victor N. Kaliakin, Tania P. Molina

Abstract:

The nature of most engineering materials is quite complex. It is, therefore, difficult to devise a general mathematical model that will cover all possible ranges and types of excitation and behavior of a given material. As a result, the development of mathematical models is based upon simplifying assumptions regarding material behavior. Such simplifications result in some material idealization; for example, one of the simplest material idealizations is to assume that the material behavior obeys elasticity. However, soils are nonhomogeneous, anisotropic, path-dependent materials that exhibit nonlinear stress-strain relationships, changes in volume under shear, dilatancy, as well as time-, rate- and temperature-dependent behavior. Over the years, many constitutive models, possessing different levels of sophistication, have been developed to simulate the behavior of geomaterials, particularly cohesive soils. Early in the development of constitutive models, it became evident that elastic or standard elastoplastic formulations, employing purely isotropic hardening and predicated on the existence of a yield surface surrounding a purely elastic domain, were incapable of realistically simulating the behavior of geomaterials. Accordingly, more sophisticated constitutive models have been developed, for example, bounding surface elastoplasticity. The essence of the bounding surface concept is the hypothesis that plastic deformations can occur for stress states either within or on the bounding surface; thus, unlike classical yield surface elastoplasticity, plastic states are not restricted to those lying on a surface. Elastoplastic bounding surface models have been improved over time; however, there is still a need to improve their capabilities in simulating the response of anisotropically consolidated cohesive soils, especially the response in extension tests. Thus, in this work an improved constitutive model that more accurately predicts the diverse stress-strain phenomena exhibited by cohesive soils was developed, in particular an improved rotational hardening rule that better simulates the response of cohesive soils in extension. The generalized definition of the bounding surface model provides a convenient and elegant framework for unifying various previous versions of the model for anisotropically consolidated cohesive soils. The Generalized Bounding Surface Model for cohesive soils is a fully three-dimensional, time-dependent model that accounts for both inherent and stress-induced anisotropy employing a non-associative flow rule. The model's numerical implementation in a computer code followed an adaptive multistep integration scheme in conjunction with local iteration and radial return; the one-step trapezoidal rule was used to obtain the stiffness matrix that defines the relationship between the stress increment and the strain increment. After testing the model through extensive comparisons of simulations to experimental data, it has been shown to give quite good results. The new model successfully simulates the response of different cohesive soils, for example, Cardiff Kaolin, Spestone Kaolin, and Lower Cromer Till. The simulated undrained stress paths, stress-strain response, and excess pore pressures are in very good agreement with the experimental values, especially in extension.
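The radial return step mentioned in the implementation is a standard ingredient of elastoplastic stress integration. The sketch below shows it for the simplest possible case, 1D plasticity with linear isotropic hardening, only to make the elastic-predictor/plastic-corrector logic concrete; the actual bounding surface model is far more elaborate, and the moduli used here are arbitrary.

```python
import numpy as np

E, H, SIGMA_Y = 200e3, 10e3, 250.0   # assumed modulus, hardening, yield (MPa)

def radial_return(strain_increments):
    """Elastic predictor / radial-return corrector for 1D linear hardening."""
    sigma, eps_p, alpha = 0.0, 0.0, 0.0   # stress, plastic strain, hardening var
    history = []
    for d_eps in strain_increments:
        sigma_trial = sigma + E * d_eps          # elastic predictor
        f_trial = abs(sigma_trial) - (SIGMA_Y + H * alpha)
        if f_trial <= 0.0:
            sigma = sigma_trial                  # step stays elastic
        else:
            d_gamma = f_trial / (E + H)          # consistency: return to surface
            sigma = sigma_trial - E * d_gamma * np.sign(sigma_trial)
            eps_p += d_gamma * np.sign(sigma_trial)
            alpha += d_gamma
        history.append(sigma)
    return np.array(history)

# Monotonic tension: response turns elastoplastic once |sigma| exceeds yield.
stresses = radial_return(np.full(40, 1.0e-4))
print(np.round(stresses[::8], 1))
```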

Keywords: bounding surface elastoplasticity, cohesive soils, constitutive model, modeling of geomaterials

Procedia PDF Downloads 313
7771 The Impacts of New Digital Technology Transformation on Singapore Healthcare Sector: Case Study of a Public Hospital in Singapore from a Management Accounting Perspective

Authors: Junqi Zou

Abstract:

As one of the world's most tech-ready countries, Singapore has initiated the Smart Nation plan to harness the full power and potential of digital technologies to transform the way people live and work, through more efficient government and business processes, and to make the economy more productive. The key evolutions of digital technology transformation in healthcare, and the increasing deployment of the Internet of Things (IoT), Big Data, AI/cognitive computing, Robotic Process Automation (RPA), Electronic Health Record systems (EHR), Electronic Medical Record systems (EMR), and Warehouse Management Systems (WMS) in the most recent decade, have significantly stepped up the move towards an information-driven healthcare ecosystem. The advances in information technology not only bring benefits to patients but also act as a key force in changing management accounting in the healthcare sector. The aim of this study is to investigate the impacts of digital technology transformation on Singapore's healthcare sector from a management accounting perspective. Adopting a Balanced Scorecard (BSC) analysis approach, this paper conducted an exploratory case study of a newly launched Singapore public hospital, which has been recognized as being amongst the most digitally advanced healthcare facilities in the Asia-Pacific region. Specifically, this study gains insights into how the new technology is changing healthcare organizations' management accounting from the four perspectives of the Balanced Scorecard approach: 1) the financial perspective, 2) the customer (patient) perspective, 3) the internal processes perspective, and 4) the learning and growth perspective. Based on a thorough review of archival records from the government and the public, and on interview reports with the hospital's CIO, this study finds improvements from all four perspectives, as follows. 1) Learning and growth: the Government (Ministry of Health) works with the hospital to open up multiple training pathways for health professionals that upgrade and develop new IT skills among the healthcare workforce to support the transformation of healthcare services. 2) Internal processes: the hospital achieved digital transformation through Project OneCare, integrating clinical, operational, and administrative information systems (e.g., EHR, EMR, WMS, EPIB, RTLS) to enable a seamless flow of data and implementing a JIT system to help the hospital operate more effectively and efficiently. 3) Customer: the fully integrated EMR suite enhances the patient experience by achieving the 5 Rights (Right Patient, Right Data, Right Device, Right Entry, and Right Time). 4) Financial: cost savings are achieved from improved inventory management and effective supply chain management, and the use of process automation reduces manpower and logistics costs. In summary, these improvements identified under the Balanced Scorecard framework confirm the success of utilizing integrated, advanced ICT to enhance a healthcare organization's customer service, productivity, and cost savings. Moreover, the Big Data generated from this integrated EMR system can be particularly useful in helping the management control system to optimize decision making and strategic planning. To conclude, the new digital technology transformation has extended the usefulness of management accounting to both financial and non-financial dimensions, reaching new heights in healthcare management.

Keywords: balanced scorecard, digital technology transformation, healthcare ecosystem, integrated information system

Procedia PDF Downloads 157
7770 Delimitation of the Perimeters of Protection of the Wellfield in the City of Adrar, Sahara of Algeria, Using Wyssling's Method

Authors: Ferhati Ahmed, Fillali Ahmed, Oulhadj Younsi

Abstract:

This study delimits the perimeters of protection for the catchment area of the city of Adrar, which are established around sites where water intended for human consumption is abstracted, with the objective of preserving the resource and reducing the risks of point and accidental pollution (the Continental Intercalaire groundwater of the Northern Sahara of Algeria). The wellfield is located to the northeast of the city of Adrar; it covers an area of 132.56 km2 with 21 Drinking Water Supply (DWS) wells, pumping a total flow of approximately 13 Hm3/year. The choice of this wellfield is based on its favorable hydrodynamic characteristics and its location relative to the agglomeration. The aquifer's vulnerability to pollution is very high because it is unconfined and lacks a protective layer. In recent years, several factors have appeared around the wellfield that can affect the quality of this precious resource, including a large domestic waste site and agricultural and industrial activities; its sustainability therefore requires the implementation of protection perimeters. The objective of this study is to set up three protection perimeters: immediate, close, and remote. The application of the Wyssling method makes it possible to calculate the transfer time (t) of a drop of groundwater from any point in the aquifer to the abstraction point, and thus to define isochrones which in turn delimit each type of perimeter: 40 days for the close perimeter and 100 days for the remote one. Special restrictions are imposed on all activities depending on the distance from the catchment. Application of this method to the Adrar wellfield showed that the close and remote protection perimeters occupy areas of 51.14 km2 and 92.9 km2, respectively. The perimeters are delimited by geolocated markers, 40 and 46 markers respectively. These results show that the areas defined as the close protection perimeter are free from activities likely to present a risk to the quality of the water abstracted. On the other hand, within the area defined as the remote protection perimeter, there are some agricultural and industrial activities that may present an imminent risk; rigorous control of these activities and restriction of the types of products applied in industry and agriculture are imperative.
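The transfer-time reasoning behind such isochrones follows from Darcy's law: the effective seepage velocity is v = K·i/n_e, and the distance travelled by a water particle in time t along the flow direction is d = v·t. The sketch below evaluates the 40-day and 100-day distances for assumed aquifer parameters (illustrative values, not those of the Adrar aquifer).

```python
# Effective groundwater velocity and isochrone distances (Darcy-based),
# the core quantity behind Wyssling-type protection perimeters.

K = 5.0e-3       # hydraulic conductivity (m/s), assumed value
i = 2.0e-3       # regional hydraulic gradient (-), assumed value
n_e = 0.15       # effective porosity (-), assumed value

v = K * i / n_e                   # effective (seepage) velocity, m/s
for days in (40, 100):            # close / remote perimeter isochrones
    d = v * days * 86_400         # distance travelled in `days`
    print(f"{days:>3}-day isochrone distance upstream: {d:,.1f} m")
```

Wyssling's method then corrects these distances upstream and downstream of each well for the flow field distorted by pumping, yielding the perimeter outline around the wellfield.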

Keywords: continental intercalaire, drinking water supply, groundwater, perimeter of protection, wyssling method

Procedia PDF Downloads 92
7769 Characterising Stable Model by Extended Labelled Dependency Graph

Authors: Asraful Islam

Abstract:

The extended dependency graph (EDG) is a state-of-the-art isomorphic graph representation of normal logic programs (NLPs) that can characterize the consistency of NLPs by graph analysis. To construct the vertices and arcs of an EDG, additional renaming atoms and rules beyond those the given program provides are used, resulting in higher space complexity compared to the corresponding traditional dependency graph (TDG). In this article, we propose an extended labelled dependency graph (ELDG) to represent an NLP that shares an equal number of nodes and arcs with the TDG, and we prove that it is isomorphic to the domain program. The numbers of nodes and arcs used in the underlying dependency graphs are formulated to compare the space complexity; results show that the ELDG uses less memory to store nodes, arcs, and cycles than the EDG. To exhibit the desirability of the ELDG: firstly, the stable models of the kernel form of an NLP are characterized by an admissible coloring of the ELDG; secondly, a relation is established between the stable models of a kernel program and the handles of the minimal odd cycles appearing in the corresponding ELDG; thirdly, to the best of our knowledge, for the first time an inverse transformation from a dependency graph to the represented NLP w.r.t. the ELDG is defined, enabling analytical results to be transferred from the graph to the program straightforwardly.

Keywords: normal logic program, isomorphism of graph, extended labelled dependency graph, inverse graph transformation, graph colouring

Procedia PDF Downloads 207
7768 Studying the Impact of Soil Characteristics on the Displacement of Retaining Walls Using the Finite Element Method

Authors: Mojtaba Ahmadabadi, Akbar Masoudi, Morteza Rezai

Abstract:

In this paper, the effects of soil and wall characteristics on retaining wall displacement were investigated using the finite element method. Thirty-two different models were studied with different parameters, allowing displacement to be calculated at any height of the wall for frictional-cohesive soils. The main purpose of this research is to determine the soil characteristics that are most effective in reducing wall displacement. Comparing the models showed that increases in the internal friction angle, the angle of friction between soil and wall, and the modulus of elasticity reduce the displacement of the wall, whereas an increase in the unit weight of the soil increases it. Based on the results, all wall displacements were of the overturning type, and the backfill soil was bulging. The internal friction angle and the soil-wall friction angle showed the greatest influence in reducing wall displacement. One advantage of this study is that it takes into account all soil and wall parameters and the displacement distribution in both the wall and the backfill soil, thereby providing the conditions that best reduce wall displacement.

Keywords: retaining wall, fem, soil and wall interaction, angle of internal friction of the soil, wall displacement

Procedia PDF Downloads 386
7767 A Reactive Flexible Job Shop Scheduling Model in a Stochastic Environment

Authors: Majid Khalili, Hamed Tayebi

Abstract:

This paper considers a stochastic flexible job-shop scheduling (SFJSS) problem in the presence of production disruptions, with reactive scheduling implemented in order to find the optimal solution under uncertainty. Two main disruptions are considered: machine failures, which affect operation times, and modification or cancellation of the order delivery date during production. To reduce the negative effects of these disruptions, two strategies derived from reactive scheduling are used: the first is the ability to allocate multiple machines to each job, and the second is the ability to select the best alternative process route for a job when disruptions arise during its processing. For this purpose, a mixed-integer linear programming (MILP) model is proposed.
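
The abstract does not give the paper's formulation; the following is a minimal, hypothetical sketch of the flexible job-shop core only, machine-alternative operations with big-M disjunctive constraints minimizing makespan, written with the PuLP modelling library. The job data, variable names, and big-M value are illustrative assumptions, and the stochastic and reactive elements are omitted.

```python
# Toy flexible job-shop MILP: two jobs of two operations each, two machines,
# machine-dependent processing times; requires PuLP (pip install pulp).
from itertools import combinations
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum, value

# proc[(job, op)][machine] = processing time (hypothetical data)
proc = {(0, 0): {0: 3, 1: 5}, (0, 1): {0: 4, 1: 2},
        (1, 0): {0: 2, 1: 3}, (1, 1): {0: 4, 1: 4}}
ops, machines, M = list(proc), [0, 1], 100  # M: big-M constant

prob = LpProblem("toy_fjss", LpMinimize)
x = {(o, m): LpVariable(f"x_{o[0]}{o[1]}_{m}", cat=LpBinary)
     for o in ops for m in machines}                    # machine assignment
s = {o: LpVariable(f"s_{o[0]}{o[1]}", lowBound=0) for o in ops}  # start times
cmax = LpVariable("makespan", lowBound=0)
prob += cmax                                            # objective: min makespan

dur = {o: lpSum(proc[o][m] * x[(o, m)] for m in machines) for o in ops}
for o in ops:
    prob += lpSum(x[(o, m)] for m in machines) == 1     # exactly one machine
    prob += cmax >= s[o] + dur[o]
for j in (0, 1):
    prob += s[(j, 1)] >= s[(j, 0)] + dur[(j, 0)]        # precedence within job
for a, b in combinations(ops, 2):
    if a[0] == b[0]:
        continue                    # same job: already ordered by precedence
    for m in machines:              # no overlap when both ops use machine m
        y = LpVariable(f"y_{a[0]}{a[1]}_{b[0]}{b[1]}_{m}", cat=LpBinary)
        off = (1 - x[(a, m)]) + (1 - x[(b, m)])         # relax if not both on m
        prob += s[b] >= s[a] + dur[a] - M * (1 - y) - M * off
        prob += s[a] >= s[b] + dur[b] - M * y - M * off

prob.solve()
print("makespan:", value(cmax))
for o in ops:
    m = next(m for m in machines if x[(o, m)].value() > 0.5)
    print(f"job {o[0]} op {o[1]}: machine {m}, start {s[o].value()}")
```

A reactive variant would re-solve this model with updated durations and due dates whenever a machine failure or order change is observed.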

Keywords: flexible job-shop scheduling, reactive scheduling, stochastic environment, mixed integer linear programming

Procedia PDF Downloads 354
7766 Electrotechnology for Silicon Refining: Plasma Generator and Arc Furnace Installations and Theoretical Base

Authors: Ashot Navasardian, Mariam Vardanian, Vladik Vardanian

Abstract:

The photovoltaic and semiconductor industries are growing, and a large supply of silicon is necessary to sustain this growth. Since silicon is still the best material for manufacturing solar cells and semiconductor components, pure silicon, of solar grade and semiconductor grade, is in high demand. There are two main routes for silicon production: metallurgical and chemical. In this article, we review electrotechnological installations and systems for semiconductor manufacturing. The main task is to design an installation that can produce solar-grade (SOG) silicon from river sand within a single work unit.

Keywords: metallurgical grade silicon, solar grade silicon, impurity, refining, plasma

Procedia PDF Downloads 489
7765 Predictive Analysis of the Stock Price Market Trends with Deep Learning

Authors: Suraj Mehrotra

Abstract:

The stock market is a volatile, bustling marketplace that is a cornerstone of economics; it can define whether companies succeed or spiral downward. A thorough understanding of it is important: many companies have entire divisions dedicated to analysing both their own stock and that of rival companies. Linking the world of finance, and the stock market in particular, with artificial intelligence (AI) is a relatively recent development. Predicting how stocks will perform given all external factors and previous data has always been a human task; with the help of AI, however, machine learning models can produce more complete predictions of financial trends. For the stock market specifically, predicting the next day's open, close, high, and low prices is very hard to do, and machine learning makes this task considerably easier. A model that builds upon itself and takes in external factors as weights can predict trends far into the future. Used effectively, such models open new doors in the business and finance world and help companies make better, more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep learning and neural-network-based approaches, and provides a detailed analysis of these techniques along with the challenges of predictive analysis. Comparing the testing-set accuracy of four different models - linear regression, neural network, decision tree, and naïve Bayes - on the stocks of Apple, Google, Tesla, Amazon, United Healthcare, Exxon Mobil, JPMorgan Chase, and Johnson & Johnson, the naïve Bayes and linear regression models performed best, followed by the neural network model and then the decision tree model. The training set showed similar results, except that the decision tree model achieved perfect accuracy on it, which suggests that the decision tree overfitted the training set and therefore generalized poorly to the testing set.
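
A minimal sketch of the four-model comparison described above, using scikit-learn on synthetic data: since naïve Bayes requires discrete class labels, the task is framed here as next-day up/down direction classification, with linear regression swapped for its classification analogue, logistic regression. The feature construction and all parameter choices are assumptions for illustration, not the paper's setup.

```python
# Four-model comparison on synthetic OHLC-style features; real experiments
# would use actual market data for the tickers named in the abstract.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 4))          # stand-ins for open/high/low/close features
y = (X[:, 3] + 0.5 * rng.normal(size=n) > 0).astype(int)  # noisy "up tomorrow"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "linear (logistic)": LogisticRegression(max_iter=1000),
    "naive Bayes":       GaussianNB(),
    "decision tree":     DecisionTreeClassifier(random_state=0),  # unpruned
    "neural network":    MLPClassifier(hidden_layer_sizes=(32,),
                                       max_iter=2000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    tr = accuracy_score(y_tr, model.predict(X_tr))
    te = accuracy_score(y_te, model.predict(X_te))
    # the unpruned tree scores ~1.0 on training data but drops on test data,
    # the overfitting pattern reported in the abstract
    print(f"{name:18s} train = {tr:.3f}   test = {te:.3f}")
```

Running the comparison on both splits makes the train/test gap visible, which is the diagnostic the abstract uses to identify the decision tree's overfitting.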

Keywords: machine learning, testing set, artificial intelligence, stock analysis

Procedia PDF Downloads 91
7764 Creating Energy Sustainability in an Enterprise

Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala

Abstract:

As we enter the new era of artificial intelligence (AI) and cloud computing, almost every industry sector relies on the machine learning and natural language processing capabilities of AI and on energy-efficient hardware and software. In these sectors, much emphasis is placed on developing new and innovative methods for producing and conserving energy and for slowing the depletion of natural resources. The core pillars of sustainability are economic, environmental, and social, informally referred to as the 3 P's (People, Planet, and Profit); these play a vital role in building a core sustainability model in the enterprise. As natural resources are continually depleted, the demand for renewable energy keeps growing. Alongside this demand, many industries are increasingly concerned with how to reduce carbon emissions and conserve natural resources while embedding sustainability in corporate business models and policies. In this paper, we discuss the driving forces that influence the three pillars of sustainability: climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models, and emerging social media and AI platforms. We aim to provide an overall perspective on enterprise strategies, with a primary focus on the cultural shifts needed to adopt energy-efficient operational models. Across the globe, industries are incorporating core sustainability principles such as reducing energy costs, cutting greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and shrinking server footprint and compute resources (shared IT services, cloud computing, and application modernization), with the vision of a sustainable environment.

Keywords: climate change, pandemic, disruptive technology, government policies, business model, machine learning and natural language processing, AI, social media platform, cloud computing, advanced monitoring, metering infrastructure

Procedia PDF Downloads 103