Search results for: autoregressive moving average model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20727

8007 Markov-Chain-Based Optimal Filtering and Smoothing

Authors: Garry A. Einicke, Langford B. White

Abstract:

This paper describes an optimum filter and smoother for recovering a Markov process message from noisy measurements. The developments follow from an equivalence between a state space model and a hidden Markov chain. The ensuing filter and smoother employ transition probability matrices and approximate probability distribution vectors. The properties of the optimum solutions are retained, namely, the estimates are unbiased and minimize the variance of the output estimation error, provided that the assumed parameter set is correct. Methods for estimating unknown parameters from noisy measurements are discussed. Signal recovery examples are described in which performance benefits are demonstrated at an increased calculation cost.
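The filter-smoother structure the abstract describes, built on transition probability matrices and probability distribution vectors, corresponds to the classical forward-backward recursion for a discrete hidden Markov chain. The sketch below is an illustration of that general technique, not the paper's own implementation; the matrices `A`, `B`, and `pi` are hypothetical inputs.

```python
import numpy as np

def forward_backward(A, B, pi):
    """Optimal filtering and fixed-interval smoothing for a discrete
    hidden Markov chain.

    A  : (n, n) transition probability matrix, A[i, j] = P(x_{t+1}=j | x_t=i)
    B  : (T, n) emission likelihoods, B[t, i] = p(y_t | x_t=i)
    pi : (n,) initial state distribution

    Returns (filtered, smoothed) state-probability vectors, one row per step.
    """
    T, n = B.shape
    alpha = np.zeros((T, n))              # filtered (forward) probabilities
    alpha[0] = pi * B[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[t]
        alpha[t] /= alpha[t].sum()        # normalise to avoid underflow

    beta = np.ones((T, n))                # backward messages
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()

    gamma = alpha * beta                  # smoothed posteriors
    gamma /= gamma.sum(axis=1, keepdims=True)
    return alpha, gamma
```

The forward pass alone gives the filtered (causal) estimates; multiplying in the backward messages yields the fixed-interval smoothed estimates, which use future measurements and therefore reduce the output error variance at the increased calculation cost the abstract notes.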

Keywords: optimal filtering, smoothing, Markov chains

Procedia PDF Downloads 301
8006 Active Development of Tacit Knowledge Using Social Media and Learning Communities

Authors: John Zanetich

Abstract:

This paper uses a pragmatic research approach to investigate the relationships between Active Development of Tacit Knowledge (ADTK), social media (Facebook), and classroom learning communities. It investigates the use of learning communities and social media as the context and means for converting tacit knowledge to explicit knowledge and presents a dynamic model of the development of a classroom learning community. The goal of this study is to identify the point at which explicit knowledge is converted to tacit knowledge and to test a way to quantify the exchange using social media and learning communities.

Keywords: tacit knowledge, knowledge management, college programs, experiential learning, learning communities

Procedia PDF Downloads 342
8005 Knowledge Management: Why Is It So Difficult? From “A Good Idea” to Organizational Contribution

Authors: Lisandro Blas, Héctor Tamanini

Abstract:

From the early 1990s to now, not many companies or organizations have been able to “really” implement a knowledge management (KM) system that works (not only as viewed from a measurement model, but with continuity). What are the reasons for this? Some of the reasons may be embedded in how KM is demanded (usefulness, priority, experts, a definition of KM) versus the importance and resources that organizations actually commit (budget, a person responsible for a specific KM area, intangibility). Many organizations “claim” the importance of knowledge management, but these claims are not reflected in their subsequent actions. For other tools or management ideas, organizations put economic and human resources to work. Why does this not occur with KM? This paper tries to explain some of these reasons and to deal with these situations through a survey conducted in 2011 for an IAPG (Argentinean Institute of Oil & Gas) congress.

Keywords: knowledge management into organizations, new perspectives, failure in implementation, claim

Procedia PDF Downloads 397
8004 Clusterization Probability in 14N Nuclei

Authors: N. Burtebayev, Sh. Hamada, Zh. Kerimkulov, D. K. Alimov, A. V. Yushkov, N. Amangeldi, A. N. Bakhtibaev

Abstract:

The main aim of the current work is to examine whether 14N is a candidate for a clusterized nucleus. To check this, we measured the angular distributions for a 14N ion beam elastically scattered on a 12C target at low energies of 17.5, 21, and 24.5 MeV, which are close to the Coulomb barrier energy for the 14N+12C nuclear system. The study of various transfer reactions could provide useful information about the tendency of nuclei to exist in a composite form (core + valence). The experimental data were analyzed using two approaches: phenomenological (optical potential) and semi-microscopic (double folding potential). The agreement between the experimental data and the theoretical predictions is fairly good over the whole angular range.

Keywords: deuteron transfer, elastic scattering, optical model, double folding, density distribution

Procedia PDF Downloads 309
8003 Object-Scene: Deep Convolutional Representation for Scene Classification

Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang

Abstract:

Traditional image classification is based on encoding schemes (e.g., Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g., SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNNs) carry richer information but lack geometric invariance. In scene classification, scenes contain scattered objects of different sizes, categories, layouts, numbers, and so on. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in different scale ranges. A scale-wise CNN adaptation is reasonable, since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, and these are then merged into a single vector by a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of first- and second-order differences. Hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted at each scale. Third, the Fisher Vector representation based on the deep convolutional features is fed to a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories.
Experimental results show that the scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which suggests that the representation can be applied to other visual recognition tasks.
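The scale-wise normalization step the abstract describes (normalize each scale's Fisher vector, then average-pool across scales so that a scale producing many local features does not dominate) can be sketched as follows. The choice of L2 normalization here is an assumption for illustration; the abstract does not state which norm is used.

```python
import numpy as np

def merge_scales(fisher_vectors):
    """Scale-wise normalisation then average pooling.

    fisher_vectors : list of (d,) arrays, one Fisher vector per scale.
    Each scale is L2-normalised first, so scales with different feature
    counts contribute comparably, then the scales are average-pooled
    into a single d-dimensional representation.
    """
    normed = []
    for fv in fisher_vectors:
        norm = np.linalg.norm(fv)
        normed.append(fv / norm if norm > 0 else fv)
    return np.mean(normed, axis=0)
```

The merged vector would then be the input to the linear SVM classifier described in the abstract.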

Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization

Procedia PDF Downloads 309
8002 Identification and Selection of a Supply Chain Target Process for Re-Design

Authors: Jaime A. Palma-Mendoza

Abstract:

A supply chain consists of different processes, and when conducting supply chain re-design it is necessary to identify the relevant processes and select a target for re-design. A solution was developed that consists of first identifying the relevant processes using the Supply Chain Operations Reference (SCOR) model and then using the Analytic Hierarchy Process (AHP) for target process selection. An application was conducted in an airline MRO supply chain re-design project, which shows that this combination can clearly aid the identification of relevant supply chain processes and the selection of a target process for re-design.
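The AHP step for target process selection can be sketched minimally: given a pairwise-comparison matrix over candidate SCOR processes, the priority weights are the normalised principal eigenvector. The example matrix below is hypothetical, not taken from the paper.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix:
    the principal right eigenvector, normalised to sum to 1."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    principal = vecs[:, np.argmax(vals.real)].real
    weights = np.abs(principal)
    return weights / weights.sum()

# Hypothetical example: two candidate SCOR processes, the first judged
# 3 times as important a re-design target as the second.
weights = ahp_weights([[1.0, 3.0], [1.0 / 3.0, 1.0]])
```

For a perfectly consistent reciprocal matrix like this one, the weights come out proportional to the stated ratios (here 0.75 vs. 0.25); in practice a consistency-ratio check would accompany the eigenvector computation.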

Keywords: decision support systems, multiple criteria analysis, supply chain management

Procedia PDF Downloads 471
8001 Profiling Risky Code Using Machine Learning

Authors: Zunaira Zaman, David Bohannon

Abstract:

This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as increased confidence scores, tuning of false positives and negatives, and automated feedback. The initial approach, using natural language processing techniques to extract features, achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and to generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine-learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite predicted specific vulnerabilities such as OS-Command Injection, Cryptographic, and Cross-Site Scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS-Command Injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques and CNN models with ensemble modelling techniques did not generalize well on unseen data and faced overfitting issues.
However, predicting vulnerabilities in source code using machine learning poses challenges such as high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.
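The paper extracts path-contexts from Java and C++ ASTs for Code2Vec; as a language-neutral illustration of the idea, the sketch below enumerates (token, path, token) triples from a Python AST using the standard `ast` module. Unlike Code2Vec proper, which joins the two leaf tokens at their lowest common ancestor, this simplified version joins them through the root.

```python
import ast
import itertools

def path_contexts(source):
    """Enumerate (token, path, token) triples from a Python AST.

    A path-context joins two leaf tokens through their AST node-type
    paths; these triples are the inputs a Code2Vec-style model embeds.
    """
    leaves = []

    def walk(node, trail):
        trail = trail + [type(node).__name__]
        if isinstance(node, ast.Name):            # identifier leaf
            leaves.append((node.id, trail))
        elif isinstance(node, ast.Constant):      # literal leaf
            leaves.append((repr(node.value), trail))
        for child in ast.iter_child_nodes(node):
            walk(child, trail)

    walk(ast.parse(source), [])
    return [(a, '^'.join(pa) + '_' + '^'.join(reversed(pb)), b)
            for (a, pa), (b, pb) in itertools.combinations(leaves, 2)]
```

For example, `path_contexts("x = y + 1")` yields three triples, one per pair of the leaves `x`, `y`, and `1`, each carrying the node-type path between them.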

Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties

Procedia PDF Downloads 83
8000 Effects of a Brisk-Walking Program on Anxiety, Depression and Self-Concept in Adolescents: A Time-Series Design

Authors: Ming Yi Hsu, Hui Jung Chao

Abstract:

The anxiety and depression that adolescents in Taiwan experience can lead to suicide attempts and result in unfortunate deaths. An effective method for relieving anxiety and depression is brisk walking, a moderate- to low-intensity aerobic exercise that uses large muscle groups rhythmically. The research purpose was to investigate the effects of a 12-week, school-based, brisk-walking program in decreasing anxiety and depression and in improving self-concept among high school students living in central Taiwan. A quasi-experiment using the time series design (T1 T2 X T3 T4) was conducted. The Beck Youth Inventories 2 (BYI-II) Chinese version was given four times: T1 was in the 4th week prior to intervention; T2 was in the intervention week; T3 was in the 6th week after the start of the intervention period; and T4 was in the 12th week post intervention. The baseline phase of the time series constituted T1 and T2. The intervention phase constituted T2, T3, and T4. The amounts of brisk walking were recorded by self-report. The Generalized Estimating Equation (GEE) was used to examine the effects of brisk walking on anxiety, depression, and self-concept. The independent t-test was used to compare mean scores on the three dependent variables between participants who brisk-walked more than versus less than 90 minutes per week. Findings revealed that levels of anxiety and self-concept showed nonsignificant change during the baseline phase, while the level of depression increased significantly. In contrast, the study demonstrated significant decreases in anxiety and depression as well as increases in positive self-concept (p=.001, p<.001, p=.017) during the intervention phase. Furthermore, a subgroup analysis was completed on participants who demonstrated elevated anxiety (23.4%) and depression (29.7%), and below-average self-concept (18.6%) at baseline (T2).
The subgroup of anxious, depressed, or low self-concept participants who received the brisk-walking intervention demonstrated significant decreases in anxiety and depression, and significant increases in self-concept scores. Participants who engaged in brisk walking over 90 minutes per week reported decreased mean scores on anxiety (t=-2.395, p=.035) and depression (t=-2.142, p=.036) in contrast with those who engaged in brisk walking less than 90 minutes per week. Regarding participants whose anxiety scores were within the normal range at baseline, a significant decrease in the level of anxiety was demonstrated when they increased their time spent brisk walking before each term examination. Overall, the brisk-walking program was effective and feasible for promoting adolescents’ mental health by decreasing anxiety and depression as well as elevating self-concept. It also helped protect adolescents from anxiety before term examinations.
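The between-group comparison reported above (brisk walking over vs. under 90 minutes per week) is an independent-samples t-test. The sketch below shows the Welch (unequal-variance) variant with hypothetical data; the abstract does not state whether the study assumed equal variances.

```python
import math

def welch_t(a, b):
    """Welch's independent-samples t statistic and degrees of freedom
    (no equal-variance assumption)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # per-group variance of the mean: s^2 / n
    va = sum((x - ma) ** 2 for x in a) / (na - 1) / na
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1) / nb
    t = (ma - mb) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (na - 1) + vb ** 2 / (nb - 1))
    return t, df

# Hypothetical anxiety scores: walkers (>90 min/week) vs. non-walkers.
t, df = welch_t([30.0, 28.0, 27.0, 25.0], [33.0, 35.0, 34.0, 36.0])
```

A negative t here, as in the reported results, indicates a lower mean score in the first (higher-walking) group; the p-value would follow from the t distribution with `df` degrees of freedom.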

Keywords: adolescents, anxiety, depression, self-concept

Procedia PDF Downloads 175
7999 Realization of a (GIS) for Drilling (DWS) through the Adrar Region

Authors: Djelloul Benatiallah, Ali Benatiallah, Abdelkader Harouz

Abstract:

Geographic Information Systems (GIS) comprise various methods and computer techniques to model, digitally capture, store, manage, view, and analyze geographic data. Geographic information systems are characterized by their appeal to many scientific and technical fields and many methods. In this article we present a complete and operational geographic information system, following the theoretical principles of data management and adapted to spatial data, especially data concerning the monitoring of drinking water supply (DWS) wells in the Adrar region. The expected results of this system are, on the one hand, standard features for consulting, updating, and editing beneficiaries and geographic data and, on the other hand, specific functionality for contractor data entry, parameterized calculations, and statistics.

Keywords: GIS, DWS, drilling, Adrar

Procedia PDF Downloads 287
7998 An Experimental Investigation of Chemical Enhanced Oil Recovery (CEOR) for Fractured Carbonate Reservoirs, Case Study: Kais Formation on Wakamuk Field

Authors: Jackson Andreas Theo Pola, Leksono Mucharam, Hari Oetomo, Budi Susanto, Wisnu Nugraha

Abstract:

About half of the world’s oil reserves are located in carbonate reservoirs, where 65% of the total carbonate reservoirs are oil wet and 12% intermediate wet [1]. Oil recovery in oil-wet or mixed-wet carbonate reservoirs can be increased by dissolving surfactant into the injected water to change the rock wettability from oil wet to more water wet. The Wakamuk Field, operated by PetroChina International (Bermuda) Ltd. and PT Pertamina EP in Papua, produces from its main reservoir, the Miocene Kais Limestone. First production commenced in August 2004, and the peak field production of 1456 BOPD occurred in August 2010. The field was found to be a complex reservoir system, and until 2014 cumulative oil production was 2.07 MMBO, less than 9% of OOIP. This performance is indicative of the presence of secondary porosity, other than matrix porosity, which has a low average porosity of 13% and permeability less than 7 mD. Implementing chemical EOR in this case is the best way to increase oil production. However, the selected chemical must be able to lower the interfacial tension (IFT), reduce oil viscosity, and alter the wettability; thus a special chemical treatment named SeMAR has been proposed. Numerous laboratory tests such as phase behavior tests, core compatibility tests, mixture viscosity, contact angle measurement, IFT, imbibition tests, and core flooding were conducted on Wakamuk field samples. Based on the spontaneous imbibition results for the Wakamuk field core, the SeMAR formulation with composition S12A gave 43.94% oil recovery at 1 wt% concentration and a maximum of 87.3% oil recovery at 3 wt% concentration. In addition, the first core flooding scenario gave 60.32% oil recovery at 1 wt% concentration of S12A, and the second scenario gave 96.78% oil recovery at 3 wt% concentration.
The soaking time of the chemicals has a significant effect on recovery, and higher chemical concentrations alter wettability over larger areas, yielding higher oil recovery. The chemical that gives the best overall results from the laboratory tests will also be a consideration for a huff-and-puff injection trial (pilot project) for increasing oil recovery from the Wakamuk Field.

Keywords: Wakamuk field, chemical treatment, oil recovery, viscosity

Procedia PDF Downloads 671
7997 China’s Hotel m-Bookers’ Perceptions of their Booking Experiences

Authors: Weiqi Xia

Abstract:

We assess the perceptions of China’s hotel m-bookers using the E-SERVQUAL model and technology affordance assessment metrics. The data analysis provides insight into Chinese hotel m-bookers’ perceptions of information quality items, system quality items, and functional quality items. Respondents’ perceived value of such items is greatly enhanced via mini-program support and self-service innovation, which are predicted to be of increasing importance in the future. The findings of this study help close the gap between hotel operators’ understanding and customers’ perceptions. Our findings may also provide valuable insights into the functioning of China’s hotel industry.

Keywords: mobile hotel booking, hotel m-bookers, user perception, China’s WeChat mini program, hotel booking apps

Procedia PDF Downloads 16
7996 Non-Cytotoxic Natural Sourced Inorganic Hydroxyapatite (HAp) Scaffold Facilitate Bone-like Mechanical Support and Cell Proliferation

Authors: Sudip Mondal, Biswanath Mondal, Sudit S. Mukhopadhyay, Apurba Dey

Abstract:

Bioactive materials improve devices for a long lifespan but have mechanical limitations. Mechanical characterization is one of the most important evaluations of the life span and functionality of a scaffold material. After implantation of a scaffold material, primary-stage rejection of the scaffold occurs due to the non-biocompatible response of the host body system. The second major problem occurs due to mechanical failure. The mechanical and biocompatibility failures of scaffold materials can be overcome by prior evaluation of the scaffold materials. In this study, chemically treated Labeo rohita scale is used for synthesizing hydroxyapatite (HAp) biomaterial. Thermo-gravimetric and differential thermal analysis (TG-DTA) is carried out to ensure thermal stability. The chemical composition and bond structures of the wet ball-milled, calcined HAp powder are characterized by Fourier Transform Infrared spectroscopy (FTIR), X-ray Diffraction (XRD), Field Emission Scanning Electron Microscopy (FE-SEM), Transmission Electron Microscopy (TEM), and Energy Dispersive X-ray (EDX) analysis. The fish scale-derived apatite material consists of nano-sized particles with a Ca/P ratio of 1.71. Biocompatibility is assessed through cytotoxicity evaluation and MTT assay in MG63 osteoblast cell lines. In the cell attachment study, the cells attached tightly to the HAp scaffolds developed in the laboratory. The results clearly suggest that the HAp material synthesized in this study does not have any cytotoxic effect and has a natural binding affinity for mammalian cell lines. The synthesized HAp powder was further successfully used to develop a porous scaffold material with suitable mechanical properties of ~0.8 GPa compressive strength, ~1.10 GPa hardness, and ~30-35% porosity, which is acceptable for implantation in a trauma region in an animal model.
The histological analysis also supports the bio-affinity of the processed HAp biomaterials in a Wistar rat model for investigating the contact reaction and stability at the artificial or natural prosthesis interface for biomedical function. This study suggests that the naturally sourced, fish scale-derived HAp material could be used as a suitable alternative biomaterial for tissue engineering applications in the near future.

Keywords: biomaterials, hydroxyapatite, scaffold, mechanical property, tissue engineering

Procedia PDF Downloads 438
7995 Valorizing Traditional Greek Wheat Varieties: Use of DNA Barcoding for Species Identification and Biochemical Analysis of Their Nutritional Value

Authors: Niki Mougiou, Spyros Didos, Ioanna Bouzouka, Athina Theodorakopoulou, Michael Kornaros, Anagnostis Argiriou

Abstract:

Grains from traditional old Greek cereal varieties were evaluated and compared to commercial cultivars, like Simeto and Mexicali 81, in an effort to valorize local products and assess the nutritional benefits of ancient grains. The samples studied in this research included common wheat, durum wheat, emmer (Triticum dicoccum) and einkorn (Triticum monococcum), as well as barley, oats and rye grains. The Internal Transcribed Spacer 2 (ITS2) nuclear region was amplified and sequenced as a barcode for species identification, allowing the verification of the label of each product. After that, the total content of bound and free polyphenols and flavonoids, as well as the antioxidant activity of bound and free compounds, was measured by classic colorimetric assays using Folin-Ciocalteu, AlCl₃ and DPPH‧ (2,2-diphenyl-1-picrylhydrazyl) reagents, respectively. Moreover, the level of variation of fatty acids was determined in all samples by gas chromatography. The results showed that local old landraces of emmer and einkorn had the highest polyphenol content, 2.4 and 3.3 times higher than the average value of 5 durum wheat samples, respectively. Regarding the total flavonoid content, einkorn had 2.6-fold and emmer 2-fold higher values than common wheat. The antioxidant activity of free or bound compounds was at the same level, at about 20-30% higher in both einkorn and emmer compared to common wheat. Five main fatty acids were detected in all samples, in order of decreasing amounts: linoleic (C18:2) > palmitic (C16:0) ≈ oleic (C18:1) > eicosenoic (C20:1, cis-11) > stearic (C18:0). Emmer and einkorn showed a higher diversity of fatty acids and a higher content of mono-unsaturated fatty acids compared to common wheat.
The results of this study demonstrate the high nutritional value of old local landraces that have been put aside by more productive, yet with lower qualitative characteristics, commercial cultivars, underlining the importance of maintaining sustainable agricultural practices to ensure their continued cultivation.

Keywords: biochemical analysis, nutritional value, plant barcoding, wheat

Procedia PDF Downloads 68
7994 Barriers and Facilitators to Physical Activity Among Older Adults Living in Long‐Term Care Facilities: A Systematic Review with Qualitative Evidence Synthesis

Authors: Ying Shi, June Zhang, Lu Shao, Xiyan Xie, Aidi Lao, Zhangan Wang

Abstract:

Background: Low levels of physical activity are associated with poorer health outcomes, and this situation is more critical in older adults living in long‐term care facilities. Objectives: To systematically identify, appraise, and synthesize current qualitative research evidence regarding the barriers and facilitators to physical activity as reported by older adults and care staff in long‐term care facilities. Design: This is a systematic review with qualitative evidence synthesis adhering to PRISMA guidelines. Methods: We conducted a systematic search of the PubMed, Science Citation Index Expanded, Social Sciences Citation Index, EMBASE, CINAHL, and PsycINFO databases from inception until 30 June 2023. Thematic synthesis was undertaken to identify the barriers and facilitators relating to physical activity. We then mapped them onto the Capability, Opportunity, Motivation, and Behavior (COM-B) model and the Theoretical Domains Framework. Methodological quality was assessed using the CASP Qualitative Studies Checklist, and confidence in review findings was assessed using the GRADE-CERQual approach. Results: We included 32 studies after screening 10,496 citations and 177 full texts. Seven themes and 17 subthemes were identified relating to barriers and facilitators influencing physical activity in elderly residents. The main themes were mapped onto the COM-B model: Capability (physical activity knowledge gaps and individual health issues), Opportunity (social support and macro-level resources), and Motivation (health beliefs, fear of falling or injury, and personal and social incentives for physical activity). Most subthemes were graded as high (n = 9) or moderate (n = 3) confidence. Conclusions and Implications: Our comprehensive synthesis of 32 studies provides a wealth of knowledge of barriers and facilitators to physical activity from both residents’ and care staff’s perspectives. Intervention components were also suggested within the context of long‐term care facilities.
End users such as older residents, care staff, and researchers can have confidence in our findings when formulating policies and guidance on promoting physical activity among elderly residents in long‐term care facilities.

Keywords: long‐term care, older adults, physical activity, qualitative, systematic review

Procedia PDF Downloads 60
7993 The Relation Between Protein-Protein and Polysaccharide-Protein Interaction on Aroma Release from Brined Cheese Model

Authors: Mehrnaz Aminifar

Abstract:

The relation between textural parameters and the casein network on the release of aromatic compounds was investigated over 90 days of ripening. Low-DE maltodextrin and WPI were used to modify the textural properties of low-fat brined cheese. Hardness, brittleness, and compaction of the casein network were affected by the addition of maltodextrin and WPI. Textural properties and aroma release from the cheese texture were affected by the WPI protein-cheese protein and maltodextrin-cheese protein interactions.

Keywords: aroma release, brined cheese, maltodextrin, WPI

Procedia PDF Downloads 331
7992 Proposal for an Inspection Tool for Damaged Structures after Disasters

Authors: Karim Akkouche, Amine Nekmouche, Leyla Bouzid

Abstract:

This study focuses on the development of a multifunctional expert system (ES) called the Post-Seismic Damage Inspection Tool (PSDIT), a powerful tool that allows the evaluation, processing, and archiving of the data stock collected after earthquakes. PSDIT can be operated by two user types: an ordinary user (engineer, expert, or architect) for visual damage inspection, and an administrative user for updating the knowledge base and/or adding or removing ordinary users. Knowledge acquisition is driven by a hierarchical knowledge model; information from investigation reports and information acquired through feedback from expert/engineer questionnaires both form part of it.

Keywords: disaster, damaged structures, damage assessment, expert system

Procedia PDF Downloads 60
7991 Student Absenteeism as a Challenge for Inclusion: A Comparative Study of Primary Schools in an Urban City in India

Authors: Deepa Idnani

Abstract:

Attendance is an important factor in school success among children. Studies show that better attendance is related to higher academic achievement for students of all backgrounds, but particularly for children of lower socio-economic status. Beginning in the early years, students who attend school regularly score higher on tests than their peers who are frequently absent. The present study, conducted in different types of schools in Delhi, tries to highlight the impact of student absenteeism and the challenges it poses for students. The study relies on Lewin’s ‘Model of Exclusion’ and focuses on the analysis of children with special needs and the inclusion and exclusion of students in school.

Keywords: student absenteeism, pedagogy, learning, right to education act, exclusion

Procedia PDF Downloads 279
7990 Virtual Player for Learning by Observation to Assist Karate Training

Authors: Kazumoto Tanaka

Abstract:

It is well known that sport skill learning is facilitated by video observation of players’ actions. The optimal viewpoint for observing actions depends on the sport scene. On the other hand, it is generally impossible to change the viewpoint during observation, because most videos are filmed from fixed points. This study tackles this problem, focusing on karate matches as a first step. We developed a method for observing a karate player’s actions from any point of view by using a 3D-CG model (i.e., a virtual player) obtained from video images, and verified the effectiveness of the method on karate matches.

Keywords: computer graphics, karate training, learning by observation, motion capture, virtual player

Procedia PDF Downloads 255
7989 Eating Behaviours in Islam and Mental Health: A Preventative Approach

Authors: Muhammad Rafiq, Lamae Zulfiqar, Nazish Idrees Chaudhary

Abstract:

A growing body of research focuses on healthy and unhealthy eating behaviors and their impact on health. This study was intended to examine the Islamic point of view on eating behavior, its impact on mental health, and preventative strategies in the light of the Quran and Sunnah. Different articles and Islamic sayings related to eating behaviors and mental health were reviewed in detail. It was found, both scientifically and from the Islamic point of view, that an appropriate quantity, quality, and timing of food has positive effects on mental health. Therefore, a 3Rs model of eating behaviors has been proposed.

Keywords: food intake, mental health, quality of food, quantity of food

Procedia PDF Downloads 215
7988 A Design for Application of Mobile Agent Technology to MicroService Architecture

Authors: Masayuki Higashino, Toshiya Kawato, Takao Kawamura

Abstract:

A monolithic service is based on the N-tier architecture in many cases. In order to divide a monolithic service into microservices, it is necessary to redefine a model as a new microservice by extracting and merging existing models across layers. Refactoring a monolithic service into microservices requires advanced technical capabilities and is difficult. This paper proposes a design and concept to ease the migration of a monolithic service to microservices using mobile agent technology. Our proposed mobile agent-based design and concept makes it easier to divide and merge services.

Keywords: mobile agent, microservice, web service, distributed system

Procedia PDF Downloads 143
7987 The Modelling of Real Time Series Data

Authors: Valeria Bondarenko

Abstract:

We propose algorithms for the estimation of the parameters of fractional Brownian motion (fBm), namely volatility and the Hurst exponent, and for the approximation of random time series by functionals of fBm. We prove the consistency of the estimators that constitute these algorithms and prove the optimality of the forecast of the approximated time series. The adequacy of the estimation, approximation, and forecasting algorithms is demonstrated by numerical experiment. During the software development process, a system with a hierarchical structure was created. A comparative analysis of the proposed algorithms with other methods gives evidence of the advantage of the approximation method. The results can be used to develop methods for the analysis and modeling of time series describing economic, physical, biological, and other processes.
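One standard way to estimate the Hurst exponent mentioned in the abstract is to exploit the fBm scaling law Var(x[t+k] − x[t]) ∝ k^(2H). The sketch below is that generic estimator, not the paper's own (whose details the abstract does not give): it regresses the log-variance of increments on the log-lag.

```python
import numpy as np

def hurst_from_increments(x, lags=(1, 2, 4, 8, 16)):
    """Estimate the Hurst exponent H of a sampled path by regressing
    log Var(x[t+k] - x[t]) on log k; for fBm this slope equals 2H."""
    log_k = [np.log(k) for k in lags]
    log_v = [np.log((x[k:] - x[:-k]).var()) for k in lags]
    slope = np.polyfit(log_k, log_v, 1)[0]
    return slope / 2.0
```

As a sanity check, applying this to ordinary Brownian motion (a cumulative sum of i.i.d. Gaussian steps) recovers an estimate near H = 0.5.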

Keywords: mathematical model, random process, Wiener process, fractional Brownian motion

Procedia PDF Downloads 333
7986 Assessment of ATC with Shunt FACTS Devices

Authors: Ashwani Kumar, Jitender Kumar

Abstract:

In this paper, an optimal power flow based approach has been applied in a multi-transaction deregulated environment for ATC determination with SVC and STATCOM. The main contributions of the paper are (i) an OPF-based approach for the evaluation of ATC with multiple transactions, (ii) ATC enhancement with FACTS devices, viz. SVC and STATCOM, for intact and line contingency cases, and (iii) the impact of ZIP load on ATC determination and a comparison of the ATC obtained with SVC and STATCOM. The results have been determined for intact and line contingency cases, taking simultaneous as well as single-transaction cases, for the IEEE 24-bus RTS.
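The paper determines ATC by solving a full OPF; as a much simpler illustration of the underlying idea, the linearized DC/PTDF screening formula below caps a transaction at the point where the first line reaches its thermal limit. All numbers in the usage are hypothetical.

```python
def atc_from_ptdf(ptdf, base_flows, limits):
    """Available transfer capability of one transaction (DC sketch).

    ptdf       : per-line sensitivities of flow to the transaction (MW/MW)
    base_flows : base-case line flows (MW), sign gives direction
    limits     : thermal limits (MW)

    ATC is the largest extra transfer before some line hits its limit.
    """
    atc = float('inf')
    for s, f, lim in zip(ptdf, base_flows, limits):
        if s > 1e-9:                       # flow grows with the transfer
            atc = min(atc, (lim - f) / s)
        elif s < -1e-9:                    # flow pushed toward -limit
            atc = min(atc, (-lim - f) / s)
    return atc

# Hypothetical two-line system: line 1 binds first.
atc = atc_from_ptdf([0.5, -0.2], [40.0, -10.0], [100.0, 50.0])
```

An OPF-based ATC, as used in the paper, replaces this linear screening with a full nonlinear optimization and can also model the SVC/STATCOM controls and ZIP loads that the screening formula ignores.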

Keywords: available transfer capability, FACTS devices, line contingency, multi-transactions, ZIP load model

Procedia PDF Downloads 567
7985 Application of De Novo Programming Approach for Optimizing the Business Process

Authors: Z. Babic, I. Veza, A. Balic, M. Crnjac

Abstract:

The linear programming model is sometimes difficult to apply in real business situations due to its assumption of proportionality. This paper shows an example of how to use the De Novo programming approach instead of linear programming. In De Novo programming, resources are not fixed as in linear programming; resource quantities depend only on the available budget. The budget is a new, important element of the De Novo approach. Two different production situations are presented: increasing costs and quantity discounts on raw materials. The focus of this paper is on the advantages of the De Novo approach in optimizing the production plan of a company that produces souvenirs made from the famous stone of the island of Brac, one of the largest islands in Croatia.
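The De Novo reduction described above can be sketched in a few lines: with resource quantities free, the m resource constraints collapse into a single budget constraint, and the resulting one-constraint LP is solved by a profit-to-cost ratio comparison (all numbers below are hypothetical, not the souvenir company's data):

```python
import numpy as np

# Hypothetical data: two products, three raw materials.
c = np.array([30.0, 50.0])          # unit profit of each product
A = np.array([[4.0, 8.0],           # material usage per unit of product
              [2.0, 6.0],
              [6.0, 4.0]])
p = np.array([1.0, 2.0, 1.5])       # unit price of each material
budget = 4800.0

# De Novo reduction: resource amounts are decision-dependent, so the m
# resource constraints collapse into one budget constraint w @ x <= budget,
# where w_j is the money needed to produce one unit of product j.
w = p @ A

# A single-constraint LP is maximized by spending the whole budget on the
# product with the best profit-to-cost ratio.
best = int(np.argmax(c / w))
x = np.zeros_like(c)
x[best] = budget / w[best]
profit = float(c @ x)
```

The paper's situations with increasing costs or quantity discounts make w depend on x and require a piecewise or general solver, but the budget-collapse step is the same.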

Keywords: business process, De Novo programming, optimizing, production

Procedia PDF Downloads 197
7984 Green Ports: Innovation Adopters or Innovation Developers

Authors: Marco Ferretti, Marcello Risitano, Maria Cristina Pietronudo, Lina Ozturk

Abstract:

A green port is the result of a sustainable long-term strategy adopted by an entire port infrastructure, and therefore by the set of actors involved in port activities. The strategy aims to develop sustainable port infrastructure focused on reducing negative environmental impacts without jeopardising economic growth. Green technology is the core tool for implementing sustainable solutions; however, it is not a magic bullet. Ports have always been integrated into the local territory, affecting the environment in which they operate; the sustainable strategy should therefore fit the local system as a whole. Adopting a sustainable strategy thus means knowing how to involve and engage a wide stakeholder network (industries, production, markets, citizens, and public authorities). Existing research on the topic has not integrated this perspective well with that of sustainability: research on green ports has mixed sustainability aspects with those of the maritime industry, neglecting the dynamics that lead to the development of the green port phenomenon. We propose an analysis of green ports through the lens of ecosystem studies in the field of management. The ecosystem approach provides a way to model the relations that enable green solutions and green practices in a port ecosystem. Given the local dimension of a port and the port trend toward innovation, i.e., sustainable innovation, we draw on a specific concept of ecosystem, that of local innovation systems. More precisely, we explore whether a green port is a local innovation system engaged in developing sustainable innovation with a large impact on the territory, or merely an innovation adopter. To address this issue, we adopt a comparative case study, selecting two innovative ports in Europe: Rotterdam and Genoa. The case study is a research method focused on understanding the dynamics of a specific situation and can be used to provide a description of real circumstances.
Preliminary results show two different approaches to supporting sustainable innovation: one represented by Rotterdam, a pioneer in competitiveness and sustainability, and the other by Genoa, an example of a technology adopter. The paper intends to provide a better understanding of how sustainable innovations are developed and of the manner in which a network of port and local stakeholders supports this process. Furthermore, it proposes a taxonomy of green ports as developers and adopters of sustainable innovation, and suggests best practices for modelling the relationships that enable the port ecosystem to apply a sustainable strategy.

Keywords: green port, innovation, sustainability, local innovation systems

Procedia PDF Downloads 97
7983 Tick Induced Facial Nerve Paresis: A Narrative Review

Authors: Jemma Porrett

Abstract:

Background: We present a literature review examining the research surrounding tick paralysis resulting in facial nerve palsy. A case of an intra-aural paralysis tick bite resulting in unilateral facial nerve palsy is also discussed. Methods: A novel case of otoacariasis with associated ipsilateral facial nerve involvement is presented. Additionally, we conducted a review of the literature, searching the MEDLINE and EMBASE databases for relevant literature published between 1915 and 2020. Utilising the keywords 'Ixodes', 'Facial paralysis', 'Tick bite', and 'Australia', 18 articles were deemed relevant to this study. Results: The eighteen articles included in the review comprised a total of 48 patients, whose ages ranged from one year to 84 years. Ten studies estimated the interval between tick bite and facial nerve palsy, which averaged 8.9 days. Forty-one patients presented with a single tick within the external auditory canal, three had a single tick located on the temple or forehead region, three had post-auricular ticks, and one patient had a remarkable 44 ticks removed from the face, scalp, neck, back, and limbs. A complete ipsilateral facial nerve palsy was present in 45 patients; notably, in 16 patients this occurred following tick removal. The House-Brackmann classification was utilised in 7 patients: four patients with grade 4, one with grade 3, and two with grade 2 facial nerve palsy. Thirty-eight patients had complete recovery of facial palsy. Thirteen studies were analysed for time to recovery, with an average time of 19 days. Six patients had partial recovery at the time of follow-up. One article reported improvement in facial nerve palsy at 24 hours but no further follow-up. One patient was lost to follow-up, and one article failed to mention any resolution of facial nerve palsy. One patient died from respiratory arrest following generalised paralysis.
Conclusions: Tick paralysis is a severe but preventable disease. Careful examination of the face, scalp, and external auditory canal should be conducted in patients presenting with otalgia and facial nerve palsy, particularly in tropical areas, to exclude the possibility of tick infestation.

Keywords: facial nerve palsy, tick bite, intra-aural, Australia

Procedia PDF Downloads 83
7982 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction

Authors: Radul Shishkov, Orlin Davchev

Abstract:

The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in the time and resources required for BIM model generation while maintaining high levels of accuracy and detail.
This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
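As an illustration of one assumed building block of such pipelines, wall-like planar elements can be detected in a point cloud with a RANSAC plane fit. This is a generic sketch on synthetic data, not the authors' algorithm:

```python
import numpy as np

def ransac_plane(points, n_iter=200, tol=0.02, rng=None):
    """Find the plane (unit normal n, offset d, with n . p = d) supported
    by the most points within distance `tol` of it."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_inliers, best_model = None, None
    for _ in range(n_iter):
        # Fit a candidate plane through three random points.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:                 # degenerate (collinear) sample
            continue
        n = n / norm
        d = n @ p0
        inliers = np.abs(points @ n - d) < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model, best_inliers

# Synthetic "wall": points near the plane x = 1, plus uniform outliers.
rng = np.random.default_rng(1)
wall = np.column_stack([1.0 + rng.normal(0, 0.005, 500),
                        rng.uniform(0, 5, 500),
                        rng.uniform(0, 3, 500)])
noise = rng.uniform(-2, 4, (100, 3))
(normal, d), inliers = ransac_plane(np.vstack([wall, noise]))
```

Production pipelines iterate this (removing inliers and re-running to extract further planes), then classify planes as walls, floors, or ceilings by orientation before emitting BIM elements.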

Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction

Procedia PDF Downloads 35
7981 Application of Granular Computing Paradigm in Knowledge Induction

Authors: Iftikhar U. Sikder

Abstract:

This paper illustrates an application of a granular computing approach, namely rough set theory, to data mining. The paper outlines the formalism of granular computing and elucidates the mathematical underpinnings of rough set theory, which has been widely used by the data mining and machine learning communities. A real-world application is illustrated, and the classification performance is compared with that of other contending machine learning algorithms. The predictive performance of the rough set rule induction model is competitive with respect to the other contending algorithms.
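The rough-set machinery the paper relies on can be sketched directly: granules are equivalence classes of the indiscernibility relation over the condition attributes, and a concept is approximated from below (certain membership) and above (possible membership). A toy example with hypothetical data:

```python
from collections import defaultdict

# Each row: (condition attribute values, decision class). Rows 2 and 3 are
# indiscernible but have different decisions, creating a boundary region.
table = [
    (('sunny', 'hot'),  'no'),
    (('sunny', 'mild'), 'yes'),
    (('rainy', 'mild'), 'yes'),
    (('rainy', 'mild'), 'no'),
    (('rainy', 'hot'),  'yes'),
]

# Granules: equivalence classes of the indiscernibility relation.
granules = defaultdict(set)
for idx, (attrs, _) in enumerate(table):
    granules[attrs].add(idx)

# The concept to approximate: rows with decision 'yes'.
target = {idx for idx, (_, dec) in enumerate(table) if dec == 'yes'}

# Lower approximation: granules entirely inside the concept (certain rules).
lower = set().union(*(g for g in granules.values() if g <= target))
# Upper approximation: granules intersecting the concept (possible rules).
upper = set().union(*(g for g in granules.values() if g & target))
boundary = upper - lower
```

Rule induction then reads certain rules off the lower approximation and possible rules off the boundary; reduct computation (finding minimal attribute subsets preserving the granulation) is omitted here.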

Keywords: concept approximation, granular computing, reducts, rough set theory, rule induction

Procedia PDF Downloads 509
7980 A Professional Learning Model for Schools Based on School-University Research Partnering That Is Underpinned and Structured by a Micro-Credentialing Regime

Authors: David Lynch, Jake Madden

Abstract:

There exists a body of literature reporting on the many benefits of partnerships between universities and schools, especially in terms of teaching improvement and school reform. Such partnerships can build significant teaching capital by deepening and expanding the skillsets and mindsets needed to create the connections that support ongoing, embedded teacher professional development and career goals. At the same time, this literature is critical of such initiatives when the partnership outcomes are short-term or one-sided, misaligned with fundamental problems, and not expressly focused on building the desired teaching capabilities. In response to this situation, research conducted by Professor David Lynch and his TeachLab research team has begun to shed light on the strengths and limitations of school/university partnerships via the identification of key conceptual elements that appear to act as critical partnership success factors. These elements are theorised as an interplay between professional knowledge acquisition, readiness, talent management, and organisational structure. However, knowledge of how these elements are established, and how they manifest within the school and its teaching workforce as an overall system, remains incomplete. Research designed to more clearly delineate these elements in relation to their impact on school/university partnerships is thus required. It is within this context that this paper reports on the development and testing of a Professional Learning (PL) model for schools and their teachers that incorporates school-university research partnering within a systematic, whole-of-school PL strategy underpinned and structured by a micro-credentialing (MC) regime.
MC involves earning a narrowly focused certificate (a micro-credential) in a specific topic area (e.g., 'How to Differentiate Instruction for English as a Second Language Students'), embedded in the teacher's day-to-day teaching work. The use of MC is viewed as important to the efficacy and sustainability of teacher PL because it (1) provides an evidence-based framework for teacher learning, (2) promotes teacher social capital, and (3) engenders lifelong learning by keeping professional skills current in a manner embedded in, and seamless with, daily work. The associated research is centred on a primary school (P-6) in Australia that acted as an arena to co-develop, test, and report on outcomes for teacher PL that uses MC to support a whole-of-school partnership with a university.

Keywords: teaching improvement, teacher professional learning, talent management, education partnerships, school-university research

Procedia PDF Downloads 64
7979 Electroforming of 3D Digital Light Processing Printed Sculptures Used as a Low Cost Option for Microcasting

Authors: Cecile Meier, Drago Diaz Aleman, Itahisa Perez Conesa, Jose Luis Saorin Perez, Jorge De La Torre Cantero

Abstract:

In this work, two ways of creating small metal sculptures are proposed: the first by means of microcasting and the second by electroforming, in both cases from models printed in 3D using an FDM (fused deposition modeling) printer or a DLP (digital light processing) printer. It is viable to replace the wax used in artistic foundry processes with 3D-printed objects. In this technique, the digital models are manufactured with a low-cost FDM 3D printer in polylactic acid (PLA). This material is used because its properties make it a viable substitute for wax within the lost-wax casting process known as ceramic shell casting. This technique consists of covering a sculpture made of wax, or in this case PLA, with several layers of thermoresistant material, which is then heated to melt out the PLA, leaving an empty mold that is later filled with molten metal. It is verified that PLA models reduce cost and time compared with hand-modeling in wax. In addition, 3D printing can manufacture parts that are impossible to create with manual techniques. However, sculptures created with this technique have a size limit: when pieces printed in PLA are very small, they lose detail, and the laminar texture hides the shape of the piece. A DLP printer allows more detailed and smaller pieces than an FDM printer, but such small models are difficult and complex to melt out using the ceramic shell lost-wax technique. As alternatives, there are microcasting and electroforming, which specialize in creating small metal pieces such as jewelry. Microcasting is a variant of lost-wax casting in which the model is introduced into a cylinder into which the refractory material is also poured; the molds are heated in an oven to melt out the model and fire them.
Finally, the metal is poured into the still-hot cylinders, which rotate in a machine at high speed to distribute the metal properly. Because microcasting requires expensive material and machinery to melt a piece of metal, electroforming is an alternative to this process. Electroforming can use models in different materials; for this study, micro-sculptures printed in 3D are used. These are subjected to an electroforming bath that covers the pieces with a very thin layer of metal. This work investigates the recommended sizes for use with 3D printers, both with PLA and with resin, and first tests are being carried out to validate the electroforming of micro-sculptures printed in resin using a DLP printer.

Keywords: sculptures, DLP 3D printer, microcasting, electroforming, fused deposition modeling

Procedia PDF Downloads 113
7978 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: Gaelle Candel, David Naccache

Abstract:

t-SNE is an embedding method that the data science community has widely used. It serves two main tasks: displaying results by coloring items according to class or feature value, and, in forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are its structure-preservation property and its answer to the crowding problem, in which all neighbors in a high-dimensional space cannot be represented correctly in a low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, in which the area of a cluster is proportional to its size in number of items, and relationships between clusters are materialized by closeness in the embedding. The algorithm is non-parametric: the transformation from the high- to the low-dimensional space is described but not learned, so two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together; however, this process is costly, as the complexity of t-SNE is quadratic, and would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of the data. While this approach is highly scalable, points could be mapped to exactly the same position, making them indistinguishable, and such a model would be unable to adapt to new outliers or to concept drift. This paper presents a methodology for reusing an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once with each newly obtained embedding.
The successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. The method has the same per-embedding complexity as t-SNE, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity would be reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing one to observe the birth, evolution, and death of clusters. The proposed approach facilitates the identification of significant trends and changes, which empowers the monitoring of high-dimensional datasets' dynamics.
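The paper's two-cost optimization is not reproduced here, but the idea of anchoring a new embedding to a support embedding can be illustrated with a simplified out-of-sample scheme: place each new high-dimensional point at the distance-weighted average of its nearest support points' embedded coordinates. This is a hedged stand-in for the authors' method, sharing only the goal of keeping cluster positions stable across successive embeddings:

```python
import numpy as np

def anchor_embed(new_X, support_X, support_Y, k=5):
    """Map new high-dimensional points into an existing 2-D embedding by
    inverse-distance-weighted averaging of the embedded coordinates of
    their k nearest support points."""
    out = np.empty((len(new_X), support_Y.shape[1]))
    for i, x in enumerate(new_X):
        d = np.linalg.norm(support_X - x, axis=1)
        nn = np.argsort(d)[:k]                 # k nearest support points
        w = 1.0 / (d[nn] + 1e-12)              # inverse-distance weights
        out[i] = (w[:, None] * support_Y[nn]).sum(axis=0) / w.sum()
    return out

# Two well-separated high-dimensional clusters with known 2-D positions
# (standing in for a previously computed support embedding).
rng = np.random.default_rng(0)
support_X = np.vstack([rng.normal(0, 0.1, (50, 10)),
                       rng.normal(5, 0.1, (50, 10))])
support_Y = np.vstack([np.tile([-10.0, 0.0], (50, 1)),
                       np.tile([10.0, 0.0], (50, 1))])
new_X = rng.normal(5, 0.1, (5, 10))            # drawn from the second cluster
new_Y = anchor_embed(new_X, support_X, support_Y)
```

In practice, such anchor positions can also serve as the initialization of a fresh t-SNE run over the new data, so the subsequent optimization refines local structure while the clusters stay where the analyst last saw them.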

Keywords: concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning

Procedia PDF Downloads 124