Search results for: recurrent errors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1271

581 Using Artificial Intelligence Method to Explore the Important Factors in the Reuse of Telecare by the Elderly

Authors: Jui-Chen Huang

Abstract:

This research used an artificial intelligence method to explore elderly users' opinions on the reuse of telecare and the relationships among service quality, satisfaction, customer perceived value, and intention to reuse. The study conducted a questionnaire survey on the elderly, and a total of 124 valid copies of the questionnaire were obtained. It adopted a Backpropagation Network (BPN) to propose an effective and feasible analysis method that differs from the traditional approach. Two thirds of the total sample (82 responses) were taken as the training data, and one third (42 responses) as the testing data. The training and testing RMSE (root mean square error) are 0.022 and 0.009 in the BPN, respectively, so the errors are acceptable. By contrast, the training and testing RMSE are 0.100 and 0.099 in the regression model. In addition, the results showed that service quality has the greatest effect on the intention to reuse, followed by satisfaction and perceived value. The Backpropagation Network method thus outperforms regression analysis, and this result can serve as a reference for future research.
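The two-thirds/one-third split and RMSE comparison described above can be sketched with a small hand-rolled backpropagation network. The synthetic data, architecture (one hidden layer of four units), and learning rate below are assumptions for illustration; the study's questionnaire data and BPN configuration are not reproduced here.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Synthetic stand-in for the questionnaire: three predictors (service quality,
# satisfaction, perceived value) mapped to intention to reuse.
data = []
for _ in range(124):
    x = [random.random() for _ in range(3)]
    t = sigmoid(1.5 * x[0] + 0.8 * x[1] + 0.5 * x[2] - 1.0)
    data.append((x, t))
train, test = data[:82], data[82:]  # two-thirds training, one-third testing

H, lr = 4, 0.5  # hidden units and learning rate (assumed values)
w1 = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [sigmoid(sum(w1[j][i] * x[i] for i in range(3)) + b1[j]) for j in range(H)]
    y = sigmoid(sum(w2[j] * h[j] for j in range(H)) + b2)
    return h, y

for _ in range(300):  # stochastic gradient descent over the training data
    for x, t in train:
        h, y = forward(x)
        d_out = (y - t) * y * (1 - y)  # output-layer delta (sigmoid derivative)
        d_hid = [d_out * w2[j] * h[j] * (1 - h[j]) for j in range(H)]
        for j in range(H):
            w2[j] -= lr * d_out * h[j]
            b1[j] -= lr * d_hid[j]
            for i in range(3):
                w1[j][i] -= lr * d_hid[j] * x[i]
        b2 -= lr * d_out

def rmse(pairs):
    return math.sqrt(sum((forward(x)[1] - t) ** 2 for x, t in pairs) / len(pairs))

print("train RMSE:", round(rmse(train), 4))
print("test RMSE:", round(rmse(test), 4))
```

Fitting a linear baseline on the same split and comparing RMSE values would mirror the BPN-versus-regression comparison reported in the abstract.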

Keywords: artificial intelligence, backpropagation network (BPN), elderly, reuse, telecare

Procedia PDF Downloads 198
580 A Sentence-to-Sentence Relation Network for Recognizing Textual Entailment

Authors: Isaac K. E. Ampomah, Seong-Bae Park, Sang-Jo Lee

Abstract:

Over the past decade, there have been promising developments in Natural Language Processing (NLP), with several investigations of approaches focusing on Recognizing Textual Entailment (RTE). These include models based on lexical similarities, models based on formal reasoning, and most recently deep neural models. In this paper, we present a sentence encoding model that exploits sentence-to-sentence relation information for RTE. In terms of sentence modeling, convolutional neural networks (CNNs) and recurrent neural networks (RNNs) adopt different approaches: RNNs are well suited to sequence modeling, whilst CNNs are suited to the extraction of n-gram features through their filters and can learn ranges of relations via the pooling mechanism. We combine these strengths to present a unified model for the RTE task. Our model combines relation vectors computed from the phrasal representation of each sentence with the final encoded sentence representations. Firstly, we pass each sentence through a convolutional layer to extract a sequence of higher-level phrase representations, from which the first relation vector is computed. Secondly, the phrasal representation of each sentence from the convolutional layer is fed into a Bidirectional Long Short-Term Memory (Bi-LSTM) network to obtain the final sentence representations, from which a second relation vector is computed. The relation vectors are combined and then used as an attention mechanism over the Bi-LSTM outputs to yield the final sentence representations for classification. Experiments on the Stanford Natural Language Inference (SNLI) corpus suggest that this is a promising technique for RTE.
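The relation-vector idea can be illustrated with fixed toy vectors. The element-wise difference/product construction and the dot-product attention below are common choices assumed for this sketch; the paper's actual phrase vectors come from learned convolutional and Bi-LSTM layers.

```python
import math

# Toy illustration of a sentence-to-sentence relation vector and its use as
# an attention query over one sentence's phrase vectors.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def softmax(xs):
    m = max(xs)
    es = [math.exp(v - m) for v in xs]
    s = sum(es)
    return [e / s for e in es]

def mean_pool(vecs):
    n = len(vecs)
    return [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]

premise = [[0.2, 0.1, 0.7], [0.9, 0.3, 0.1]]     # phrase vectors, sentence 1
hypothesis = [[0.1, 0.2, 0.6], [0.8, 0.4, 0.2]]  # phrase vectors, sentence 2

# Relation vector: element-wise difference and product of pooled encodings.
p, h = mean_pool(premise), mean_pool(hypothesis)
relation = [a - b for a, b in zip(p, h)] + [a * b for a, b in zip(p, h)]

# Use part of the relation vector as a query attending over the hypothesis
# phrases, analogous to the paper's attention over Bi-LSTM outputs.
query = relation[3:]  # product half, same dimensionality as the phrases
weights = softmax([dot(query, v) for v in hypothesis])
attended = [sum(w * v[i] for w, v in zip(weights, hypothesis)) for i in range(3)]
print("attention weights:", [round(w, 3) for w in weights])
print("attended vector:", [round(a, 3) for a in attended])
```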

Keywords: deep neural models, natural language inference, recognizing textual entailment (RTE), sentence-to-sentence relation

Procedia PDF Downloads 334
579 Forecast Based on an Empirical Probability Function with an Adjusted Error Using Propagation of Error

Authors: Oscar Javier Herrera, Manuel Angel Camacho

Abstract:

This paper addresses a cutting-edge method of business demand forecasting based on an empirical probability function, applicable when the historical behavior of the data is random. Additionally, it presents error determination based on the numerical technique of propagation of errors. The methodology began with a characterization and diagnosis of demand planning as part of production management; new ways to predict demand through probability techniques, and to calculate the associated error, were then investigated using numerical-methods tools, all based on the behavior of the data. The analysis considered the specific business circumstances of a company in the communications sector located in the city of Bogotá, Colombia. In conclusion, this application made it possible to obtain the adequate stock of the products required by the company to provide its services, helping the company reduce its service time, increase the client satisfaction rate, reduce stock which had not been in rotation for a long time, code its inventory, and plan reorder points for the replenishment of stock.
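The propagation-of-errors technique named above has a standard first-order form that can be sketched directly. The demand function below is an illustrative stand-in, not the company's actual forecast model.

```python
import math

# First-order propagation of error: for f(x_1, ..., x_n) with independent
# inputs, sigma_f^2 is approximately sum_i (df/dx_i)^2 * sigma_i^2.
# Derivatives are taken numerically by central differences.
def propagate_error(f, xs, sigmas, h=1e-6):
    var = 0.0
    for i, (x, s) in enumerate(zip(xs, sigmas)):
        up, dn = list(xs), list(xs)
        up[i], dn[i] = x + h, x - h
        dfdx = (f(up) - f(dn)) / (2 * h)  # central-difference derivative
        var += (dfdx * s) ** 2
    return math.sqrt(var)

# Example: total demand as the sum of two product-line forecasts.
f = lambda v: v[0] + v[1]
sigma = propagate_error(f, [120.0, 80.0], [5.0, 3.0])
print("propagated forecast error:", round(sigma, 3))  # sqrt(5^2 + 3^2)
```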

Keywords: demand forecasting, empirical distribution, propagation of error, Bogota

Procedia PDF Downloads 614
578 A Soft Error Rate (SER) Evaluation Method for Combinational Logic Circuits Based on Linear Energy Transfer

Authors: Man Li, Wanting Zhou, Lei Li

Abstract:

Communication stability is the primary concern of communication satellites. Communication satellites are easily affected by particle radiation, which generates single event effects (SEE) and leads to soft errors (SE) in combinational logic circuits. Existing research on the soft error rate (SER) of combinational logic circuits is mostly based on the assumption that the logic gates being bombarded have the same pulse width. However, in an actual radiation environment, the pulse widths of the bombarded logic gates differ because of differing linear energy transfers (LET). In order to improve the accuracy of the SER evaluation model, this paper proposes a soft error rate evaluation method based on LET. We analyze the influence of LET on the pulse width in combinational logic and establish a pulse width model based on LET. Using this model, the error rate of the ISCAS'85 test circuit is calculated. The effectiveness of the model is demonstrated by comparison with previous experiments.
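The difference between assuming one common pulse width and letting the width depend on LET can be sketched with a toy Monte Carlo. The linear pulse-width model and every constant below are illustrative assumptions, not the paper's fitted model.

```python
import random

random.seed(1)

# LET-aware soft-error sketch: each strike's transient pulse width follows a
# pulse-width model in its LET, instead of one common width for every gate.
def pulse_width(let_value, a=0.04, b=0.02):
    """Transient pulse width (ns) as a linear function of LET (MeV*cm^2/mg)."""
    return a * let_value + b

def propagates(width_ns, threshold_ns=0.5):
    # A pulse narrower than the masking threshold is filtered out and never
    # reaches a latch, so it causes no observable soft error.
    return width_ns >= threshold_ns

strikes = [random.uniform(1.0, 30.0) for _ in range(10000)]  # sampled LETs
observable = sum(propagates(pulse_width(l)) for l in strikes)
fraction = observable / len(strikes)
print("fraction of strikes causing an observable soft error:", round(fraction, 3))
```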

Keywords: communication satellite, pulse width, soft error rates, LET

Procedia PDF Downloads 152
577 Trial of Faecal Microbial Transplantation for the Prevention of Canine Atopic Dermatitis

Authors: Caroline F. Moeser

Abstract:

The skin-gut axis defines the relationship between the intestinal microbiota and the development of pathological skin diseases. Low diversity within the gut can predispose to the development of allergic skin conditions, and greater diversity of the gastrointestinal microflora has been associated with a reduction of skin flares in people with atopic dermatitis. Manipulation of the gut microflora has been used as a treatment option for several conditions in people, but there is limited data available on the use of faecal transplantation as a preventative measure in either people or dogs. Six 4-month-old pups from a litter of ten were presented for diarrhoea and/or signs of skin disease (chronic scratching, otitis externa). Of these pups, two were given probiotics, with resultant resolution of the diarrhoea. The other four pups were given faecal transplantation, either as a sole treatment or in combination with other treatments. Follow-up on the litter of ten pups was performed at 18 months of age. At this stage, the four pups that had received faecal transplantation had resolved all clinical signs and had no recurrence of either skin or gastrointestinal symptoms. Of the remaining six pups from the litter, all had developed at least one episode of Malassezia otitis externa between 5 and 18 months of age; two pups had developed two Malassezia otitis infections, and one had developed three during this period. Favrot's criteria for the diagnosis of canine atopic dermatitis include chronic or recurrent Malassezia infections by the age of three years. Early results from this litter suggest a reduction in the development of canine atopic disease in dogs given faecal microbial transplantation. Follow-up studies at three years of age and within a larger population of dogs can enhance understanding of the impact of early faecal transplantation in the prevention of canine atopic dermatitis.

Keywords: canine atopic dermatitis, faecal microbial transplant, skin-gut axis, otitis

Procedia PDF Downloads 141
576 3D Simulation of the Twin-Aperture Iron Superconducting Quadrupole for Charm-Tau Factory

Authors: K. K. Riabchenko, T. V. Rybitskaya, A. A. Starostenko

Abstract:

The Super Charm-Tau Factory is a double-ring e+e- collider to be operated in the center-of-mass energy range from 2 to 6 GeV, with a peak luminosity of about 10^35 cm^-2 s^-1 (Crab Waist collision) and with longitudinally polarized electrons at the IP (interaction point). One of the important elements of the cτ-factory is the superconducting twin-aperture quadrupole of the final focus, and it was decided to make a full-scale prototype quadrupole. The main objectives of our study were: 1) 3D modeling of the quadrupole in the Opera program, 2) optimization of the geometry of the quadrupole lens, and 3) study of the influence of the magnetic properties and geometry of the quadrupole on the integral harmonics. The ways in which unwanted harmonics arise were also studied. In the course of this work, a 3D model of a twin-aperture iron superconducting quadrupole lens was created, a three-dimensional simulation of the magnetic field was performed, and the geometrical parameters of the lens were selected. The calculations helped to identify sources of possible errors and methods for correcting unwanted harmonics, and they show that there are no obstacles to the production of a prototype lens.
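The harmonic analysis behind such a study rests on the field on a reference circle decomposing into multipole terms. A minimal numerical extraction is sketched below for a synthetic quadrupole field with a small sextupole error; the Opera model itself is not reproduced here.

```python
import math

# Extraction of integral (multipole) harmonics from the field on a reference
# circle: B(theta) = sum_n [b_n cos(n*theta) + a_n sin(n*theta)].
# The synthetic field below is a unit quadrupole with a small sextupole error
# term standing in for the simulated field.
def field(theta):
    return 1.0 * math.cos(2 * theta) + 0.01 * math.cos(3 * theta)

N = 360
samples = [field(2 * math.pi * k / N) for k in range(N)]

def harmonic(n):
    # Discrete cosine projection; exact for band-limited fields.
    c = sum(samples[k] * math.cos(2 * math.pi * n * k / N) for k in range(N))
    return 2.0 * c / N

print("quadrupole term b2:", round(harmonic(2), 4))
print("sextupole error term b3:", round(harmonic(3), 4))
```

A vanishing coefficient at an untouched order (e.g. n = 4) doubles as a sanity check on the projection.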

Keywords: super cτ-factory, final focus, twin aperture quadrupole lens, integral harmonics

Procedia PDF Downloads 110
575 Individualized Emotion Recognition Through Dual-Representations and Ground-Established Ground Truth

Authors: Valentina Zhang

Abstract:

While facial expression is a complex and individualized behavior, all facial emotion recognition (FER) systems known to us rely on a single facial representation and are trained on universal data. We conjecture that: (i) different facial representations can provide different, sometimes complementary, views of emotions; (ii) when employed collectively in a discussion group setting, they enable more accurate emotion reading, which is highly desirable in autism care and other application contexts sensitive to errors. In this paper, we first study FER using pixel-based deep learning vs. semantics-based deep learning in the context of deepfake videos. Our experiment indicates that while the semantics-trained model performs better on articulated facial feature changes, the pixel-trained model outperforms it on subtle or rare facial expressions. Armed with these findings, we have constructed an adaptive FER system that learns from both types of models for dyadic or small interacting groups and further leverages the synthesized group emotions as the ground truth for individualized FER training. Using a collection of group conversation videos, we demonstrate that FER accuracy and personalization can benefit from such an approach.

Keywords: neurodivergence care, facial emotion recognition, deep learning, ground truth for supervised learning

Procedia PDF Downloads 128
574 Use of Computers and Peripherals in the Archaeological Surveys of Sistan in Eastern Iran

Authors: Mahyar Mehrafarin, Reza Mehrafarin

Abstract:

The Sistan region in eastern Iran is a significant archaeological area in Iran and the Middle East, encompassing 10,000 square kilometers. Previous archaeological field surveys have identified 1662 ancient sites dating from prehistoric periods to the Islamic period. Research Aim: This article aims to explore the utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, and the benefits derived from their implementation. Methodology: The research employs a descriptive-analytical approach combined with field methods. New technologies and software, such as GPS, drones, magnetometers, equipped cameras, satellite images, and software programs like GIS, Map source, and Excel, were utilized to collect and analyze data. Findings: The use of modern technologies and computers in archaeological field surveys proved to be essential. Traditional archaeological activities, such as excavation and field surveys, are time-consuming and costly. Employing modern technologies helps in preserving ancient sites, accurately recording archaeological data, reducing errors and mistakes, and facilitating correct and accurate analysis. Creating a comprehensive and accessible database, generating statistics, and producing graphic designs and diagrams are additional advantages. Theoretical Importance: The integration of computers and modern technologies in archaeology contributes to interdisciplinary collaborations and facilitates the involvement of specialists from various fields, such as geography, history, art history, anthropology, laboratory sciences, and computer engineering. The utilization of computers in archaeology spans diverse areas, including database creation, statistical analysis, graphics, laboratory and engineering applications, and even artificial intelligence, which remains an unexplored area in Iranian archaeology.
Data Collection and Analysis Procedures: Information was collected using modern technologies and software, capturing geographic coordinates, aerial images, archaeogeophysical data, and satellite images. These data were then entered into various software programs for analysis, including GIS, Map source, and Excel. The research employed both descriptive and analytical methods to present the findings effectively. Question Addressed: The primary question addressed is how the use of modern technologies and computers in archaeological field surveys in Sistan, Iran, can enhance archaeological data collection, preservation, analysis, and accessibility. Conclusion: The utilization of modern technologies and computers in these surveys has proven necessary and beneficial, and it is recommended to explore the potential of artificial intelligence in Iranian archaeology as an unexplored area. The research has implications for cultural heritage organizations, archaeology students, and universities involved in archaeological field surveys in Sistan and Baluchistan province, and it contributes to enhancing the understanding and preservation of Iran's archaeological heritage.

Keywords: archaeological surveys, computer use, iran, modern technologies, sistan

Procedia PDF Downloads 63
573 A Monocular Measurement for 3D Objects Based on Distance Area Number and New Minimize Projection Error Optimization Algorithms

Authors: Feixiang Zhao, Shuangcheng Jia, Qian Li

Abstract:

High-precision measurement of a target's position and size is one of the hotspots in the field of vision inspection. This paper proposes a three-dimensional object positioning and measurement method using a monocular camera and GPS, namely Distance Area Number-New Minimize Projection Error (DAN-NMPE). Our algorithm contains two parts: DAN, a picture-sequence algorithm, and NMPE, a relative-pose optimization algorithm; together they greatly improve the measurement accuracy of the target's position and size. Comprehensive experiments validate the effectiveness of the proposed method on a self-made traffic sign dataset. The results show that, with a laser point cloud as the ground truth, the size and position errors of the traffic signs measured by this method are ±5% and 0.48 ± 0.3 m, respectively. In addition, we compared it with the current mainstream method, which uses a monocular camera to locate and measure traffic signs. DAN-NMPE attains significant improvements over existing state-of-the-art methods, improving the measurement accuracy of size and position by 50% and 15.8%, respectively.
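The projection-error minimization at the heart of such methods can be shown in miniature: project a candidate 3D point through an assumed pinhole camera and search for the point that best reproduces the detected pixel. All intrinsics, the depth, and the observation below are made-up values, not the DAN-NMPE pipeline.

```python
import math

# Pinhole reprojection-error sketch: recover a sign's lateral position (X, Y)
# at a known depth Z by minimizing the pixel reprojection error with a
# coarse grid search.
fx = fy = 800.0            # focal lengths (pixels), assumed
cx, cy = 320.0, 240.0      # principal point, assumed
Z = 4.0                    # depth, e.g. from GPS distance (assumed)
observed = (420.0, 290.0)  # detected sign centre in the image

def project(X, Y, Z):
    return (fx * X / Z + cx, fy * Y / Z + cy)

def reprojection_error(X, Y):
    u, v = project(X, Y, Z)
    return math.hypot(u - observed[0], v - observed[1])

# Coarse grid search over candidate (X, Y) in metres.
best = min(((reprojection_error(i / 100.0, j / 100.0), i / 100.0, j / 100.0)
            for i in range(101) for j in range(101)), key=lambda t: t[0])
print("best (X, Y): (%.2f, %.2f), error: %.4f px" % (best[1], best[2], best[0]))
```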

Keywords: monocular camera, GPS, positioning, measurement

Procedia PDF Downloads 127
572 Potential of Hyperion (EO-1) Hyperspectral Remote Sensing for Detecting and Mapping Mine-Iron Oxide Pollution

Authors: Abderrazak Bannari

Abstract:

Acid Mine Drainage (AMD) from mine wastes and the contamination of soils and water with metals are considered major environmental problems in mining areas. AMD is produced by interactions of water, air, and sulphidic mine wastes. The problem results from a series of chemical and biochemical oxidation reactions of sulfide minerals, e.g. pyrite and pyrrhotite. These reactions lead to acidity as well as the dissolution of toxic and heavy metals (Fe, Mn, Cu, etc.) from tailings, waste rock piles, and open pits. Soil and aquatic ecosystems can be contaminated and, consequently, human health and wildlife affected. Furthermore, secondary minerals, typically formed during weathering of mine waste storage areas when the concentration of soluble constituents exceeds the corresponding solubility product, are also important. The most common secondary mineral compositions are hydrous iron oxides (goethite, etc.) and hydrated iron sulfates (jarosite, etc.). The objectives of this study focus on the detection and mapping of mine-iron oxide pollution (MIOP) in the soil using Hyperion EO-1 (Earth Observing-1) hyperspectral data and the constrained linear spectral mixture analysis (CLSMA) algorithm. The abandoned Kettara mine, located approximately 35 km northwest of Marrakech city (Morocco), was chosen as the study area. For 44 years (from 1938 to 1981) this mine was exploited for iron oxide and iron sulphide minerals. Previous studies have shown that the soils surrounding Kettara are contaminated by heavy metals (Fe, Cu, etc.) as well as by secondary minerals. To achieve our objectives, several soil samples representing different MIOP classes were collected and located using accurate GPS (≤ ±30 cm). Endmember spectra were then acquired over each sample using an Analytical Spectral Device (ASD) covering the spectral domain from 350 to 2500 nm.
Considering each soil sample separately, the average of forty spectra was resampled and convolved using Gaussian response profiles to match the bandwidths and band centers of the Hyperion sensor. Moreover, the MIOP content of each sample was estimated by geochemical analyses in the laboratory, and a ground truth map was generated using simple kriging in a GIS environment for validation purposes. The Hyperion data were corrected for the spatial shift between the VNIR and SWIR detectors, striping, dead columns, noise, and gain and offset errors; atmospherically corrected using the MODTRAN 4.2 radiative transfer code and transformed to surface reflectance; corrected for sensor smile (a 1-3 nm shift in the VNIR and SWIR); and post-processed to remove residual errors. Finally, geometric distortions and relief displacement effects were corrected using a digital elevation model. The MIOP fraction map was extracted using CLSMA considering the entire spectral range (427-2355 nm) and validated against the ground truth map generated by kriging. The obtained results show the promising potential of the proposed methodology for the detection and mapping of mine iron oxide pollution in the soil.
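The unmixing step reduces, in the two-endmember case, to a closed-form constrained least-squares problem. The four-band spectra below are synthetic, not ASD or Hyperion measurements; the full CLSMA solves the same problem for more endmembers over the 427-2355 nm range used in the study.

```python
# Two-endmember constrained linear unmixing sketch: estimate the iron-oxide
# abundance f minimizing ||pixel - f*e1 - (1-f)*e2||, clipped to 0 <= f <= 1.
def unmix_fraction(pixel, e1, e2):
    num = sum((p - b) * (a - b) for p, a, b in zip(pixel, e1, e2))
    den = sum((a - b) ** 2 for a, b in zip(e1, e2))
    return max(0.0, min(1.0, num / den))  # enforce the abundance constraints

iron_oxide = [0.30, 0.45, 0.60, 0.55]  # endmember reflectances (assumed)
background = [0.10, 0.12, 0.15, 0.18]
pixel = [0.22, 0.32, 0.42, 0.40]       # a mixed pixel

f = unmix_fraction(pixel, iron_oxide, background)
print("estimated iron-oxide abundance:", round(f, 3))
```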

Keywords: hyperion eo-1, hyperspectral, mine iron oxide pollution, environmental impact, unmixing

Procedia PDF Downloads 210
571 Colour Recognition Pen Technology in Dental Technique and Dental Laboratories

Authors: M. Dabirinezhad, M. Bayat Pour, A. Dabirinejad

Abstract:

Recognition of the colour spectrum of the teeth plays a significant role in the dental laboratory production of dentures. Since each patient's teeth vary in type and colour, there is a need to specify the exact and most suitable colour for the denture. Usually, dentists utilize shade pallets to identify the colour that suits a patient based on the colour of the adjacent teeth. Consequently, dentists can make human errors in recognizing the optimum colour for the patient, which can be frustrating for the patient. According to the statistics, some patients report that they are not satisfied with the colour of their dentures after installation. This problem stems from insufficient accuracy in the colour recognition step of denture production. The colour recognition pen (CRP) is a technology to distinguish the colour spectrum of the intended teeth with the highest accuracy. CRP is equipped with a sensor that can read and analyse a wide range of spectra, and it is connected to a database that contains all the spectrum ranges available on the market. The database is editable and updatable according to market requirements. A further advantage of this invention is that it saves time for patients, since there is no need to redo denture production after a failed first try.

Keywords: colour recognition pen, colour spectrum, dental laboratory, denture

Procedia PDF Downloads 183
570 Repeatable Scalable Business Models: Can Innovation Drive an Entrepreneur's Un-Validated Business Model?

Authors: Paul Ojeaga

Abstract:

Can the level of innovation drive un-validated business models across regions? To what extent does industrial sector attractiveness drive firms' success across regions at the time of start-up? This study examines the role of innovation in start-up success in six regions of the world (namely Sub-Saharan Africa, the Middle East and North Africa, Latin America, South East Asia Pacific, the European Union, and the United States representing North America) using macroeconomic variables. While there have been studies using firm-level data, results from such studies are not suitable for national policy decisions. The need to drive a regional innovation policy also begs for an answer, providing further motivation for this study. Results using dynamic panel estimation show that innovation counts in the early infancy stage of the new-business life cycle. The results are robust even after controlling for time fixed effects, and the study presents variance-covariance robust standard errors.

Keywords: industrial economics, un-validated business models, scalable models, entrepreneurship

Procedia PDF Downloads 270
569 A Rare Case of Metastatic Basal Cell Carcinoma

Authors: Nitesh Kumar, Eoin Twohig, Jasparl Cheema, Sadiq Mawji, Yousif Al Najjar

Abstract:

Basal cell carcinoma (BCC) is the commonest cutaneous malignancy affecting humans. Despite this, distant spread is exceptionally rare: metastatic BCC (mBCC) is estimated to occur in 0.0028-0.5% of cases. We aim to illustrate, with the aid of histological slides, a case of mBCC occurring in a fit and well 67-year-old. An initial diagnosis of desmoplastic BCC was made in 2006 from a scalp biopsy, with the lesion then being excised. Re-excision of a local recurrence was undertaken the following year. In 2014 the patient presented with an ipsilateral level 2a mass. Fine needle aspiration raised the suspicion of metastatic carcinoma. The patient had excision of two nodes from the left neck alongside pharyngeal tonsillectomy and tongue base biopsies. Histologically, the nodes closely resembled the immunophenotype of the initial scalp lesion. The patient subsequently had a modified radical neck dissection, and residual mBCC was excised from the left sternocleidomastoid muscle. In 2023 the patient developed haematuria; on further investigation, bilateral lung lesions were noted on CT, with subsequent biopsy confirming mBCC. Spinal and renal lesions have also been found. Histopathology showed clear resemblance of the lung metastases to both those in the neck and the primary (scalp BCC), with no squamous differentiation seen. The time span from primary to occurrence of lung metastasis (18 years) affirms the indolent and slow-growing nature of BCC. This case fulfils the Lattes and Kessler diagnostic criteria. High-risk cases are described as those with advanced local presentation, a primary tumour on the head and neck, and locally recurrent lesions.

Keywords: BCC, metastasis, rare, skin cancer

Procedia PDF Downloads 39
568 Design of Enhanced Adaptive Filter for Integrated Navigation System of FOG-SINS and Star Tracker

Authors: Nassim Bessaad, Qilian Bao, Zhao Jiangkang

Abstract:

The fiber-optic gyroscope in a strap-down inertial navigation system (FOG-SINS) suffers from precision degradation due to the influence of random errors. In this work, an enhanced Allan variance (AV) stochastic modeling method, combined with discrete wavelet transform (DWT) signal denoising, is implemented to estimate the random process in the FOG signal. Furthermore, we devise a measurement-based iterative adaptive Sage-Husa nonlinear filter with augmented states to integrate a star tracker sensor with the SINS. The proposed filter adapts the measurement noise covariance matrix based on the available data, while the enhanced stochastic modeling scheme is used to tune the process noise covariance matrix and the augmented-state Gauss-Markov process parameters. Finally, the effectiveness of the proposed filter is investigated using data collected under laboratory conditions. The results show the filter's improved accuracy in comparison with the conventional Kalman filter (CKF).
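The Allan variance step can be illustrated on synthetic white noise standing in for a FOG rate record (the laboratory data are not public); the characteristic fall of the Allan variance with averaging time is what the stochastic modeling exploits.

```python
import math
import random

random.seed(2)

# Allan variance sketch on synthetic white noise.  Random-error terms such as
# angle random walk show up as characteristic slopes of the Allan deviation
# versus averaging time.
tau0 = 0.01  # sample period (s), assumed
signal = [random.gauss(0.0, 1.0) for _ in range(4000)]

def allan_variance(y, m):
    """Non-overlapping Allan variance at averaging time m * tau0."""
    k = len(y) // m
    means = [sum(y[i * m:(i + 1) * m]) / m for i in range(k)]
    return sum((means[i + 1] - means[i]) ** 2 for i in range(k - 1)) / (2 * (k - 1))

for m in (1, 10, 100):
    print("tau = %5.2f s  Allan deviation = %.4f"
          % (m * tau0, math.sqrt(allan_variance(signal, m))))
```

For pure white noise the Allan variance falls as 1/m; departures from that slope are what flag other random-error processes in a real gyro record.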

Keywords: inertial navigation, adaptive filtering, star tracker, FOG

Procedia PDF Downloads 69
567 Simulation as a Problem-Solving Spotter for System Reliability

Authors: Wheyming Tina Song, Chi-Hao Hong, Peisyuan Lin

Abstract:

An important performance measure for stochastic manufacturing networks is the system reliability, defined as the probability that the production output meets or exceeds a specified demand. The system parameters include the capacity of each workstation and the number of conforming parts produced in each workstation. We establish that eighteen archival publications, containing twenty-one examples, provide incorrect values of the system reliability. The author recently published the Song Rule, which provides the correct analytical system-reliability value; it is, however, computationally inefficient for large networks. In this paper, we use Monte Carlo simulation (implemented in C and FlexSim) to provide estimates for the above-mentioned twenty-one examples. The simulation estimates are consistent with the analytical solutions for small networks and remain computationally efficient for large networks. We argue here for three advantages of Monte Carlo simulation: (1) understanding stochastic systems, (2) validating analytical results, and (3) providing estimates even when analytical and numerical approaches are overly expensive in computation. Monte Carlo simulation could have detected the published analysis errors.
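The estimation idea can be shown on a toy two-station serial line. The capacities and demand below are illustrative, not one of the twenty-one published examples re-examined in the paper.

```python
import random

random.seed(3)

# Monte Carlo estimate of system reliability, P(output >= demand).
def simulate_once():
    station1 = min(random.randint(0, 5), 4)  # conforming parts, capped at 4
    station2 = min(random.randint(0, 5), 5)
    return min(station1, station2)           # serial line: bottleneck governs

demand, n = 3, 100000
hits = sum(simulate_once() >= demand for _ in range(n))
reliability = hits / n
print("estimated P(output >= demand):", round(reliability, 3))
```

With independent stations the exact value here is 0.5 * 0.5 = 0.25, so the estimate doubles as a check of the simulator, mirroring the validation role argued for in the abstract.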

Keywords: Monte Carlo simulation, analytical results, leading digit rule, standard error

Procedia PDF Downloads 346
566 Collision Avoidance Maneuvers for Vessels Navigating through Traffic Separation Scheme

Authors: Aswin V. J., Sreeja S., R. Harikumar

Abstract:

Ship collision is one of the major concerns while navigating in the ocean. In congested sea routes with hectic offshore operations, ships are often forced into close-encounter maneuvers. Maritime rules for preventing collision at sea are defined in the International Regulations for Preventing Collisions at Sea (COLREGs). Traffic Separation Schemes (TSS) are traffic-management route systems governed by the International Maritime Organization (IMO), in which traffic lanes indicate the general direction of traffic flow. Rule 10 of the COLREGs prescribes the conduct of vessels navigating through a TSS, but no quantitative criteria for detecting and evaluating collision risk are specified in the COLREGs. Most accidents that occur are due to operational errors driven by human factors such as lack of experience and loss of situational awareness. In open waters, the traffic density is lower than in a TSS, and hence vessels can be operated in autopilot mode. A collision avoidance method that uses possible obstacle trajectories in advance to predict collision occurrence, and that can generate suitable maneuvers for collision avoidance, is presented in this paper. Suitable course and propulsion changes that can be used in a TSS, in accordance with the COLREGs, are found for various obstacle scenarios.
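One standard quantitative risk cue of the kind Rule 10 itself does not supply is the closest point of approach (CPA) between two constant-velocity vessels, sketched below. Positions are in nautical miles and velocities in knots; all values are illustrative.

```python
import math

# Closest-point-of-approach (CPA) and time to CPA (TCPA) for two vessels
# moving at constant velocity.
def cpa(p_own, v_own, p_tgt, v_tgt):
    rx, ry = p_tgt[0] - p_own[0], p_tgt[1] - p_own[1]  # relative position
    vx, vy = v_tgt[0] - v_own[0], v_tgt[1] - v_own[1]  # relative velocity
    v2 = vx * vx + vy * vy
    if v2 == 0.0:
        return math.hypot(rx, ry), 0.0                 # no relative motion
    tcpa = max(0.0, -(rx * vx + ry * vy) / v2)         # time to CPA (hours)
    dx, dy = rx + vx * tcpa, ry + vy * tcpa
    return math.hypot(dx, dy), tcpa

# Own ship heading north at 12 kn; target 2 nm east and 6 nm north, heading west.
d, t = cpa((0.0, 0.0), (0.0, 12.0), (2.0, 6.0), (-8.0, 0.0))
print("CPA distance: %.2f nm, TCPA: %.2f h" % (d, t))
```

A small CPA distance combined with a short TCPA is the usual trigger for evaluating an avoidance maneuver.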

Keywords: collision avoidance, maneuvers, obstacle trajectories, traffic separation scheme

Procedia PDF Downloads 64
565 Contact Phenomena in Medieval Business Texts

Authors: Carmela Perta

Abstract:

Among the studies that have flourished in the field of historical sociolinguistics, mainly in the strand devoted to English history during its medieval and early modern phases, multilingual texts have been analysed using theories and models coming from contact linguistics, thus applying synchronic models and approaches to the past. This is true also of contact phenomena that transcend the writing level, involving the language systems implicated in the contact process to the point that a new variety is perceived. This is the case for medieval administrative-commercial texts in which, according to some scholars, the degree of fusion of Anglo-Norman, Latin, and Middle English is so high that a mixed code emerges, with recurrent patterns of mixed forms. Of particular interest is a collection of multilingual business writings by John Balmayn, an Englishman overseeing a large shipment in Tuscany: the Cantelowe accounts. These documents display various analogies with multilingual texts written in England in the same period; indeed, the writer seems to make use of the above-mentioned patterns, with Middle English, Latin, Anglo-Norman, and the newly added Italian. Applying an atomistic yet dynamic approach to the study of contact phenomena, we investigate these documents, exploring the nature of the switching forms they contain from an intra-writer variation perspective. After analysing the accounts and the type of multilingualism in them, we take stock of their assumed mixed-code nature, comparing the characteristics found in this genre with modern assumptions. The aim is to evaluate whether the switching forms can be considered core elements of a mixed code, used as a professional variety among merchant communities, or whether such texts should be analysed from a code-switching perspective.

Keywords: historical sociolinguistics, historical code switching, letters, medieval england

Procedia PDF Downloads 59
564 Target and Biomarker Identification Platform to Design New Drugs against Aging and Age-Related Diseases

Authors: Peter Fedichev

Abstract:

We studied fundamental aspects of aging to develop a mathematical model of the gene regulatory network. We show that aging manifests itself as an inherent instability of the gene network, leading to an exponential accumulation of regulatory errors with age. To validate our approach, we studied age-dependent omics data, such as transcriptomes and metabolomes, of different model organisms and humans. We built a computational platform based on our model to identify targets and biomarkers of aging in order to design new drugs against aging and age-related diseases. As biomarkers of aging, we choose the rate of aging and the biological age, since they completely determine the state of the organism. Since the rate of aging changes rapidly in response to external stress, this kind of biomarker can be useful as a tool for quantitative efficacy assessment of drugs, their combinations, dose optimization, chronic toxicity estimation, personalized therapy selection, clinical endpoint achievement (within clinical research), and death risk assessment. Based on our model, we propose a method of target identification for further interventions against aging and age-related diseases. Being a biotech company, we offer a complete pipeline to develop an anti-aging drug candidate.

Keywords: aging, longevity, biomarkers, senescence

Procedia PDF Downloads 263
563 Artificial Neural Network Modeling of a Closed Loop Pulsating Heat Pipe

Authors: Vipul M. Patel, Hemantkumar B. Mehta

Abstract:

Technological innovations in the electronics world demand novel, compact, simple-in-design, low-cost and effective heat transfer devices. The Closed Loop Pulsating Heat Pipe (CLPHP) is a passive phase-change heat transfer device with the potential to transfer heat quickly and efficiently from source to sink. The thermal performance of a CLPHP is governed by various parameters such as the number of U-turns, orientation, heat input, working fluid and filling ratio. The present paper is an attempt to predict the thermal performance of a CLPHP using an Artificial Neural Network (ANN). Filling ratio and heat input are considered as input parameters, while thermal resistance is set as the target parameter. The types of neural networks considered in the present paper are radial basis, generalized regression, linear layer, cascade forward backpropagation, feed forward backpropagation, feed forward distributed time delay, layer recurrent and Elman backpropagation. Linear, logistic sigmoid, tangent sigmoid and radial basis Gaussian functions are used as transfer functions. Prediction accuracy is measured against the experimental data reported by researchers in the open literature in terms of the Mean Absolute Relative Deviation (MARD). The predictions of a generalized regression ANN model with a spread constant of 4.8 are found to agree with the experimental data, with MARD in the range of ±1.81%.
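A generalized regression neural network is, at its core, Gaussian-kernel regression controlled by a single spread constant. The sketch below is a minimal illustration of that mechanism and of the MARD metric; the data shapes are assumptions, not the paper's dataset.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, spread=4.8):
    """Generalized Regression Neural Network prediction: a Gaussian-kernel
    weighted average of training targets (Nadaraya-Watson form). The
    `spread` constant controls how far training samples influence a query."""
    preds = []
    for x in np.atleast_2d(X_query):
        d2 = np.sum((X_train - x) ** 2, axis=1)       # squared distances
        w = np.exp(-d2 / (2.0 * spread ** 2))         # Gaussian kernel weights
        preds.append(np.dot(w, y_train) / np.sum(w))  # weighted average
    return np.array(preds)

def mard(y_true, y_pred):
    """Mean Absolute Relative Deviation, in percent."""
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))
```

For example, with columns (filling ratio, heat input) as inputs and thermal resistance as target, a small spread makes the model interpolate the training points almost exactly, while a large spread smooths across them.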

Keywords: ANN models, CLPHP, filling ratio, generalized regression, spread constant

Procedia PDF Downloads 274
562 Estimation of Population Mean Using Characteristics of Poisson Distribution: An Application to Earthquake Data

Authors: Prayas Sharma

Abstract:

This paper proposes a generalized class of estimators, an exponential class of estimators based on the adaptation of Sharma and Singh (2015) and Solanki and Singh (2013), and a simple difference estimator for estimating the unknown population mean of a Poisson-distributed population under simple random sampling without replacement. The expressions for the mean square errors of the proposed classes of estimators are derived to the first order of approximation. It is shown that the adapted version of Solanki and Singh (2013), the exponential class of estimators, is always more efficient than the usual estimator and the ratio, product, exponential ratio, and exponential product type estimators, and is equally efficient to the simple difference estimator. Moreover, the adapted version of Sharma and Singh's (2015) estimator is always more efficient than all the estimators available in the literature. In addition, the theoretical findings are supported by an empirical study showing the superiority of the constructed estimators over others, with an application to earthquake data from Turkey.
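The intuition behind auxiliary-variable estimators such as the simple difference estimator can be checked with a short Monte-Carlo sketch: under SRSWOR, the estimator ȳ + d·(X̄ − x̄) removes the part of the sampling error explained by the auxiliary variable. The population and coefficient d below are invented for illustration and are not the paper's estimators or data.

```python
import random
import statistics

def simulate_mse(pop_y, pop_x, n, d, reps=2000, seed=1):
    """Monte-Carlo MSE of the usual sample mean and of a simple difference
    estimator  y_bar + d * (X_bar - x_bar)  under simple random sampling
    without replacement (SRSWOR)."""
    rng = random.Random(seed)
    Y = statistics.mean(pop_y)                 # true population mean of y
    X = statistics.mean(pop_x)                 # known mean of auxiliary x
    idx = list(range(len(pop_y)))
    err_usual, err_diff = [], []
    for _ in range(reps):
        s = rng.sample(idx, n)                 # one SRSWOR draw
        yb = statistics.mean(pop_y[i] for i in s)
        xb = statistics.mean(pop_x[i] for i in s)
        err_usual.append((yb - Y) ** 2)
        err_diff.append((yb + d * (X - xb) - Y) ** 2)
    return statistics.mean(err_usual), statistics.mean(err_diff)
```

When y is strongly related to x and d is close to the regression slope, the difference estimator's MSE drops well below that of the plain sample mean.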

Keywords: auxiliary attribute, point bi-serial, mean square error, simple random sampling, Poisson distribution

Procedia PDF Downloads 137
561 Robust ResNets for Chemically Reacting Flows

Authors: Randy Price, Harbir Antil, Rainald Löhner, Fumiya Togashi

Abstract:

Chemically reacting flows are common in engineering applications such as hypersonic flow, combustion, explosions, manufacturing processes, and environmental assessments. The number of reactions in combustion simulations can exceed 100, putting a large number of flow and combustion problems beyond the capabilities of current supercomputers. Motivated by this, deep neural networks (DNNs) are introduced with the goal of eventually replacing the existing chemistry software packages with DNNs. The DNNs used in this paper are motivated by the Residual Neural Network (ResNet) architecture. In the continuum limit, ResNets become an optimization problem constrained by an ODE. This feature allows the use of ODE control techniques to enhance the DNNs. In this work, DNNs are constructed which update the species uⁿ at the nᵗʰ timestep to uⁿ⁺¹ at the (n+1)ᵗʰ timestep. Parallel DNNs are trained for each species, taking uⁿ as input and outputting one component of uⁿ⁺¹. These DNNs are applied to multiple species and reactions common in chemically reacting flows, such as H₂-O₂ reactions. Experimental results show that the DNNs are able to accurately replicate the dynamics in various situations and in the presence of errors.
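The ResNet-to-ODE correspondence mentioned above can be made concrete: a residual block reads as one explicit-Euler step uⁿ⁺¹ = uⁿ + Δt·f(uⁿ). The sketch below shows that structure with an untrained two-layer f; the weight shapes and step size are assumptions for illustration, not the paper's trained networks.

```python
import numpy as np

def resnet_step(u, W1, b1, W2, b2, dt=0.1):
    """One residual block read as an explicit-Euler ODE step:
    u_{n+1} = u_n + dt * f(u_n), with f a small two-layer tanh network
    mapping the species vector to its rate of change."""
    return u + dt * (W2 @ np.tanh(W1 @ u + b1) + b2)

def rollout(u0, params, steps):
    """Advance the species vector through `steps` residual blocks,
    mimicking time integration of the chemistry ODE."""
    u = u0
    for _ in range(steps):
        u = resnet_step(u, *params)
    return u
```

With all weights zero, f vanishes and each block is the identity, which is exactly the property that makes deep residual stacks behave like a discretized ODE.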

Keywords: chemically reacting flows, computational fluid dynamics, ODEs, residual neural networks, ResNets

Procedia PDF Downloads 104
560 Influence of Atmospheric Pollutants on Child Respiratory Disease in Cartagena de Indias, Colombia

Authors: Jose A. Alvarez Aldegunde, Adrian Fernandez Sanchez, Matthew D. Menden, Bernardo Vila Rodriguez

Abstract:

Up to five statistical pre-processing steps were carried out on the pollutant records of the monitoring stations in Cartagena de Indias, Colombia, together with the childhood asthma incidence surveys conducted in the city's hospitals by the Health Ministry of Colombia. These pre-processing steps comprised techniques such as assessment of the quality of data collection, assessment of the quality of the registration network, identification and debugging of errors in data collection, completion of missing and purged data, and improvement of the time scale of the records. Data quality was characterized by means of a density analysis of the pollutant registration stations using ArcGIS software and through mass-balance techniques, making it possible to detect inconsistencies by relating the registration data between stations via linear regression. The results of this process highlighted the good quality of the pollutant registration process. Debugging of errors then allowed certain data to be identified as statistically non-significant in the incidence and contamination series. These data, together with certain missing records in the series recorded by the measuring stations, were completed by statistical imputation equations. Following these preliminary processes, the basic series of incidence data for respiratory disease and the pollutant records allowed the influence of pollutants on respiratory diseases such as childhood asthma to be characterized. This characterization was carried out using statistical correlation methods, including visual correlation, simple linear regression and spectral analysis with PAST software, which identifies maximum and minimum periodicity cycles via the Lomb periodogram.
Among the results obtained, up to eleven contemporaneous maxima and minima between the incidence records and the particulate records were identified by visual comparison. The spectral analyses performed on the incidence and PM2.5 series returned a set of similar periods in both registers, with a maximum at a period of one year and another every 25 days (0.9 and 0.07 years). The bivariate analysis ranked the variable "Daily Vehicular Flow" ninth in importance out of a total of 55 variables. However, the statistical correlation did not yield a favorable result, giving a low value of the R² coefficient. The series of analyses conducted demonstrates the importance of the influence of pollutants such as PM2.5 in the development of childhood asthma in Cartagena. Quantification of the influence of the variables determined a 56% probability of dependence between PM2.5 and childhood respiratory asthma in Cartagena. On this basis, the study could be completed through application of the BenMap software, yielding spatial results of interpolated values of the pollutant records that exceeded the established legal limits (represented by homogeneous units down to the neighborhood level) and results on the impact on the exacerbation of pediatric asthma. As a final result, an economic estimate (in Colombian pesos) was produced of the monthly and individual savings derived from a percentage reduction of the influence of pollutants on visits to the hospital emergency room due to asthma exacerbation in pediatric patients.
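The periodicity detection described above relies on the Lomb periodogram, which, unlike the FFT, handles unevenly sampled series such as incidence surveys. The following is a self-contained NumPy sketch of the classic Lomb formula applied to synthetic data (the signal frequency below is invented, not the study's 1-year/25-day cycles).

```python
import numpy as np

def lomb_periodogram(t, y, freqs):
    """Classic Lomb periodogram for unevenly sampled data.
    `freqs` are ordinary frequencies (cycles per unit time)."""
    y = y - y.mean()
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        w = 2.0 * np.pi * f                          # angular frequency
        # time offset tau makes the sine/cosine terms orthogonal
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c = np.cos(w * (t - tau))
        s = np.sin(w * (t - tau))
        power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
    return power
```

Scanning a frequency grid and taking the argmax of the power recovers the dominant cycle even when the sampling times are irregular.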

Keywords: asthma incidence, BenMap, PM2.5, statistical analysis

Procedia PDF Downloads 103
559 Extraskeletal Ewing Sarcoma: Experience in a Tertiary Cancer Care Centre in India

Authors: Himanshu Rohela

Abstract:

BACKGROUND: Ewing sarcoma can arise in either bone or soft tissue. Extraskeletal Ewing sarcoma (EES) is an uncommon primary tumor of the soft tissues, accounting for 20-30% of all reported cases of ES. AIM: To investigate the demographic distribution, survival, and factors affecting survival and recurrence in patients with EES. METHODS: A retrospective study of 19 biopsy-proven EES cases was performed. Overall survival (OS), assessed using the log-rank test, and factors affecting OS and local recurrence (LR) were evaluated for the entire cohort. RESULTS: Patients with EES had a mean age of 19.5 years, and EES was more common in males (63%). Axial location (58%) and solitary presentation (84%) were more common. The average tumor size was 11 cm; 3 of 19 patients were metastatic at presentation, with the lung being the most common site of metastasis. 17 patients received NACT, 16 with the VAC-IE regimen and 1 who underwent a second line with the GEM/DOCE regimen. Unplanned surgery was done in 2 of 19 patients. 3 patients received definitive RT, and 13 underwent wide local excision; 2 of these 13 showed a good response to NACT. 10 patients required readmission, of whom 6 had chemotherapy-related complications, 2 had surgical-site complications, and one developed secondary AML after completion of treatment. A total of 4 patients had a recurrence: one had local recurrence alone, one had distant recurrence alone, and 2 had both distant and local recurrence. Tumor size >10 cm, axial location, and previous unplanned surgery were associated with a higher LR rate. The mean overall survival was 32 months (2.66 years), with higher rates seen in non-metastatic and non-recurrent settings. CONCLUSIONS: Early and accurate diagnosis is the key to the management of EES, with promising results seen with NACT and R0 resection. However, further studies with larger cohorts are needed to standardize the treatment protocol and evaluate its efficacy.

Keywords: Ewing sarcoma, extraskeletal, chemotherapy

Procedia PDF Downloads 60
558 Impact of Increased Radiology Staffing on After-Hours Radiology Reporting Efficiency and Quality

Authors: Peregrine James Dalziel, Philip Vu Tran

Abstract:

Objective / Introduction: Demand for radiology services from Emergency Departments (ED) continues to increase, with greater demands placed on radiology staff providing reports for the management of complex cases. Queuing theory indicates that wide variability of process time combined with the random nature of request arrivals increases the probability of significant queues. This can lead to delays in the time-to-availability of radiology reports (TTA-RR) and potentially impaired ED patient flow. In addition, the greater "cognitive workload" of higher volumes may lead to reduced productivity and increased errors. We sought to quantify the potential ED flow improvements obtainable from increased radiology providers serving 3 public hospitals in Melbourne, Australia, and to assess the potential productivity gains, quality improvement and cost-effectiveness of increased labor inputs. Methods & Materials: The Western Health Medical Imaging Department moved from single-resident coverage on weekend days (8:30 am-10:30 pm) to include a limited period of two-resident coverage (1 pm-6 pm) on both weekend days. The TTA-RR for weekend CT scans was calculated from the PACS database for the 8-month period symmetric about the date of the staffing change. A multivariate linear regression model was developed to isolate the improvement in TTA-RR between the two 4-month periods. The daily and hourly scan volume at the time of each CT scan was calculated to assess the impact of varying department workload. To assess any improvement in report quality/errors, a random sample of 200 studies was assessed to compare the average number of clinically significant over-read addendums to reports between the 2 periods. Cost-effectiveness was assessed by comparing the marginal cost of additional staffing against a conservative estimate of the economic benefit of improved ED patient throughput, using the Australian national insurance rebate for private ED attendance as a revenue proxy.
Results: The primary resident on call and the type of scan accounted for most of the explained variability in time to report availability (R²=0.29). Increasing daily and hourly volume was associated with increased TTA-RR (1.5 min (p<0.01) and 4.8 min (p<0.01), respectively, per additional scan ordered within each time frame). Reports were available 25.9 minutes sooner on average in the 4 months post-implementation of double coverage (p<0.01), with an additional 23.6-minute improvement when 2 residents were on-site concomitantly (p<0.01). The aggregate average improvement in TTA-RR was 24.8 hours per weekend day. This represents increased decision-making time available to ED physicians and a potential improvement in ED bed utilisation. 5% of reports from the intervention period contained clinically significant addendums vs 7% in the single-resident period, but this was not statistically significant (p=0.7). The marginal cost was less than the anticipated economic benefit, assuming a 50% capture of the improved TTA-RR in patient disposition and using the lowest available national insurance rebate as a proxy for economic benefit. Conclusion: TTA-RR improved significantly during the period of increased staff availability, both during the specific period of increased staffing and throughout the day. Increased labor utilisation is cost-effective given the potential improvement in productivity for ED cases requiring CT imaging.
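The regression design described above, isolating the staffing-period effect on TTA-RR while controlling for scan volume, can be sketched with ordinary least squares. All numbers below (coefficients, noise level, sample size) are invented to mimic the study's structure, not its data.

```python
import numpy as np

# Hypothetical data: TTA-RR (minutes) driven by a double-coverage
# indicator and the hourly scan volume, plus noise.
rng = np.random.default_rng(42)
n = 500
double = rng.integers(0, 2, n).astype(float)   # 1 = two residents on-site
volume = rng.poisson(5, n).astype(float)       # scans ordered that hour
tta = 60.0 - 25.0 * double + 4.8 * volume + rng.normal(0, 5, n)

# Ordinary least squares: recover the coverage effect net of workload.
X = np.column_stack([np.ones(n), double, volume])
beta, *_ = np.linalg.lstsq(X, tta, rcond=None)
coverage_effect, volume_effect = beta[1], beta[2]
```

The fitted coefficient on the coverage indicator estimates the minutes saved per report attributable to double coverage, holding scan volume fixed, which is the quantity the study's multivariate model targets.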

Keywords: workflow, quality, administration, CT, staffing

Procedia PDF Downloads 97
557 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events

Authors: Jaqueline Maria Ribeiro Vieira

Abstract:

Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be feasible and convenient with small images and little data, it becomes difficult and error-prone when large databases of images must be processed, since the patterns may differ across the image area depending on many characteristics (drilling strategy, rock components, rock strength, etc.). We previously developed and proposed a novel strategy, based on segmented images, capable of detecting patterns in borehole images that may point to regions with tension and breakout characteristics. In this work, we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. These classifiers allow the system, after a period of use in which parts of borehole images corresponding to tension regions and breakout areas are manually tagged, to indicate and suggest new candidate regions automatically, with higher accuracy. We suggest the use of different classification methods in order to obtain different knowledge data set configurations.
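The learn-from-manual-tags loop described above can be sketched with any simple classifier. The nearest-centroid model below is one minimal choice; the feature vectors and class names are hypothetical stand-ins for descriptors of segmented borehole-image curves, not the authors' actual features.

```python
import numpy as np

class NearestCentroid:
    """Minimal classifier sketch: label new feature vectors by the closest
    class centroid learned from manually tagged regions."""

    def fit(self, X, y):
        y = np.asarray(y)
        self.classes_ = sorted(set(y.tolist()))
        # one centroid per class, averaged over tagged examples
        self.centroids_ = {c: X[y == c].mean(axis=0) for c in self.classes_}
        return self

    def predict(self, X):
        return [min(self.classes_,
                    key=lambda c: np.linalg.norm(x - self.centroids_[c]))
                for x in np.atleast_2d(X)]
```

As operators tag more tension/breakout regions, refitting on the growing knowledge database sharpens the centroids and improves the automatic suggestions.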

Keywords: image segmentation, oil well visualization, classifiers, data mining, visual computing

Procedia PDF Downloads 286
556 Systematic Review and Meta-Analysis of Mid-Term Survival, and Recurrent Mitral Regurgitation for Robotic-Assisted Mitral Valve Repair

Authors: Ramanen Sugunesegran, Michael L. Williams

Abstract:

Over the past two decades, surgical approaches for mitral valve (MV) disease have evolved with the advent of minimally invasive techniques. The safety and efficacy of robotic mitral valve repair (RMVr) have been well documented; however, mid- to long-term data are limited. The aim of this review was to provide a comprehensive analysis of the available mid- to long-term data on RMVr. Electronic searches of five databases were performed to identify all relevant studies reporting minimum 5-year data on RMVr. The pre-defined primary outcomes of interest were overall survival, freedom from MV reoperation and freedom from moderate or worse mitral regurgitation (MR) at 5 years or more post-RMVr. A meta-analysis of proportions or means was performed, utilizing a random-effects model. Kaplan-Meier curves were aggregated using reconstructed individual patient data. Nine studies totaling 3,300 patients undergoing RMVr were identified. Rates of overall survival at 1, 5 and 10 years were 99.2%, 97.4% and 92.3%, respectively. Freedom from MV reoperation at 8 years post-RMVr was 95.0%. Freedom from moderate or worse MR at 7 years was 86.0%. Rates of early post-operative complications were low, with only 0.2% all-cause mortality and 1.0% cerebrovascular accident. Reoperation for bleeding was low at 2.2%, and RMVr was successful in 99.8% of cases. Mean intensive care unit and hospital stays were 22.4 hours and 5.2 days, respectively. RMVr is a safe procedure with low rates of early mortality and other complications, and can be performed with low complication rates in high-volume, experienced centers. Evaluation of the available mid-term data post-RMVr suggests favorable rates of overall survival, freedom from MV reoperation and freedom from recurrence of moderate or worse MR.
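The random-effects pooling mentioned above is commonly done with the DerSimonian-Laird estimator: study-level effects are first combined with fixed-effect weights, the excess heterogeneity Q is converted into a between-study variance τ², and the effects are re-pooled with inflated weights. The sketch below shows that arithmetic on invented inputs; real meta-analyses of proportions typically transform them (e.g. logit or Freeman-Tukey) before pooling.

```python
def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling of study-level effects
    with their within-study variances. Returns (pooled effect, tau^2)."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    Q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - df) / C)                          # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2
```

When the studies agree, τ² collapses to zero and the random-effects result coincides with the fixed-effect one; heterogeneous studies inflate τ² and pull the weights toward equality.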

Keywords: mitral valve disease, mitral valve repair, robotic cardiac surgery, robotic mitral valve repair

Procedia PDF Downloads 71
555 Rethinking Urban Floodplain Management: The Case of Colombo, Sri Lanka

Authors: Malani Herath, Sohan Wijesekera, Jagath Munasingha

Abstract:

The impact of recent floods has become significant, and extraordinary flood events cause considerable damage to lives, properties and the environment, and negatively affect the development of the whole Colombo urban region. Even though the Colombo urban region experiences recurrent flood impacts, several spatial planning interventions have been made from time to time since the early 20th century. All past plans have adopted a traditional approach to flood management, using infrastructural measures to reduce the chance of flooding together with rigid planning regulations. The existing flood risk management practices are not accepted by the local community, particularly the urban poor. Researchers have consistently reported differences between the flood risk estimations, priorities and concerns of experts and those of the local community. Risk-based decision-making in flood management is not only a matter of technical facts; it has a significant bearing on how flood risk is viewed by the local community and individuals. Moreover, sustainable flood management is an integrated approach, which requires joint action by experts and the community. This indicates the necessity of further societal discussion on the acceptable level of flood risk indicators in order to prioritize and identify appropriate flood management measures in Colombo. Local people's understanding and evaluation of flood risk are important to integrate into the decision-making process. This research examines the gap between the level of flood risk acceptable to spatial planners and that acceptable to the local communities in Colombo. A comprehensive literature review was conducted to prepare a framework for analyzing public perception in Colombo. This research identifies the factors that drive the variation in flood risk and acceptable levels for both the local community and the planning authorities.

Keywords: Colombo basin, public perception, urban flood risk, multi-criteria analysis

Procedia PDF Downloads 297
554 Web and Android-Based Applications as a Breakthrough in Preventing Non-System Fault Disturbances Due to Work Errors in the Transmission Unit

Authors: Dhany Irvandy, Ary Gemayel, Mohammad Azhar, Leidenti Dwijayanti, Iif Hafifah

Abstract:

Work safety is among the most important aspects of work execution. Eliminating unsafe conditions and actions is a priority for accident prevention in the world of work, especially in the operation and maintenance of electric power transmission. Given the scope of the work, operational work in transmission carries a very high safety risk. Various efforts have been made to avoid work accidents; however, accidents and disturbances caused by non-conformities in work implementation, whether due to unsafe conditions or unsafe actions, still occur frequently. Along with the development of technology, website-based and mobile applications have come into wide use as a medium for monitoring work in real time and by more people. This paper describes the use of web and Android-based applications to monitor work and work processes in the field in order to prevent work accidents and non-system fault disturbances caused by non-conformity of the work implementation with predetermined work instructions. Because every job is monitored in real time, time-stamped and documented systematically, this application can reduce the occurrence of unsafe actions by job executors that could cause disruptions or work accidents.

Keywords: work safety, unsafe action, application, non-system fault, real-time

Procedia PDF Downloads 9
553 Quantitative Research on the Effects of Following Brands on Twitter on Consumer Brand Attitude

Authors: Yujie Wei

Abstract:

Twitter uses a variety of narrative methods (e.g., messages, featured videos, music, and actual events) to strengthen its cultivation effect. Consumers receive mass-produced brand stories and images made by brand managers according to strict market specifications. Drawing on cultivation theory, this quantitative research investigates how following a brand on Twitter for 12 weeks can cultivate consumers' attitude toward the brand and influence their purchase intentions. We conducted three field experiments on Twitter to test the cultivation effects of following a brand for 12 weeks on consumer attitude toward the followed brand. The cultivation effects were measured by comparing the changes in consumer attitudes before and after following a brand over time. The findings of our experiments suggest that when consumers are exposed to a brand's stable, pervasive, and recurrent tweets on Twitter for 12 weeks, their attitude toward the brand can change significantly, which confirms the cultivating effect on consumer attitude. The results also indicate that branding activities on Twitter, when properly implemented, can be very effective in changing consumer attitudes toward a brand, increasing purchase intentions, and increasing consumers' willingness to spread word-of-mouth for the brand on social media. The cultivation effects are moderated by brand type and consumer age. The research offers three major marketing implications. First, Twitter marketers should create unique content that engages their brand followers, changing their brand attitude through steady, cumulative exposure to branding activities on Twitter. Second, there is a significant moderating effect of brand type on the cultivation effects, so Twitter marketers should align their branding content with the brand type to better meet the needs and wants of consumers for different types of brands. Finally, Twitter marketers should adapt their tweeting strategies to the media consumption preferences of the different age groups in their target markets. This empirical research proves that content is king.

Keywords: tweeting, cultivation theory, consumer brand attitude, purchase intentions, word-of-mouth

Procedia PDF Downloads 95
552 A Mixed Expert Evaluation System and Dynamic Interval-Valued Hesitant Fuzzy Selection Approach

Authors: Hossein Gitinavard, Mohammad Hossein Fazel Zarandi

Abstract:

In recent decades, concerns about environmental issues have led to professional and academic work on green supplier selection problems. One of the main issues in evaluating green supplier selection problems, and one that can increase the uncertainty, is the preference expressed in the experts' judgments about the candidate green suppliers. Therefore, it makes sense to prepare an expert system that evaluates the problem based on historical data and the experts' knowledge. This study provides an expert evaluation system to assess the candidate green suppliers under selected criteria in a multi-period approach. In addition, a ranking approach under an interval-valued hesitant fuzzy set (IVHFS) environment is proposed to select the most appropriate green supplier over the planning horizon. In the proposed ranking approach, the IVHFS representation and a last-aggregation approach are employed to limit errors and to prevent data loss, respectively. Finally, a comparative analysis based on an illustrative example is provided to show the feasibility of the proposed approach.

Keywords: green supplier selection, expert system, ranking approach, interval-valued hesitant fuzzy sets

Procedia PDF Downloads 313