Search results for: incomplete count data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 275


275 A Pragmatic Reading of the Verb "Kana" and Its Meanings

Authors: Manal M. H. Said Najjar

Abstract:

Arab grammarians stood at variance with regard to the definition of kana (which roughly equals was/were, the past form of "be" in English). Kana was considered a verb, a particle, or a quasi-verb by different scholars; others saw it as an auxiliary verb, while some categorized kana as one of the incomplete verbs (Afa'al naqisa) based on two different claims: first, a considerable group of grammarians saw kana as fie'l naqis, or an incomplete verb, since it indicates time but not the event or action itself; second, kana requires a predicate (xabar) to complete its meaning, i.e., it does not suffice by itself with a noun in the nominal sentence. This study argues that categorizing the verb kana as fie'l naqis, or an incomplete verb, is inaccurate and confusing, since the term "incomplete" does not agree with its characteristics, meanings, and temporal indications. Moreover, interpreting kana as a past verb is also inaccurate. Kana كان (derived from the absolute action of being كون) is considered unique and the most comprehensive verb, encompassing all tenses of the past, present, and future within the dimensions of continuity and eternity of all possible actions under "being".

Keywords: pragmatics, kana, context, Arab grammarians, meaning, fie’l naqis

Procedia PDF Downloads 53
274 Survival Data with Incomplete Missing Categorical Covariates

Authors: Madaki Umar Yusuf, Mohd Rizam B. Abubakar

Abstract:

Survival censored data with incomplete covariate information are a common occurrence in many studies in which the outcome is survival time. When the missing covariates are categorical, a useful technique for obtaining parameter estimates is the EM algorithm by the method of weights. The survival outcome for the class of generalized linear models is applied, and this method requires estimating the parameters of the distribution of the covariates. In this paper, we consider data from clinical trials with five covariates, four of which have some missing values that clearly show they were fully censored data.
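The "EM by the method of weights" idea can be sketched on a toy model: each subject with a missing categorical covariate contributes one weighted pseudo-observation per category, with weights equal to posterior membership probabilities, and the M-step is a weighted maximum-likelihood update. The model below (binary covariate X, outcome Y | X ~ N(mu_X, 1), simulated data) is an illustrative assumption, not the paper's survival model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate: binary covariate X, outcome Y ~ N(mu_X, 1); 30% of X missing at random.
n = 2000
X = rng.binomial(1, 0.4, n).astype(float)
Y = rng.normal(np.where(X == 1, 2.0, 0.0), 1.0)
X_obs = X.copy()
X_obs[rng.random(n) < 0.3] = np.nan

def normal_pdf(y, mu):
    return np.exp(-0.5 * (y - mu) ** 2) / np.sqrt(2 * np.pi)

# EM by the method of weights: subjects with missing X contribute two weighted
# pseudo-observations (X=0 and X=1); weights are posterior probabilities P(X=1 | Y).
p, mu0, mu1 = 0.5, -1.0, 1.0
for _ in range(200):
    # E-step: weight = P(X=1 | Y) where X is missing, observed X otherwise.
    w = np.where(np.isnan(X_obs),
                 p * normal_pdf(Y, mu1) /
                 (p * normal_pdf(Y, mu1) + (1 - p) * normal_pdf(Y, mu0)),
                 X_obs)
    # M-step: weighted maximum-likelihood updates.
    p = w.mean()
    mu1 = (w * Y).sum() / w.sum()
    mu0 = ((1 - w) * Y).sum() / (1 - w).sum()
```

With the simulated truth (p = 0.4, mu0 = 0, mu1 = 2), the weighted updates recover the parameters despite the missing covariate values.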

Keywords: EM algorithm, incomplete categorical covariates, ignorable missing data, missing at random (MAR), Weibull Distribution

Procedia PDF Downloads 372
273 Structural Behavior of Incomplete Box Girder Bridges Subjected to Unpredicted Loads

Authors: E. H. N. Gashti, J. Razzaghi, K. Kujala

Abstract:

In general, codes and regulations consider seismic loads only for completed bridge structures, while evaluation of incomplete bridge structures, especially those constructed by the free cantilever method, under these loads is also of great importance. Hence, this research studied the behavior of the incomplete structure of a common bridge type (the box girder bridge) during the construction phase under vertical seismic loads, and the paper provides suitable guidelines and solutions to withstand this destructive phenomenon. The results proved that the use of preventive methods can significantly reduce the stresses resulting from vertical seismic loads in box cross-sections to an acceptable range recommended by design codes.

Keywords: box girder bridges, prestress loads, free cantilever method, seismic loads, construction phase

Procedia PDF Downloads 310
272 Zero Cross-Correlation Codes Based on Balanced Incomplete Block Design: Performance Analysis and Applications

Authors: Garadi Ahmed, Boubakar S. Bouazza

Abstract:

The Zero Cross-Correlation (C, w) code is a family of binary sequences of length C and constant Hamming weight w, in which the cross-correlation between any two sequences equals zero. In this paper, we evaluate the performance of a ZCC code based on Balanced Incomplete Block Design (BIBD) for a Spectral Amplitude Coding Optical Code Division Multiple Access (SAC-OCDMA) system using direct detection. The BER obtained is better than 10^-9 for five simultaneous users.
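The zero cross-correlation property can be illustrated in a few lines: with constant-weight codewords of disjoint support, every pairwise inner product vanishes. The toy (C, w) = (6, 2) code below is for illustration only and is not the BIBD construction of the paper:

```python
import numpy as np

# A toy ZCC(C, w) code: binary codewords of length C = 6, constant weight w = 2,
# built from disjoint supports so every pairwise cross-correlation is zero.
codes = np.array([
    [1, 1, 0, 0, 0, 0],
    [0, 0, 1, 1, 0, 0],
    [0, 0, 0, 0, 1, 1],
])

weights = codes.sum(axis=1)   # Hamming weight of each codeword
corr = codes @ codes.T        # Gram matrix: off-diagonal entries = cross-correlations

# Constant weight w on the diagonal, zero everywhere else.
assert np.all(weights == 2)
assert np.all(corr[~np.eye(len(codes), dtype=bool)] == 0)
```

In SAC-OCDMA terms, the zero off-diagonal entries are what suppress multiple-access interference between simultaneous users.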

Keywords: spectral amplitude coding-optical code-division-multiple-access (SAC-OCDMA), phase induced intensity noise (PIIN), balanced incomplete block design (BIBD), zero cross-correlation (ZCC)

Procedia PDF Downloads 335
271 A PROMETHEE-BELIEF Approach for Multi-Criteria Decision Making Problems with Incomplete Information

Authors: H. Moalla, A. Frikha

Abstract:

Multi-criteria decision aid methods consider decision problems in which numerous alternatives are evaluated on several criteria, and they typically assume perfect information. In practice, however, this information requirement is too strict: the imperfect data provided by more or less reliable decision makers usually affect decision results, since any decision is closely linked to the quality and availability of information. In this paper, a PROMETHEE-BELIEF approach is proposed to support multi-criteria decisions based on incomplete information. The approach handles problems with an incomplete decision matrix and unknown weights within the PROMETHEE method. On the basis of belief function theory, it first determines the distributions of belief masses based on PROMETHEE's net flows and then calculates weights. Subsequently, it aggregates the mass distributions associated with each criterion using Murphy's modified combination rule in order to infer a global belief structure. The final action ranking is obtained via the pignistic probability transformation. A real-world case study concerning the location of a treatment center for healthcare waste with infectious risk in central Tunisia illustrates the detailed process of the PROMETHEE-BELIEF approach.
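The PROMETHEE net flows from which the belief masses are built can be sketched as follows; this is a generic PROMETHEE II computation with the "usual" (step) preference function and illustrative data, not the paper's case study:

```python
import numpy as np

# Toy decision matrix: 4 alternatives x 3 criteria (all to maximize), equal weights.
A = np.array([
    [8.0, 7.0, 2.0],
    [5.0, 3.0, 7.0],
    [7.0, 5.0, 6.0],
    [3.0, 5.0, 8.0],
])
weights = np.array([1/3, 1/3, 1/3])

n = len(A)
# Usual preference function: P_j(a, b) = 1 if a beats b on criterion j, else 0.
pref = np.zeros((n, n))
for a in range(n):
    for b in range(n):
        pref[a, b] = np.sum(weights * (A[a] > A[b]))

phi_plus = pref.sum(axis=1) / (n - 1)   # how strongly each alternative dominates
phi_minus = pref.sum(axis=0) / (n - 1)  # how strongly it is dominated
phi_net = phi_plus - phi_minus          # PROMETHEE II net flow -> complete ranking
ranking = np.argsort(-phi_net)
```

The net flows sum to zero by construction; in the PROMETHEE-BELIEF approach they would then feed the construction of belief mass distributions rather than be used directly.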

Keywords: belief function theory, incomplete information, multiple criteria analysis, PROMETHEE method

Procedia PDF Downloads 128
270 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance

Authors: Loai AbdAllah, Mahmoud Kaiyal

Abstract:

Missing values in real-world datasets are a common problem. Many algorithms have been developed to deal with this problem; most of them replace the missing values with a fixed value computed from the observed values. In our work, we used a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, while, when one of them is missing, the distance is computed based on the distribution of the known values for the coordinate that contains the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps to improve prevention of chronic diseases such as diabetes and cancer. For Wikaya's recommendation system to work, distances between users need to be measured; since there are missing values in the collected data, a distance function over incomplete user profiles is needed. To evaluate the accuracy of the proposed distance function in reflecting the actual similarity between objects when some of them contain missing values, we integrated it within the framework of a k-nearest-neighbors (kNN) classifier, since its computation is based only on the similarity between objects. To validate this, we ran the algorithm over the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that the kNN classifier using our proposed distance function outperforms kNN using other existing methods.
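A simplified sketch of a distance that distinguishes known from unknown values, in the spirit described above: known-known coordinates use a squared difference (standing in for the Mahalanobis term), and coordinates with a missing value use the expected squared difference under the observed distribution of that coordinate. The data and the per-coordinate treatment are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=(200, 3))
data[rng.random(data.shape) < 0.2] = np.nan   # inject 20% missing values

# Per-coordinate statistics of the observed values.
mu = np.nanmean(data, axis=0)
var = np.nanvar(data, axis=0)

def dist2(x, y):
    """Squared distance that distinguishes known from unknown coordinates.

    Both known   -> ordinary squared difference.
    One missing  -> expected squared difference to the observed distribution:
                    E[(x - Z)^2] = (x - mu)^2 + var.
    Both missing -> expected squared difference of two independent draws: 2 * var.
    """
    d = 0.0
    for j in range(len(x)):
        xk, yk = not np.isnan(x[j]), not np.isnan(y[j])
        if xk and yk:
            d += (x[j] - y[j]) ** 2
        elif xk:
            d += (x[j] - mu[j]) ** 2 + var[j]
        elif yk:
            d += (y[j] - mu[j]) ** 2 + var[j]
        else:
            d += 2 * var[j]
    return d
```

Because `dist2` returns a plain scalar, it plugs directly into a kNN classifier that ranks neighbors by pairwise distance, which is how the evaluation above is set up.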

Keywords: missing values, incomplete data, distance, incomplete diabetes data

Procedia PDF Downloads 183
269 Managing Incomplete PSA Observations in Prostate Cancer Data: Key Strategies and Best Practices for Handling Loss to Follow-Up and Missing Data

Authors: Madiha Liaqat, Rehan Ahmed Khan, Shahid Kamal

Abstract:

Multiple imputation with delta adjustment is a versatile and transparent technique for addressing univariate missing data in the presence of various missing mechanisms. This approach allows for the exploration of sensitivity to the missing-at-random (MAR) assumption. In this review, we outline the delta-adjustment procedure and illustrate its application for assessing the sensitivity to deviations from the MAR assumption. By examining diverse missingness scenarios and conducting sensitivity analyses, we gain valuable insights into the implications of missing data on our analyses, enhancing the reliability of our study's conclusions. In our study, we focused on assessing logPSA, a continuous biomarker in incomplete prostate cancer data, to examine the robustness of conclusions against plausible departures from the MAR assumption. We introduced several approaches for conducting sensitivity analyses, illustrating their application within the pattern mixture model (PMM) under the delta adjustment framework. This proposed approach effectively handles missing data, particularly loss to follow-up.
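The delta-adjustment procedure can be sketched in a few lines: impute under MAR from a fitted model, then shift the imputed values by delta and re-estimate, so that varying delta probes sensitivity to departures from MAR. Everything below (the linear model, the simulated logPSA-like data, the choice delta = -0.5) is an illustrative assumption, not the study's analysis:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: a logPSA-like outcome depends on age; ~30% of values are
# missing, mimicking loss to follow-up.
n = 500
age = rng.uniform(50, 80, n)
logpsa = 0.03 * age + rng.normal(0, 0.3, n)
missing = rng.random(n) < 0.3
obs = ~missing

def delta_adjusted_imputation(delta, m=20):
    """MAR imputation from a linear model, then shift imputed values by delta."""
    b1, b0 = np.polyfit(age[obs], logpsa[obs], 1)
    resid_sd = np.std(logpsa[obs] - (b0 + b1 * age[obs]))
    means = []
    for _ in range(m):                        # m imputed datasets
        imp = logpsa.copy()
        draw = b0 + b1 * age[missing] + rng.normal(0, resid_sd, missing.sum())
        imp[missing] = draw + delta           # delta = 0 reproduces the MAR analysis
        means.append(imp.mean())
    return float(np.mean(means))              # pooled point estimate

mar_mean = delta_adjusted_imputation(0.0)
mnar_mean = delta_adjusted_imputation(-0.5)   # sensitivity: dropouts 0.5 below MAR prediction
```

Sweeping delta over a grid and re-running the substantive analysis at each value is the pattern-mixture sensitivity analysis described above.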

Keywords: loss to follow-up, incomplete response, multiple imputation, sensitivity analysis, prostate cancer

Procedia PDF Downloads 47
268 Electrochemical Regeneration of GIC Adsorbent in a Continuous Electrochemical Reactor

Authors: S. N. Hussain, H. M. A. Asghar, H. Sattar, E. P. L. Roberts

Abstract:

Arvia™ introduced a novel technology consisting of adsorption followed by electrochemical regeneration with a graphite intercalation compound (GIC) adsorbent, taking place in a single unit. The adsorbed species may lead to the formation of intermediate by-products due to incomplete mineralization during electrochemical regeneration. Therefore, the investigation of breakdown products due to incomplete oxidation is of great concern for the commercial application of this process. In the present paper, the formation of chlorinated breakdown products during the continuous process of adsorption and electrochemical regeneration based on a GIC adsorbent has been investigated.

Keywords: GIC, adsorption, electrochemical regeneration, chlorophenols

Procedia PDF Downloads 268
267 The Effect of Size and Tumor Depth on Histological Clearance Margins of Basal Cell Carcinomas

Authors: Martin Van, Mohammed Javed, Sarah Hemington-Gorse

Abstract:

Aim: Our aim was to determine the effect of size and tumor depth of basal cell carcinomas (BCCs) on surgical margin clearance. Methods: A retrospective study was conducted at the Welsh Centre for Burns and Plastic Surgery (WCBPS), Morriston Hospital, between 1 January 2016 and 31 July 2016. Only patients with BCC confirmed on histopathological analysis were included. Patient data including anatomical region treated, lesion size, histopathological clearance margins, and histological sub-types were recorded. An independent t-test was performed to determine statistical significance. Results: A total of 228 BCCs were excised in 160 patients. Eleven lesions (4.8%) were incompletely excised. The nose area had the highest rate of incomplete excision. The mean diameter of incompletely excised lesions was 11.4 mm vs. 11.5 mm for completely excised lesions (p=0.959), and the mean histological depth of incompletely excised lesions was 4.1 mm vs. 2.5 mm for completely excised BCCs (p < 0.05). Conclusions: A BCC tumor depth of > 4.1 mm was associated with a high rate of incomplete margin clearance. Hence, in prospective patients, a BCC tumor depth > 4 mm on tissue biopsy should alert the surgeon to a potentially higher risk of incomplete excision of the lesion.
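The group comparison above can be reproduced in outline with Welch's t statistic (the unequal-variance form of the independent t-test). The simulated depths below merely echo the reported group means (4.1 vs 2.5 mm) and group sizes; they are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical tumor depths (mm): 11 incompletely excised vs 217 completely
# excised lesions, centred on the reported group means. Simulated, not real data.
incomplete = rng.normal(4.1, 1.0, 11)
complete = rng.normal(2.5, 1.0, 217)

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

t_stat = welch_t(incomplete, complete)
```

A t statistic well above 2 with these group sizes corresponds to the p < 0.05 reported for the depth comparison; the diameter comparison (11.4 vs 11.5 mm) would give a t statistic near zero.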

Keywords: basal cell carcinoma, excision margins, plastic surgery, treatment

Procedia PDF Downloads 205
266 [Keynote Talk]: Evidence Fusion in Decision Making

Authors: Mohammad Abdullah-Al-Wadud

Abstract:

In the current era of automation and artificial intelligence, systems increasingly depend on the decision-making capabilities of machines. Such systems and applications range from simple classifiers to sophisticated surveillance systems based on traditional sensors and related equipment, which are becoming more common in the Internet of Things (IoT) paradigm. However, the available data for such problems are usually imprecise and incomplete, which leads to uncertainty in decisions made by traditional probability-based classifiers. This requires a robust fusion framework to combine the available information sources with some degree of certainty. The theory of evidence provides such a method for combining evidence from different (possibly unreliable) sources or observers. This talk will address the employment of the Dempster-Shafer theory of evidence in some practical applications.
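Dempster's rule of combination, the core operation of the evidence-fusion framework mentioned above, can be sketched directly; the two-hypothesis sensor example is illustrative:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for mass functions keyed by frozenset focal elements."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb   # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    # Normalize by 1 - conflict so the fused masses sum to one.
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two unreliable sensors reporting on the same frame {'ok', 'fault'};
# mass on the full frame encodes each sensor's ignorance.
s1 = {frozenset({'ok'}): 0.6, frozenset({'ok', 'fault'}): 0.4}
s2 = {frozenset({'fault'}): 0.3, frozenset({'ok', 'fault'}): 0.7}
fused = dempster_combine(s1, s2)
```

Here the conflict mass is 0.6 * 0.3 = 0.18, so the fused belief in 'ok' becomes 0.42 / 0.82, with the remainder split between 'fault' and residual ignorance.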

Keywords: decision making, dempster-shafer theory, evidence fusion, incomplete data, uncertainty

Procedia PDF Downloads 387
265 A Review of Methods for Handling Missing Data in the Form of Dropouts in Longitudinal Clinical Trials

Authors: A. Satty, H. Mwambi

Abstract:

Much clinical-trials research is characterized by the unavoidable problem of dropout as a result of missing or erroneous values. This paper reviews some of the various techniques that address dropout problems in longitudinal clinical trials. The fundamental concepts of dropout patterns and mechanisms are discussed. The study presents five general techniques for handling dropout: (1) deletion methods; (2) imputation-based methods; (3) data augmentation methods; (4) likelihood-based methods; and (5) MNAR-based methods. Under each technique, several methods commonly used to deal with dropout are presented, including a review of the existing literature in which we examine the effectiveness of these methods in the analysis of incomplete data. Two application examples are presented to study the potential strengths and weaknesses of some of the methods under certain dropout mechanisms and to assess the sensitivity of the modelling assumptions.
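Two of the technique families above, deletion and simple imputation, can be contrasted on a toy monotone-dropout panel; last observation carried forward (LOCF) is shown purely to illustrate how the choice of method shifts the estimate (it is generally discouraged in modern practice):

```python
import numpy as np

# Toy longitudinal trial: 6 patients x 4 visits; NaN marks dropout (monotone missing).
y = np.array([
    [5.0, 4.5, 4.0, 3.8],
    [6.0, 5.5, np.nan, np.nan],    # drops out after visit 2
    [5.5, 5.0, 4.6, 4.2],
    [4.8, np.nan, np.nan, np.nan], # drops out after visit 1
    [6.2, 5.8, 5.5, np.nan],
    [5.1, 4.9, 4.4, 4.0],
])

# (1) Deletion: complete-case analysis keeps only fully observed patients.
complete_cases = y[~np.isnan(y).any(axis=1)]

# (2) Imputation: LOCF fills each missing visit with the patient's last recorded value.
locf = y.copy()
for i in range(locf.shape[0]):
    for t in range(1, locf.shape[1]):
        if np.isnan(locf[i, t]):
            locf[i, t] = locf[i, t - 1]

cc_final_mean = complete_cases[:, -1].mean()
locf_final_mean = locf[:, -1].mean()
```

On this declining-trajectory panel the two approaches give different final-visit means, which is exactly the kind of method sensitivity the review's application examples probe.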

Keywords: incomplete longitudinal clinical trials, missing at random (MAR), imputation, weighting methods, sensitivity analysis

Procedia PDF Downloads 377
264 Incomplete Existing Algebra to Support Mathematical Computations

Authors: Ranjit Biswas

Abstract:

The existing subject of Algebra is incomplete for supporting the mathematical computations done by scientists in all areas: mathematics, physics, statistics, chemistry, space science, cosmology, etc., even starting from the era of Einstein. A huge hidden gap in the subject 'Algebra' is unearthed. All scientists today, including mathematicians, physicists, chemists, statisticians, cosmologists, space scientists, and economists, are lucky that they obtained results without facing any contradictions or computational errors. Most surprising is that the results of all scientists, including Nobel Prize winners, were also verified by experiment. But in this paper, it is rigorously justified that they all are lucky. An algebraist can define an infinite number of new algebraic structures. The objective of this work is not just to define a distinct algebraic structure, but to recognize and identify a major gap of the subject 'Algebra' lying hidden so far in its existing vast literature, and to fix the unearthed gap. Consequently, a different algebraic structure called 'Region' is introduced, and its properties are studied.

Keywords: region, ROR, RORR, region algebra

Procedia PDF Downloads 6
263 Rehabilitation Robot in Primary Walking Pattern Training for SCI Patient at Home

Authors: Taisuke Sakaki, Toshihiko Shimokawa, Nobuhiro Ushimi, Koji Murakami, Yong-Kwun Lee, Kazuhiro Tsuruta, Kanta Aoki, Kaoru Fujiie, Ryuji Katamoto, Atsushi Sugyo

Abstract:

Recently, attention has been focused on incomplete spinal cord injuries (SCI) of the central spine caused by pressure on parts of the white-matter conduction pathway, such as the pyramidal tract. In this paper, we focus on a training robot designed to assist with primary walking-pattern training. The target patients for this robot are relearning the basic functions of the usual walking pattern; it is meant especially for those with incomplete SCI of the central spine who are capable of standing by themselves but not of performing walking motions. From the perspective of human engineering, we monitored the operator's interaction with the robot and investigated the movement of the joints of the lower extremities, the circumference of the lower extremities, and the exercise intensity with the machine. The concept of the device is to provide mild training without any sudden changes in heart rate or blood pressure, which will be particularly useful for the elderly and disabled. The mechanism of the robot is kept simple and lightweight with the expectation that it will be used at home.

Keywords: training, rehabilitation, SCI patient, welfare, robot

Procedia PDF Downloads 394
262 Imputation of Incomplete Large-Scale Monitoring Count Data via Penalized Estimation

Authors: Mohamed Dakki, Genevieve Robin, Marie Suet, Abdeljebbar Qninba, Mohamed A. El Agbani, Asmâa Ouassou, Rhimou El Hamoumi, Hichem Azafzaf, Sami Rebah, Claudia Feltrup-Azafzaf, Nafouel Hamouda, Wed a.L. Ibrahim, Hosni H. Asran, Amr A. Elhady, Haitham Ibrahim, Khaled Etayeb, Essam Bouras, Almokhtar Saied, Ashrof Glidan, Bakar M. Habib, Mohamed S. Sayoud, Nadjiba Bendjedda, Laura Dami, Clemence Deschamps, Elie Gaget, Jean-Yves Mondain-Monval, Pierre Defos Du Rau

Abstract:

In biodiversity monitoring, large datasets are becoming more and more widely available and are increasingly used globally to estimate species trends and conservation status. These large-scale datasets challenge existing statistical analysis methods, many of which are not adapted to their size, incompleteness and heterogeneity. The development of scalable methods to impute missing data in incomplete large-scale monitoring datasets is crucial to balance sampling in time or space and thus better inform conservation policies. We developed a new method based on penalized Poisson models to impute and analyse incomplete monitoring data in a large-scale framework. The method allows parameterization of (a) space and time factors, (b) the main effects of predictor covariates, as well as (c) space–time interactions. It also benefits from robust statistical and computational capability in large-scale settings. The method was tested extensively on both simulated and real-life waterbird data, with the findings revealing that it outperforms six existing methods in terms of missing data imputation errors. Applying the method to 16 waterbird species, we estimated their long-term trends for the first time at the entire North African scale, a region where monitoring data suffer from many gaps in space and time series. This new approach opens promising perspectives to increase the accuracy of species-abundance trend estimations. We made it freely available in the R package 'lori' (https://CRAN.R-project.org/package=lori) and recommend its use for large-scale count data, particularly in citizen science monitoring programmes.
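The imputation idea can be sketched with a much-simplified main-effects Poisson model, log(lambda_ij) = site_i + year_j, fitted on the observed cells and used to fill the gaps. This stands in for, and is far simpler than, the penalized low-rank model implemented in 'lori'; the data are simulated:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical site-by-year waterbird counts with unsurveyed (missing) cells.
n_sites, n_years = 12, 10
site_eff = rng.normal(2.0, 0.5, n_sites)
year_eff = rng.normal(0.0, 0.3, n_years)
lam_true = np.exp(site_eff[:, None] + year_eff[None, :])
counts = rng.poisson(lam_true).astype(float)
mask = rng.random(counts.shape) < 0.25        # 25% of site-year cells missing
counts[mask] = np.nan
obs = ~np.isnan(counts)
y = np.where(obs, counts, 0.0)

# Fit log(lambda_ij) = a_i + b_j on the observed cells by alternating exact
# coordinate updates of the Poisson likelihood (a main-effects stand-in for
# the penalized low-rank model of 'lori').
a = np.zeros(n_sites)
b = np.zeros(n_years)
for _ in range(100):
    a = np.log(y.sum(axis=1) / (obs * np.exp(b)[None, :]).sum(axis=1))
    b = np.log(y.sum(axis=0) / (obs * np.exp(a)[:, None]).sum(axis=0))

# Impute the missing cells with their fitted means.
fitted = np.exp(a[:, None] + b[None, :])
imputed = fitted[mask]
rel_err = np.abs(imputed - lam_true[mask]) / lam_true[mask]
```

Balancing the site-by-year table in this way is what allows trend estimates over regions and years with uneven survey coverage.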

Keywords: biodiversity monitoring, high-dimensional statistics, incomplete count data, missing data imputation, waterbird trends in North Africa

Procedia PDF Downloads 112
261 Ponticuli of Atlas Vertebra: A Study in South Coastal Region of Andhra Pradesh

Authors: Hema Lattupalli

Abstract:

Introduction: A bony bridge extending from the lateral mass of the atlas to the posteromedial margin of the vertebral artery groove is termed the posterior bridge of the atlas, or posterior ponticulus. The foramen formed by the bridge is called the arcuate foramen, or retroarticulare superior. Another bony bridge sometimes extends laterally from the lateral mass to the posterior root of the transverse foramen, forming an additional groove for the vertebral artery above and behind the foramen transversarium, called the lateral bridge or ponticulus lateralis. When both posterior and lateral bridges are present together, they are called posterolateral ponticuli. Aim and Objectives: The aim of the present study was to detect the presence of such bridges or ponticuli (lateral, posterior, and posterolateral), as reported by earlier investigators, in atlas vertebrae. Material and Methods: The study was done on 100 atlas vertebrae collected over a period of 2 years from the Department of Anatomy, Narayana Medical College, Nellore, and from SVIMS, Tirupati. The parameters studied included the presence of ponticuli, complete and incomplete, and on the right and left sides. The vertebrae were observed for all these parameters, and the results were documented and photographed. Results: Ponticuli were observed in 25 (25%) of the atlas vertebrae. Posterior ponticuli were found in 16 (16%), lateral in 1 (1%), and posterolateral in 8 (8%) of the atlas vertebrae. Complete ponticuli were present in 9 (9%) and incomplete ponticuli in 16 (16%). Bilateral ponticuli were seen in 10 (10%) and unilateral ponticuli in 15 (15%). Right-sided ponticuli were seen in 4 (4%) and left-sided ponticuli in 5 (5%), respectively. Interpretation and Conclusion: In the present study, complete posterior ponticuli were more common than complete lateral ponticuli, and bilateral incomplete posterior ponticuli were the most frequently observed.
The present study shows that knowledge of the normal anatomy and variations of the atlas vertebra is essential to neurosurgeons, conveying the message that utmost care is needed when performing surgeries in the craniovertebral region. This is additional information for anatomists, neurosurgeons, and radiologists, and adds an extra page to the literature.

Keywords: atlas vertebra, ponticuli, posterior arch, arcuate foramen

Procedia PDF Downloads 337
260 A Multivariate 4/2 Stochastic Covariance Model: Properties and Applications to Portfolio Decisions

Authors: Yuyang Cheng, Marcos Escobar-Anel

Abstract:

This paper introduces a multivariate 4/2 stochastic covariance process generalizing the one-dimensional counterparts presented in Grasselli (2017). Our construction permits stochastic correlation not only among stocks but also among volatilities, also known as co-volatility movements, both driven by more convenient 4/2 stochastic structures. The parametrization is flexible enough to separate these types of correlation, permitting their individual study. Conditions for proper changes of measure and closed-form characteristic functions under risk-neutral and historical measures are provided, allowing for applications of the model to risk management and derivative pricing. We apply the model to an expected utility theory problem in incomplete markets. Our analysis leads to closed-form solutions for the optimal allocation and value function. Conditions are provided for well-defined solutions together with a verification theorem. Our numerical analysis highlights and separates the impact of key statistics on equity portfolio decisions, in particular, volatility, correlation, and co-volatility movements, with the latter being the least important in an incomplete market.
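A one-dimensional sketch of the 4/2 volatility structure (volatility loading a*sqrt(v) + b/sqrt(v) on a CIR variance factor v, mixing the Heston 1/2 and the 3/2 components) can be simulated with a full-truncation Euler scheme. Parameters are illustrative, and this omits the multivariate covariance structure that is the paper's contribution:

```python
import numpy as np

rng = np.random.default_rng(5)

# Illustrative parameters (not from the paper).
T, n = 1.0, 2000
dt = T / n
kappa, theta, sigma_v, v0 = 3.0, 0.04, 0.3, 0.04   # CIR variance factor
a42, b42 = 0.9, 0.004                               # 4/2 loading coefficients
mu, s0, rho = 0.05, 100.0, -0.5                     # drift, spot, leverage correlation

v = np.empty(n + 1); v[0] = v0
s = np.empty(n + 1); s[0] = s0
for t in range(n):
    z1 = rng.standard_normal()
    z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal()
    vp = max(v[t], 1e-10)                           # full-truncation guard for the CIR factor
    vol = a42 * np.sqrt(vp) + b42 / np.sqrt(vp)     # the 4/2 volatility loading
    s[t + 1] = s[t] * np.exp((mu - 0.5 * vol**2) * dt + vol * np.sqrt(dt) * z1)
    v[t + 1] = v[t] + kappa * (theta - vp) * dt + sigma_v * np.sqrt(vp * dt) * z2
```

The b/sqrt(v) term makes volatility spike when the variance factor falls toward zero, the feature that distinguishes the 4/2 family from a pure Heston model.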

Keywords: stochastic covariance process, 4/2 stochastic volatility model, stochastic co-volatility movements, characteristic function, expected utility theory, verification theorem

Procedia PDF Downloads 116
259 Influence of Processing Parameters on the Reliability of Sieving as a Particle Size Distribution Measurements

Authors: Eseldin Keleb

Abstract:

In the pharmaceutical industry, particle size distribution is an important parameter for the characterization of pharmaceutical powders. Powder flowability, reactivity, and compatibility, which have a decisive impact on the final product, are determined by particle size and size distribution. Therefore, the aim of this study was to evaluate the influence of processing parameters on particle size distribution measurements. Different size fractions of α-lactose monohydrate with 5% polyvinylpyrrolidone were prepared by wet granulation and used for the preparation of samples. The influence of sieve load (50, 100, 150, 200, 250, 300, and 350 g), processing time (5, 10, and 15 min), sample size ratios (high percentages of small and of large particles), type of disturbance (vibration or shaking), and process reproducibility was investigated. The results showed that a sieve load of 50 g produced the best separation; a further increase in sample weight resulted in incomplete separation even after extending the processing time to 15 min. Sieving using vibration was faster and more efficient than shaking, and between-day reproducibility showed that particle size distribution measurements are reproducible. However, for samples containing 70% fines or 70% large particles processed at the optimized parameters, incomplete separation was always observed. These results indicate that sieving reliability is highly influenced by the particle size distribution of the sample, and care must be taken with samples whose particle size distribution is skewed.

Keywords: sieving, reliability, particle size distribution, processing parameters

Procedia PDF Downloads 574
258 Expand Rabies Post-Exposure Prophylaxis to Where It Is Needed the Most

Authors: Henry Wilde, Thiravat Hemachudha

Abstract:

Human rabies deaths are underreported worldwide at 55,000 annual cases, more than those of dengue and Japanese encephalitis. Almost half are children. A recent study from the Philippines of nearly 2,000 rabies deaths revealed that the victims had received incomplete or no post-exposure prophylaxis. Coming from a canine-rabies-endemic country, this is not unique. There are two major barriers to reducing human rabies deaths: 1) the large number of unvaccinated dogs, and 2) post-exposure prophylaxis (PEP) that is not available, incomplete, not affordable, or not within reach of bite victims' means of travel. Only the first barrier, inadequate vaccination of dogs, is now being seriously addressed, and it is often not done effectively or sustainably. Rabies PEP has evolved as a complex, prolonged process, usually delegated to centers in larger cities. It is virtually unavailable in villages or small communities where most dog bites occur and where victims are poor and usually unable to travel a long distance multiple times to receive PEP. Research that has led to a better understanding of the pathophysiology of rabies and of immune responses to potent vaccines and immunoglobulin has allowed PEP to be shortened and made more evidence-based. This knowledge needs to be adopted and applied so that PEP can be rendered safely and affordably where it is needed most: by village health care workers, who have long performed more complex services after appropriate training. Recent research makes this an important and long-neglected goal that is now within our means to implement.

Keywords: rabies, post-exposure prophylaxis, availability, immunoglobulin

Procedia PDF Downloads 232
257 Molecular Alterations Shed Light on Alteration of Methionine Metabolism in Gastric Intestinal Metaplasia; Insight for Treatment Approach

Authors: Nigatu Tadesse, Ying Liu, Juan Li, Hong Ming Liu

Abstract:

Gastric carcinogenesis is a lengthy process of histopathological transition from normal mucosa to atrophic gastritis (AG), gastric intestinal metaplasia (GIM), and dysplasia toward gastric cancer (GC). The GIM stage is identified as a precancerous lesion that resists H. pylori eradication and recurs after endoscopic surgical resection therapies. GIM is divided into two morphologically distinct phenotypes: complete GIM, bearing an intestinal-type morphology, and incomplete GIM, with a colonic-type morphology. Incomplete GIM is considered the greatest risk factor for the development of GC. Studies indicate that expression of the caudal-type homeobox 2 (CDX2) gene is responsible for the development of complete GIM, while its progressive downregulation from incomplete metaplasia toward advanced GC has been identified as the risk factor for GIM progression and neoplastic transformation. Downregulation of CDX2 promotes cell growth and proliferation in gastric and colon cancers and has been implicated in chemotherapy inefficacy. CDX2 is downregulated through promoter-region hypermethylation, and the methylation frequency correlates positively with the dietary history of patients, suggesting a role for dietary methyl-carbon donor sources such as methionine. However, the metabolism of exogenous methionine remains unclear. Targeting exogenous methionine metabolism has become a promising approach to limit tumor cell growth, proliferation, and progression and to increase treatment outcomes.
This review article discusses molecular alterations that could shed light on the potential of exogenous methionine metabolism, such as gut microbiota alteration as a source of methionine for host cells, metabolic pathway signaling via PI3K/AKT/mTORC1-c-MYC to rewire exogenous methionine, and the signature of increased gene methylation index, cell growth, and proliferation in GIM. It offers insights into a new treatment avenue via targeting methionine metabolism and highlights the need for future integrated studies on molecular alterations and metabolomics to uncover altered methionine metabolism and to characterize CDX2 methylation in gastric intestinal metaplasia for potential therapeutic exploitation.

Keywords: altered methionine metabolism, intestinal metaplasia, CDX2 gene, gastric cancer

Procedia PDF Downloads 16
256 Structural Damage Detection via Incomplete Model Data Using Output Data Only

Authors: Ahmed Noor Al-qayyim, Barlas Özden Çağlayan

Abstract:

Structural failure is caused mainly by damage that often occurs in structures, and many researchers focus on obtaining efficient tools to detect such damage at an early stage. In the past decades, a subject that has received considerable attention in the literature is damage detection based on variations in the dynamic characteristics or response of structures. This study presents a new damage identification technique that detects the damage location for an incomplete structural system using output data only. The method locates the damage from free vibration test data using the "Two-Point Condensation (TPC) technique", which creates a set of matrices by reducing the structural system to two-degree-of-freedom systems. The current stiffness matrices are obtained by optimization of the equation of motion using the measured test data and are compared with the original (undamaged) stiffness matrices; high percentage changes in the matrices' coefficients indicate the location of the damage. The TPC technique is applied to experimental data from a simply supported steel beam model structure after inducing a thickness change in one element. Two cases are considered, and the method detects the damage and determines its location accurately in both. In addition, the results illustrate that these changes in the stiffness matrix can be a useful tool for continuous monitoring of structural safety using ambient vibration data, and its efficiency shows that the technique can also be used for large structures.
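The final comparison step, flagging large percentage changes between current and undamaged stiffness coefficients, can be illustrated on a toy spring-chain model. This is not the TPC condensation itself, only the matrix-comparison idea, with made-up stiffness values:

```python
import numpy as np

def chain_stiffness(k):
    """Global stiffness matrix of a fixed-fixed spring chain with element stiffnesses k."""
    n = len(k) - 1                      # number of free internal DOFs
    K = np.zeros((n, n))
    for d in range(n):
        K[d, d] = k[d] + k[d + 1]       # each DOF couples its two adjacent springs
        if d + 1 < n:
            K[d, d + 1] = K[d + 1, d] = -k[d + 1]
    return K

k_intact = np.array([10.0, 10.0, 10.0, 10.0, 10.0])
k_damaged = k_intact.copy()
k_damaged[2] *= 0.7                     # 30% stiffness loss in the third element

K0 = chain_stiffness(k_intact)          # original (undamaged) stiffness matrix
K1 = chain_stiffness(k_damaged)         # "current" stiffness matrix

# Large percentage changes in the stiffness coefficients localize the damage.
pct = np.abs(K1 - K0) / np.maximum(np.abs(K0), 1e-12) * 100.0
i, j = np.unravel_index(np.argmax(pct), pct.shape)
```

The largest percentage change lands on the off-diagonal coupling term of the damaged spring (a 30% change between DOFs 1 and 2), which is exactly the element whose stiffness was reduced.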

Keywords: damage detection, optimization, signals processing, structural health monitoring, two points–condensation

Procedia PDF Downloads 328
255 An Audit of Local Guidance Compliance for Stereotactic Core Biopsy for DCIS in the Breast Screening Programme

Authors: Aisling Eves, Andrew Pieri, Ross McLean, Nerys Forester

Abstract:

Background: The breast unit's local guideline recommends that 12 cores be taken in a stereotactic-guided biopsy to diagnose DCIS. Twelve cores are regarded as providing good diagnostic value without removing more breast tissue than necessary. This study aimed to determine compliance with the guideline and investigated how the number of cores affected the re-excision rate and size discrepancies. Methods: In this single-centre retrospective cohort study, 72 consecutive breast screening patients with <15mm DCIS on the radiological report underwent stereotactic-guided core biopsy and subsequent surgical excision. Clinical, radiological, and histological data were collected over 5 years, and the ASCO guideline threshold for margin involvement of <2mm was used to guide the need for re-excision. Results: Forty-six (63.9%) patients had <12 cores taken, and 26 (36.1%) patients had ≥12 cores taken. Only six (8.3%) patients had exactly 12 cores taken in their stereotactic biopsy. Incomplete surgical excision was seen in 17 patients overall (23.6%); of these, twelve (70.6%) had fewer than 12 cores taken (p=0.55 for the difference between groups). Mammogram and biopsy underestimated the size of the DCIS in this subgroup by a median of 15mm (range: 6-135mm). Re-excision was required in 9 patients (12.5%), and five patients (6.9%) were found to have invasive ductal carcinoma on excision (80% had <12 cores, p=0.43). Discussion: There is poor compliance with the breast unit's local guideline, and re-excision rates are higher in patients who did not have ≥12 cores taken. Taking ≥12 cores resulted in fewer missed invasive cancers and lower incomplete excision and re-excision rates.

Keywords: stereotactic core biopsy, DCIS, breast screening, re-excision rates, core biopsy

Procedia PDF Downloads 91
254 Examining the Skills of Establishing Number and Space Relations of Science Students with the 'Integrative Perception Test'

Authors: Nisa Yenikalayci, Türkan Aybike Akarca

Abstract:

The ability to establish number and space relations, one of the basic scientific process skills, is used in transforming a two-dimensional object into a three-dimensional image or in expressing the symmetry axes of an object. This research aimed to determine the ability of science students to establish number and space relations. The research was carried out with a total of 90 students studying in the first semester of the Science Education program of a state university located in Turkey's Black Sea Region, in the fall semester of the 2017-2018 academic year. An ‘Integrative Perception Test (IPT)’ was designed by the researchers to collect the data. Within the scope of the IPT, the course books and workbooks specific to the field of science were scanned, and visual items belonging to the ‘Physics - Chemistry - Biology’ sub-fields were screened for symmetrical structure, selected, and listed. During the application, students were expected to imagine and draw the missing half of visual items that were presented incomplete. The data obtained from the test, which contains 30 images or pictures in total (f Physics = 10, f Chemistry = 10, f Biology = 10), were analyzed descriptively by rating the drawings created by the students as ‘complete (2 points), incomplete/wrong (1 point), or empty (0 points)’. For teaching new concepts to younger age groups, images or pictures showing symmetrical structures and similar applications can also be used.
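The descriptive scoring rubric described above can be sketched as a small routine. This is a hypothetical illustration: the category names and the sample student are invented; only the 2/1/0 point values come from the abstract.

```python
# Point values taken from the abstract's rubric; category labels are assumed
SCORES = {"complete": 2, "incomplete": 1, "empty": 0}

def ipt_total(responses):
    """Total IPT score for one student's set of rated drawings."""
    return sum(SCORES[r] for r in responses)

# A hypothetical student: 20 complete, 5 incomplete/wrong, 5 empty drawings
responses = ["complete"] * 20 + ["incomplete"] * 5 + ["empty"] * 5
total = ipt_total(responses)   # 45 out of a maximum of 60 for 30 items
```

With 30 items the maximum possible score is 60, which gives a simple scale for comparing students across the Physics, Chemistry, and Biology sub-fields.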

Keywords: integrative perception, number and space relations, science education, scientific process skills

Procedia PDF Downloads 120
253 Heat Transfer and Diffusion Modelling

Authors: R. Whalley

Abstract:

The heat transfer modelling for a diffusion process will be considered. Difficulties in computing the time-distance dynamics of the representation will be addressed. Incomplete and irrational Laplace functions will be identified as the computational issue. Alternative approaches to the response evaluation process will be provided. An illustrative application problem will be presented, with graphical results confirming the theoretical procedures employed.

Keywords: heat, transfer, diffusion, modelling, computation

Procedia PDF Downloads 517
252 Rapid Assessment of the Ability of Forest Vegetation in Kulonprogo to Store Carbon Using Multispectral Satellite Imagery and a Vegetation Index

Authors: Ima Rahmawati, Nur Hafizul Kalam

Abstract:

The rapid development of industrial and economic sectors in various countries has raised greenhouse gas (GHG) emissions. Greenhouse gases, dominated by carbon dioxide (CO2) and methane (CH4) in the atmosphere, continuously increase the surface temperature of the earth. The increase in these gases is caused by the incomplete combustion of fossil fuels such as petroleum and coal, and also by a high rate of deforestation. Yogyakarta Special Province, a year-round tourist destination, has great potential for increased greenhouse gas emissions, mainly from incomplete combustion. One effort to reduce the concentration of these gases in the atmosphere is to keep and empower the existing forests in the Province of Yogyakarta, especially the forest in Kulonprogo, maintaining its greenness so that it can absorb and store carbon maximally. Remote sensing technology can be used to determine the ability of forests to absorb carbon, which is related to vegetation density. The purpose of this study is to determine the biomass density of forest vegetation and the ability of the forest to store carbon through a photo-interpretation and Geographic Information System approach. The remote sensing imagery used in this study is a LANDSAT 8 OLI image recorded in 2015. LANDSAT 8 OLI imagery has 30-meter spatial resolution for the multispectral bands and can give a general overview of the carbon stored by each vegetation density class. The method applies a vegetation index transformation combined with allometric calculations of field data, followed by regression analysis. The results are model maps of the density and carbon storage capability of forest vegetation in Kulonprogo, Yogyakarta.
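The vegetation-index step can be sketched as follows. This is a hedged illustration: the reflectance values and allometric coefficients are invented placeholders (the study fits its allometric model against field data), and NDVI is shown as a representative vegetation index since the abstract does not name the specific index used. LANDSAT 8 OLI band 4 is red and band 5 is near-infrared.

```python
import numpy as np

# Synthetic reflectance patches standing in for LANDSAT 8 OLI band 4 (red)
# and band 5 (NIR); a real workflow would read GeoTIFF bands, e.g. via rasterio
red = np.array([[0.10, 0.08],
                [0.30, 0.05]])
nir = np.array([[0.50, 0.45],
                [0.35, 0.40]])

# NDVI: a common vegetation index, one value per pixel in [-1, 1]
ndvi = (nir - red) / (nir + red)

# Placeholder linear allometric relation mapping the index to carbon per pixel;
# the slope/intercept below are assumptions, not the study's fitted values
a, b = 120.0, -10.0
carbon = a * ndvi + b
```

The regression analysis in the study plays the role of the placeholder line above: field biomass measurements calibrate the index-to-carbon mapping, which is then applied per pixel to produce the carbon model map.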

Keywords: remote sensing, carbon, Kulonprogo, forest vegetation, vegetation index

Procedia PDF Downloads 358
251 A Systematic Review of the Methodological and Reporting Quality of Case Series in Surgery

Authors: Riaz A. Agha, Alexander J. Fowler, Seon-Young Lee, Buket Gundogan, Katharine Whitehurst, Harkiran K. Sagoo, Kyung Jin Lee Jeong, Douglas G. Altman, Dennis P. Orgill

Abstract:

Introduction: Case series are an important and common study type. Currently, no guideline exists for reporting case series, and there is evidence of key data being missed from such reports. We propose to develop a reporting guideline for case series using a methodologically robust technique. The first step in this process is a systematic review of literature relevant to the reporting deficiencies of case series. Methods: A systematic review of methodological and reporting quality in surgical case series was performed. The electronic search strategy was developed by an information specialist and included MEDLINE, EMBASE, the Cochrane Methods Register, Science Citation Index and Conference Proceedings Citation Index, from the start of indexing until 5th November 2014. Independent screening, eligibility assessments and data extraction were performed. Included articles were analyzed for five areas of deficiency: failure to use standardized definitions; missing or selective data; transparency or incomplete reporting; whether alternate study designs were considered; and other issues. Results: The database searching identified 2,205 records. Through the process of screening and eligibility assessments, 92 articles met the inclusion criteria. The frequencies of the methodological and reporting issues identified were: failure to use standardized definitions (57%), missing or selective data (66%), transparency or incomplete reporting (70%), whether alternate study designs were considered (11%), and other issues (52%). Conclusion: The methodological and reporting quality of surgical case series needs improvement. Our data show that clear evidence-based guidelines for the conduct and reporting of case series may be useful to those planning or conducting them.

Keywords: case series, reporting quality, surgery, systematic review

Procedia PDF Downloads 330
250 The Various Legal Dimensions of Genomic Data

Authors: Amy Gooden

Abstract:

When human genomic data is considered, this is often done through only one dimension of the law, or the interplay between the various dimensions is not considered, thus providing an incomplete picture of the legal framework. This research considers and analyzes the various dimensions in South African law applicable to genomic sequence data – including property rights, personality rights, and intellectual property rights. The effective use of personal genomic sequence data requires the acknowledgement and harmonization of the rights applicable to such data.

Keywords: artificial intelligence, data, law, genomics, rights

Procedia PDF Downloads 105
249 Investigation of Performance of Organic Acids on Carbonate Rocks (Experimental Study in Ahwaz Oilfield)

Authors: Azad Jarrahian, Ehsan Heidaryan

Abstract:

Matrix acidizing treatments can yield impressive production increases if properly applied. In this study, carbonate samples taken from the Ahwaz Oilfield underwent static solubility, sludge, emulsion, and core flooding tests. For each test, the interaction of acid and rock is reported, and it is shown how initial permeability and acid type affect the overall treatment efficiency.

Keywords: carbonate acidizing, organic acids, spending rate, acid penetration, incomplete spending

Procedia PDF Downloads 393
248 A Machine Learning Model for Dynamic Prediction of Chronic Kidney Disease Risk Using Laboratory Data, Non-Laboratory Data, and Metabolic Indices

Authors: Amadou Wurry Jallow, Adama N. S. Bah, Karamo Bah, Shih-Ye Wang, Kuo-Chung Chu, Chien-Yeh Hsu

Abstract:

Chronic kidney disease (CKD) is a major public health challenge with high prevalence, rising incidence, and serious adverse consequences. Developing effective risk prediction models is a cost-effective approach to predicting and preventing complications of chronic kidney disease (CKD). This study aimed to develop an accurate machine learning model that can dynamically identify individuals at risk of CKD using various kinds of diagnostic data, with or without laboratory data, at different follow-up points. Creatinine is a key component used to predict CKD. These models will enable affordable and effective screening for CKD even with incomplete patient data, such as the absence of creatinine testing. This retrospective cohort study included data on 19,429 adults provided by a private research institute and screening laboratory in Taiwan, gathered between 2001 and 2015. Univariate Cox proportional hazard regression analyses were performed to determine the variables with high prognostic values for predicting CKD. We then identified interacting variables and grouped them according to diagnostic data categories. Our models used three types of data gathered at three points in time: non-laboratory, laboratory, and metabolic indices data. Next, we used subgroups of variables within each category to train two machine learning models (Random Forest and XGBoost). Our machine learning models can dynamically discriminate individuals at risk for developing CKD. All the models performed well using all three kinds of data, with or without laboratory data. Using only non-laboratory-based data (such as age, sex, body mass index (BMI), and waist circumference), both models predict chronic kidney disease as accurately as models using laboratory and metabolic indices data. Our machine learning models have demonstrated the use of different categories of diagnostic data for CKD prediction, with or without laboratory data. The machine learning models are simple to use and flexible because they work even with incomplete data and can be applied in any clinical setting, including settings where laboratory data is difficult to obtain.
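As a hedged illustration of the modelling approach described above, the sketch below trains a Random Forest on synthetic "non-laboratory" features (age, sex, BMI, waist circumference). Everything here is invented for demonstration: the data generator, the toy risk rule, and the model settings are assumptions, not the study's actual cohort or pipeline (which also used XGBoost and Cox regression on real follow-up data).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for non-laboratory features (all values invented)
rng = np.random.default_rng(0)
n = 1000
age = rng.uniform(20, 80, n)
sex = rng.integers(0, 2, n)
bmi = rng.normal(26, 4, n)
waist = rng.normal(90, 12, n)
X = np.column_stack([age, sex, bmi, waist])

# Toy label rule (not clinical truth): risk rises with age and BMI, plus noise
risk = 0.05 * age + 0.15 * bmi + rng.normal(0.0, 0.5, n)
y = (risk > np.median(risk)).astype(int)

# Hold out a test split and fit a Random Forest, one of the study's two models
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
accuracy = model.score(X_te, y_te)  # accuracy on the held-out rows
```

On this synthetic cohort the model recovers the toy rule well above chance, illustrating how non-laboratory features alone can carry predictive signal, consistent with the study's finding.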

Keywords: chronic kidney disease, glomerular filtration rate, creatinine, novel metabolic indices, machine learning, risk prediction

Procedia PDF Downloads 65
247 Recognition of Tifinagh Characters with Missing Parts Using Neural Network

Authors: El Mahdi Barrah, Said Safi, Abdessamad Malaoui

Abstract:

In this paper, we present an algorithm for the reconstruction of Tifinagh characters from incomplete 2D scans. The algorithm is based on the correlation between the lost block and its neighbors. The proposed system contains three main parts: pre-processing, feature extraction, and recognition. In the first step, we construct a database of Tifinagh characters. In the second step, we apply a “shape analysis algorithm”. In the classification part, we use a neural network. The simulation results demonstrate that the proposed method gives good results.
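A much-simplified sketch of the reconstruction idea, not the authors' exact correlation algorithm: each missing pixel in the lost block is filled from its already-known 8-neighbourhood, iterating inward until the hole is closed. The image, mask, and fill rule below are invented for illustration; the paper's neighbor-correlation scheme is more elaborate.

```python
import numpy as np

def fill_block(img, mask):
    """img: 2-D float array; mask: True where pixels were lost."""
    img = img.copy()
    mask = mask.copy()
    while mask.any():
        for i, j in np.argwhere(mask):
            # collect already-known neighbours of pixel (i, j)
            vals = [img[a, b]
                    for a in range(max(i - 1, 0), min(i + 2, img.shape[0]))
                    for b in range(max(j - 1, 0), min(j + 2, img.shape[1]))
                    if not mask[a, b]]
            if vals:                       # fill once a neighbour is known
                img[i, j] = np.mean(vals)
                mask[i, j] = False
    return img

# Toy example: a 4x4 character patch with a 2x2 lost block in the centre
img = np.ones((4, 4))
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
img[mask] = 0.0                            # simulate the lost data
restored = fill_block(img, mask)
```

The outer while-loop lets the fill propagate inward, so even pixels whose whole neighbourhood was initially lost get values once the ring around them has been reconstructed.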

Keywords: Tifinagh character recognition, neural networks, local cost computation, ANN

Procedia PDF Downloads 300
246 Comparing the Apparent Error Rate of Gender Specifying from Human Skeletal Remains by Using Classification and Cluster Methods

Authors: Jularat Chumnaul

Abstract:

In forensic science, corpses from homicides vary; some are complete and some incomplete, depending on the cause of death or the form of homicide. For example, some corpses are cut into pieces, some are camouflaged by dumping into a river, some are buried, and some are burned to destroy evidence. If a corpse is incomplete, personal identification becomes difficult because some tissues and bones have been destroyed. To determine the gender of a corpse from skeletal remains, the most precise method is DNA identification. However, this method is costly and time-consuming, so other identification techniques are used instead. The first widely used technique is considering the features of the bones. In general, evidence from the corpse, such as pieces of bone, especially the skull and pelvis, can be used to identify gender. To use this technique, forensic scientists require observation skills in order to distinguish male from female bones. Although this technique is uncomplicated, saves time and cost, and allows forensic scientists to determine gender fairly accurately (apparently an accuracy rate of 90% or more), its crucial disadvantage is that only some positions of the skeleton can be used to specify gender, such as the supraorbital ridge, nuchal crest, temporal lobe, mandible, and chin. Therefore, the skeletal remains used have to be complete. The other technique widely used for gender determination in forensic science and archaeology is skeletal measurement. The advantages of this method are that it can be applied at several positions on one piece of bone and that it can be used even if the bones are not complete. In this study, classification and cluster analysis are applied to this technique, including the Kth Nearest Neighbor Classification, Classification Tree, Ward Linkage Cluster, K-mean Cluster, and Two Step Cluster methods. The data contain 507 individuals and 9 skeletal measurements (diameter measurements), and the performance of the five methods is investigated by considering the apparent error rate (APER). The results of this study indicate that the Two Step Cluster and Kth Nearest Neighbor methods seem suitable for specifying gender from human skeletal remains because both yield small apparent error rates of 0.20% and 4.14%, respectively. On the other hand, the Classification Tree, Ward Linkage Cluster, and K-mean Cluster methods are not appropriate, since they yield large apparent error rates of 10.65%, 10.65%, and 16.37%, respectively. However, there are other ways to evaluate the performance of a classification, such as estimating the error rate using the holdout procedure or misclassification costs, and different methods can lead to different conclusions.
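The apparent error rate used above is simply the fraction of observations misclassified when the fitted rule is applied back to its own training data. A hedged sketch with a Kth Nearest Neighbor classifier follows; the two synthetic groups below are invented stand-ins for the study's 507 individuals and 9 diameter measurements, not its actual dataset.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Two synthetic groups ("male"/"female") with shifted means across
# 9 invented diameter measurements (values are illustrative only)
rng = np.random.default_rng(1)
n = 250
X = np.vstack([rng.normal(0.0, 1.0, (n, 9)),
               rng.normal(1.5, 1.0, (n, 9))])
y = np.array([0] * n + [1] * n)

# Fit KNN, then reapply it to the training data itself
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
aper = np.mean(knn.predict(X) != y)       # apparent error rate
```

Because the classifier is evaluated on the same data it was trained on, the APER is optimistic; the holdout procedure mentioned above estimates the error on unseen data instead and generally gives a higher, more honest figure.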

Keywords: skeletal measurements, classification, cluster, apparent error rate

Procedia PDF Downloads 221