Search results for: diagnostic binary ratio
6208 Performance Comparison of Non-Binary RA and QC-LDPC Codes
Abstract:
Repeat–Accumulate (RA) codes are a subclass of LDPC codes with fast encoder structures. In this paper, we consider a non-binary extension of binary LDPC codes over GF(q) and construct a non-binary RA code and a non-binary QC-LDPC code over GF(2^4): the non-binary RA codes are constructed with a linear encoding method, and the non-binary QC-LDPC codes with an algebraic construction method. The BER performance of the RA and QC-LDPC codes over GF(q) is then compared under BP decoding by simulation over Additive White Gaussian Noise (AWGN) channels.
Keywords: non-binary RA codes, QC-LDPC codes, performance comparison, BP algorithm
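The fast RA encoder structure mentioned above (repeat each symbol, interleave, then accumulate) can be sketched for the plain binary case; the repetition factor `q=3` and the fixed pseudo-random interleaver below are illustrative assumptions, not the paper's GF(2^4) construction.

```python
import random

def ra_encode(bits, q=3, seed=0):
    """Repeat-Accumulate encoding over GF(2): repeat each bit q times,
    permute the repeated stream with a fixed interleaver, then pass it
    through a 1/(1+D) accumulator (running XOR)."""
    repeated = [b for b in bits for _ in range(q)]   # repetition code
    rng = random.Random(seed)
    perm = list(range(len(repeated)))
    rng.shuffle(perm)                                # fixed interleaver
    interleaved = [repeated[i] for i in perm]
    out, acc = [], 0
    for b in interleaved:                            # accumulator
        acc ^= b
        out.append(acc)
    return out
```

Because every stage (repetition, permutation, accumulation) is linear over GF(2), the XOR of two codewords is again the codeword of the XOR of the messages.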
Procedia PDF Downloads 379
6207 Teaching the Binary System via Beautiful Facts from the Real Life
Authors: Salem Ben Said
Abstract:
In recent times the decimal number system to which we are accustomed has received serious competition from the binary number system. In this note, an approach is suggested to teaching and learning the binary number system using examples from the real world. More precisely, we will demonstrate the utility of the binary system in describing the optimal strategy to win the Chinese Nim game, and in telegraphy by decoding the hidden message on Perseverance’s Mars parachute written in the language of the binary system. Finally, we will answer the question, “why do modern computers prefer the ternary number system instead of the binary system?”. All materials are provided in a format that is conducive to classroom presentation and discussion.
Keywords: binary number system, Nim game, telegraphy, computers prefer the ternary system
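The optimal Nim strategy alluded to above rests on the binary XOR ("nim-sum") of the pile sizes: a position is lost for the player to move exactly when the nim-sum is zero. A minimal sketch:

```python
def nim_sum(piles):
    """XOR of all pile sizes; the position is losing for the mover iff 0."""
    s = 0
    for p in piles:
        s ^= p
    return s

def winning_move(piles):
    """Return (pile_index, new_size) that makes the nim-sum zero, or None
    if no winning move exists (the position is already lost)."""
    s = nim_sum(piles)
    if s == 0:
        return None
    for i, p in enumerate(piles):
        target = p ^ s          # reducing pile i to this size zeroes the nim-sum
        if target < p:
            return (i, target)
```

For piles (3, 4, 5) the nim-sum is 2, so reducing the first pile from 3 to 1 leaves the opponent a losing position.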
Procedia PDF Downloads 191
6206 Trends and Source Identification of Polycyclic Aromatic Hydrocarbons (PAHs) in Five-Size Particulate Matter in the Nakhon Ratchasima Province, Thailand
Authors: Woranuch Deelaman, Chomsri Choochuay, Siwatt Pongpiachan, Danai Tipmanee
Abstract:
In this work, we perform an analysis to identify the sources of PAHs in particulate matter (PM) of five sizes, 0.1, 0.5, 1, 2.5, and 10 microns, from Nakhon Ratchasima province, Thailand. Nakhon Ratchasima is a province in the northeastern part of Thailand. It has an area of 20,493 square kilometers and a forest area of 2,297,735 rai, making it the second largest province in the country. The major economic structure of Nakhon Ratchasima province comprises the industrial sector, the agricultural sector, and wholesale and retail trade, which accounted for 22.46 percent, 19.82 percent, and 14.91 percent, respectively. In this study, we collected particulates using the Nano-sampler II sampling tool for one month. PM samples (n = 20) were collected in Tambon Suranari (14°52'05.6"N, 102°00'31.8"E), a sub-district located in the Mueang district of Nakhon Ratchasima province. It is an important area comprising community sites, educational institutions, universities, hospitals, religious places, and industrial areas. The samples were collected from November 1, 2024 to November 30, 2024. The PM samples were then wrapped in aluminium foil and stored at −4 °C until analysis. The PAHs were chemically extracted for eight hours using a Soxhlet extractor and internal standards (deuterated-fluorene (d10-Fl): phenanthrene, anthracene, fluoranthene, pyrene, 11H-benzo[a]fluorene, 11H-benzo[b]fluorene, chrysene; deuterated-perylene (d12-Per): benzo[b]fluoranthene, benzo[k]fluoranthene, benzo[a]pyrene, benzo[e]pyrene, indeno[1,2,3-cd]pyrene, dibenz[a,h]anthracene, and benzo[g,h,i]perylene), using DCM as a solvent. The 15 PAHs were then analyzed using a gas chromatograph-mass spectrometer (Shimadzu GCMS-QP2010 Ultra) in selective ion monitoring mode. The sources of PAHs in Nakhon Ratchasima particulate matter were determined using a combination of multivariate descriptive statistics and diagnostic binary ratios of PAHs.
The sources of PAHs in particulate matter were identified using five diagnostic binary ratios of PAH isomer pairs: An/(An + Phe), Fluo/(Fluo + Pyr), B[a]A/(B[a]A + Chry), Ind/(Ind + B[g,h,i]P), and B[a]P/B[g,h,i]P. According to the diagnostic ratios, the majority of the PAHs found in the particulate matter samples came from pyrogenic sources, which include incomplete burning of biomass and petroleum. Additionally, multivariate descriptive statistics (principal component analysis (PCA)) were used to identify the sources of the 15 PAHs in the Nakhon Ratchasima particulate matter samples. The PCA results show that incomplete combustion from the use of fuel is also a major source of polycyclic aromatic hydrocarbons in Nakhon Ratchasima province. We anticipate that this study will support the environmental planning and management of Thailand's Nakhon Ratchasima province and of many other nations.
Keywords: polycyclic aromatic hydrocarbons (PAHs), particulate matter (PM), diagnostic binary ratio, source apportionment
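The isomer-pair ratios named above are simple concentration quotients; a minimal sketch is below. The 0.1 cut-off for An/(An + Phe) (above which a pyrogenic, i.e. combustion, origin is usually inferred) is a commonly cited literature threshold used here as an assumption, not a value taken from the abstract.

```python
def diagnostic_ratios(conc):
    """Compute the five PAH isomer-pair ratios used in the study from a
    dict of concentrations (same units for all species, e.g. ng/m3)."""
    return {
        "An/(An+Phe)": conc["An"] / (conc["An"] + conc["Phe"]),
        "Fluo/(Fluo+Pyr)": conc["Fluo"] / (conc["Fluo"] + conc["Pyr"]),
        "BaA/(BaA+Chry)": conc["BaA"] / (conc["BaA"] + conc["Chry"]),
        "Ind/(Ind+BghiP)": conc["Ind"] / (conc["Ind"] + conc["BghiP"]),
        "BaP/BghiP": conc["BaP"] / conc["BghiP"],
    }

def classify_an_phe(ratio):
    """Common literature cut-off (assumed here): An/(An+Phe) > 0.1
    suggests a pyrogenic (combustion) origin, <= 0.1 petrogenic."""
    return "pyrogenic" if ratio > 0.1 else "petrogenic"
```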
Procedia PDF Downloads 1
6205 Hit-or-Miss Transform as a Tool for Similar Shape Detection
Authors: Osama Mohamed Elrajubi, Idris El-Feghi, Mohamed Abu Baker Saghayer
Abstract:
This paper describes the identification of specific shapes within binary images using the morphological Hit-or-Miss Transform (HMT). The Hit-or-Miss transform is a general binary morphological operation that can be used to search for particular patterns of foreground and background pixels in an image. It is in fact a basic operation of binary morphology, since almost all other binary morphological operators are derived from it. The input of this method is a binary image and a structuring element (a template to be searched for in the binary image), while the output is another binary image. In this paper a modification of the Hit-or-Miss transform is proposed, in which the accuracy of the algorithm is adjusted according to the similarity between the template and the sought shape. The method has been implemented in the C language. The algorithm has been tested on several images, and the results have shown that this new method can be used for similar shape detection.
Keywords: hit-or-miss operator transform, HMT, binary morphological operation, shape detection, binary images processing
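The classical (unmodified) HMT described above can be sketched directly: an output pixel "hits" where the template's foreground (1) and background (0) entries both match the image, with `None` marking don't-care positions. This is a minimal Python illustration, not the paper's C implementation or its similarity-tolerant variant.

```python
def hit_or_miss(image, template):
    """Binary Hit-or-Miss Transform over a 2-D list-of-lists image.
    template entries: 1 = must be foreground, 0 = must be background,
    None = don't care. The hit is recorded at the template's top-left."""
    H, W = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    out = [[0] * W for _ in range(H)]
    for i in range(H - th + 1):
        for j in range(W - tw + 1):
            ok = True
            for di in range(th):
                for dj in range(tw):
                    t = template[di][dj]
                    if t is not None and image[i + di][j + dj] != t:
                        ok = False
            if ok:
                out[i][j] = 1
    return out
```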
Procedia PDF Downloads 338
6204 Spectrophotometric Methods for Simultaneous Determination of Binary Mixture of Amlodipine Besylate and Atenolol Based on Dual Wavelength
Authors: Nesrine T. Lamie
Abstract:
Four accurate, precise, and sensitive spectrophotometric methods are developed for the simultaneous determination of a binary mixture containing amlodipine besylate (AM) and atenolol (AT), where AM is determined at its λmax of 360 nm (0D), while atenolol can be determined by different methods. Method (A) is the absorption factor method (AFM). Method (B) is the new ratio difference method (RD), which measures the difference in amplitudes between 210 and 226 nm of the ratio spectrum. Method (C) is a novel constant center spectrophotometric method (CC). Method (D) is mean centering of the ratio spectra (MCR) at 284 nm. The calibration curves are linear over the concentration ranges of 10–80 and 4–40 μg/ml for AM and AT, respectively. These methods are tested by analyzing synthetic mixtures of the cited drugs, and they are applied to their commercial pharmaceutical preparation. The validity of the results was assessed by applying the standard addition technique. The results obtained were found to agree statistically with those obtained by a reported method, showing no significant difference with respect to accuracy and precision.
Keywords: amlodipine, atenolol, absorption factor, constant center, mean centering, ratio difference
Procedia PDF Downloads 308
6203 On the Construction of Some Optimal Binary Linear Codes
Authors: Skezeer John B. Paz, Ederlina G. Nocon
Abstract:
Finding an optimal binary linear code is a central problem in coding theory. A binary linear code C = [n, k, d] is called optimal if there is no linear code with a higher minimum distance d given the length n and the dimension k. There are bounds giving limits for the minimum distance d of a linear code of fixed length n and dimension k. The lower bound, which can be obtained by a construction process, tells us that a linear code having this minimum distance is known. The upper bound is given by theoretical results such as the Griesmer bound. One way to find an optimal binary linear code is to make the lower bound of d equal to its upper bound; that is, to construct a binary linear code which achieves the highest possible value of its minimum distance d, given n and k. Some optimal binary linear codes were presented by Andries Brouwer in his published table on bounds of the minimum distance d of binary linear codes for 1 ≤ n ≤ 256 and k ≤ n. This was further improved by Markus Grassl by giving a detailed construction process for each code exhibiting the lower bound. In this paper, we construct new optimal binary linear codes by applying some construction processes to existing binary linear codes. In particular, we developed an algorithm applied to the codes already constructed to extend the list of optimal binary linear codes to 257 ≤ n ≤ 300 for k ≤ 7.
Keywords: bounds of linear codes, Griesmer bound, construction of linear codes, optimal binary linear codes
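The Griesmer bound mentioned above states that a binary linear [n, k, d] code must satisfy n ≥ Σ_{i=0}^{k−1} ⌈d/2^i⌉, so it is straightforward to compute the smallest length a given (k, d) pair could admit:

```python
from math import ceil

def griesmer_bound(k, d):
    """Smallest length n allowed by the Griesmer bound for a binary
    linear [n, k, d] code: n >= sum_{i=0}^{k-1} ceil(d / 2**i)."""
    return sum(ceil(d / 2**i) for i in range(k))
```

For example, the bound gives n ≥ 15 for k = 4, d = 8, which is met with equality by the [15, 4, 8] simplex code.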
Procedia PDF Downloads 760
6202 Fault Diagnosis of Manufacturing Systems Using AntTreeStoch with Parameter Optimization by ACO
Authors: Ouahab Kadri, Leila Hayet Mouss
Abstract:
In this paper, we present three diagnostic modules for complex and dynamic systems. These modules are based on three ant colony algorithms: AntTreeStoch, Lumer & Faieta, and Binary ant colony. We chose these algorithms for their simplicity and their wide application range. However, we cannot use these algorithms in their basic forms, as they have several limitations. To use these algorithms in a diagnostic system, we have therefore proposed three variants. We have tested these algorithms on datasets obtained from two industrial systems, a clinkering system and a pasteurization system.
Keywords: ant colony algorithms, complex and dynamic systems, diagnosis, classification, optimization
Procedia PDF Downloads 304
6201 Soret-Driven Convection in a Binary Fluid with Coriolis Force
Authors: N. H. Z. Abidin, N. F. M. Mokhtar, S. S. A. Gani
Abstract:
The influence of thermal diffusion, known as the Soret effect, in a heated binary fluid model with Coriolis force is investigated theoretically. Linear stability analysis is used, and the eigenvalue is obtained using the Galerkin method. The impact of the Soret effect and the Coriolis force on the onset of stationary convection in the system is analysed with respect to various binary fluid parameters and presented graphically. It is found that an increase in the Soret values destabilizes the binary fluid layer system. However, elevating the values of the Coriolis force helps to delay the onset of convection in the system.
Keywords: Benard convection, binary fluid, Coriolis, Soret
Procedia PDF Downloads 388
6200 Evaluation of the Weight-Based and Fat-Based Indices in Relation to Basal Metabolic Rate-to-Weight Ratio
Authors: Orkide Donma, Mustafa M. Donma
Abstract:
Basal metabolic rate is questioned as a risk factor for weight gain. The relations between basal metabolic rate and body composition have not been clarified yet. The impact of fat mass on basal metabolic rate is also uncertain. Within this context, indices based upon total body mass as well as total body fat mass are available. In this study, the aim is to investigate the potential clinical utility of these indices in the adult population. 287 individuals, aged from 18 to 79 years, were included in the scope of the study. Based upon body mass index values, 10 underweight, 88 normal, 88 overweight, 81 obese, and 20 morbidly obese individuals participated. Anthropometric measurements including height (m) and weight (kg) were performed. Body mass index, diagnostic obesity notation model assessment index I, diagnostic obesity notation model assessment index II, and the basal metabolic rate-to-weight ratio were calculated. Total body fat mass (kg), fat percent (%), basal metabolic rate, metabolic age, visceral adiposity, fat mass of the upper and lower extremities and trunk, and obesity degree were measured by a TANITA body composition monitor using bioelectrical impedance analysis technology. Statistical evaluations were performed with the SPSS statistical package for Windows, Version 16.0. Scatterplots of individual measurements for the parameters concerning correlations were drawn. Linear regression lines were displayed. The statistical significance degree was accepted as p < 0.05. Strong correlations between body mass index and diagnostic obesity notation model assessment index I as well as diagnostic obesity notation model assessment index II were obtained (p < 0.001). A much stronger correlation was detected between basal metabolic rate and diagnostic obesity notation model assessment index I in comparison with that calculated for basal metabolic rate and body mass index (p < 0.001).
Upon consideration of the associations between the basal metabolic rate-to-weight ratio and these three indices, the best association was observed between the basal metabolic rate-to-weight ratio and diagnostic obesity notation model assessment index II. In a similar manner, this index was highly correlated with fat percent (p < 0.001). Independently of the indices, a strong correlation was found between fat percent and the basal metabolic rate-to-weight ratio (p < 0.001). Visceral adiposity was much more strongly correlated with metabolic age than with chronological age (p < 0.001). In conclusion, all three indices were associated with metabolic age, but not with chronological age. Diagnostic obesity notation model assessment index II values were highly correlated with body mass index values throughout all ranges, starting with underweight and going towards morbid obesity. This index is the best in terms of its association with the basal metabolic rate-to-weight ratio, which can be interpreted as a basal metabolic rate unit.
Keywords: basal metabolic rate, body mass index, children, diagnostic obesity notation model assessment index, obesity
Procedia PDF Downloads 153
6199 Mining Diagnostic Investigation Process
Authors: Sohail Imran, Tariq Mahmood
Abstract:
In complex healthcare diagnostic investigation processes, medical practitioners have to focus on ways to standardize their processes in order to deliver high-quality care and optimize time and costs. Process mining techniques can be applied to extract process-related knowledge from data without considering the causal and dynamic dependencies in the business domain and processes. The application of process mining is effective in diagnostic investigation. It is particularly helpful where a treatment gives no dispositive evidence favoring it. In this paper, we applied process mining to discover important process flows of diagnostic investigation for hepatitis patients. This approach has benefits which can enhance the quality and efficiency of diagnostic investigation processes.
Keywords: process mining, healthcare, diagnostic investigation process, process flow
Procedia PDF Downloads 528
6198 Reconstruction of Binary Matrices Satisfying Neighborhood Constraints by Simulated Annealing
Authors: Divyesh Patel, Tanuja Srivastava
Abstract:
This paper considers the NP-hard problem of reconstructing binary matrices satisfying the exactly-1-4-adjacency constraint from their row and column projections. The problem is formulated as a maximization problem, in which the objective function gives a measure of the adjacency constraint for the binary matrices. The maximization problem is solved by the simulated annealing algorithm, and experimental results are presented.
Keywords: discrete tomography, exactly-1-4-adjacency, simulated annealing, binary matrices
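A simulated-annealing reconstruction from row and column projections can be sketched as follows. This minimal version optimizes only the projection error and omits the paper's exactly-1-4-adjacency objective; the move set (relocating one 1), the linear cooling schedule, and all parameters are illustrative assumptions.

```python
import math
import random

def projection_error(M, rows, cols):
    """Sum of absolute deviations of M's row/column sums from the targets."""
    err = sum(abs(sum(r) - t) for r, t in zip(M, rows))
    for j, t in enumerate(cols):
        err += abs(sum(M[i][j] for i in range(len(M))) - t)
    return err

def reconstruct(rows, cols, steps=20000, t0=1.0, seed=1):
    """Anneal a random 0/1 matrix with the right total number of ones by
    moving one 1 to an empty cell per step, Metropolis-accepted."""
    rng = random.Random(seed)
    m, n = len(rows), len(cols)
    cells = [(i, j) for i in range(m) for j in range(n)]
    M = [[0] * n for _ in range(m)]
    for i, j in rng.sample(cells, sum(rows)):        # random start
        M[i][j] = 1
    err = projection_error(M, rows, cols)
    for step in range(steps):
        if err == 0:
            break
        T = t0 * (1 - step / steps) + 1e-9           # linear cooling
        i1, j1 = rng.choice([c for c in cells if M[c[0]][c[1]] == 1])
        i0, j0 = rng.choice([c for c in cells if M[c[0]][c[1]] == 0])
        M[i1][j1], M[i0][j0] = 0, 1                  # propose the move
        new = projection_error(M, rows, cols)
        if new <= err or rng.random() < math.exp((err - new) / T):
            err = new                                # accept
        else:
            M[i1][j1], M[i0][j0] = 1, 0              # undo
    return M, err
```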
Procedia PDF Downloads 410
6197 Speech Enhancement Using Wavelet Coefficients Masking with Local Binary Patterns
Authors: Christian Arcos, Marley Vellasco, Abraham Alcaim
Abstract:
In this paper, we present a wavelet coefficient masking approach based on Local Binary Patterns (WLBP) to enhance the temporal spectra of the wavelet coefficients for speech enhancement. This technique exploits the wavelet denoising scheme, which splits the degraded speech into pyramidal subband components and extracts frequency information without losing temporal information. Speech enhancement in each high-frequency subband is performed by binary labels through the local binary pattern masking, which encodes the ratio between the original value of each coefficient and the values of the neighbouring coefficients. This approach enhances the high-frequency spectra of the wavelet transform instead of eliminating them through a threshold. A comparative analysis is carried out with conventional speech enhancement algorithms, demonstrating that the proposed technique achieves significant improvements in terms of PESQ, an international recommendation for the objective measurement of subjective speech quality. Informal listening tests also show that the proposed method improves the quality of speech in an acoustic context, avoiding the annoying musical noise present in other speech enhancement techniques. Experimental results obtained with a DNN-based speech recognizer in noisy environments corroborate the superiority of the proposed scheme in the robust speech recognition scenario.
Keywords: binary labels, local binary patterns, mask, wavelet coefficients, speech enhancement, speech recognition
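The binary-label idea can be illustrated with a plain one-dimensional local binary pattern over a coefficient sequence: each interior coefficient receives one bit per neighbour, set when the neighbour is at least as large as the centre. This is a generic LBP sketch, not the paper's WLBP masking rule.

```python
def lbp_1d(x, radius=1):
    """1-D Local Binary Pattern: for each interior sample, build a code
    with one bit per neighbour (left to right), 1 iff neighbour >= centre."""
    codes = []
    for i in range(radius, len(x) - radius):
        code = 0
        for j in range(i - radius, i + radius + 1):
            if j == i:
                continue                      # skip the centre sample
            code = (code << 1) | (1 if x[j] >= x[i] else 0)
        codes.append(code)
    return codes
```

With radius 1 each code is two bits: a local minimum yields 3, a local maximum yields 0, and a rising slope yields 1.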
Procedia PDF Downloads 232
6196 Bias-Corrected Estimation Methods for Receiver Operating Characteristic Surface
Authors: Khanh To Duc, Monica Chiogna, Gianfranco Adimari
Abstract:
With three diagnostic categories, assessment of the performance of diagnostic tests is achieved by the analysis of the receiver operating characteristic (ROC) surface, which generalizes the ROC curve for binary diagnostic outcomes. The volume under the ROC surface (VUS) is a summary index usually employed for measuring the overall diagnostic accuracy. When the true disease status can be exactly assessed by means of a gold standard (GS) test, unbiased nonparametric estimators of the ROC surface and VUS are easily obtained. In practice, unfortunately, disease status verification via the GS test could be unavailable for all study subjects, due to the expensiveness or invasiveness of the GS test. Thus, often only a subset of patients undergoes disease verification. Statistical evaluations of diagnostic accuracy based only on data from subjects with verified disease status are typically biased. This bias is known as verification bias. Here, we consider the problem of correcting for verification bias when continuous diagnostic tests for three-class disease status are considered. We assume that selection for disease verification does not depend on disease status, given test results and other observed covariates, i.e., we assume that the true disease status, when missing, is missing at random. Under this assumption, we discuss several solutions for ROC surface analysis based on imputation and re-weighting methods. In particular, verification bias-corrected estimators of the ROC surface and of VUS are proposed, namely, full imputation, mean score imputation, inverse probability weighting and semiparametric efficient estimators. Consistency and asymptotic normality of the proposed estimators are established, and their finite sample behavior is investigated by means of Monte Carlo simulation studies. Two illustrations using real datasets are also given.
Keywords: imputation, missing at random, inverse probability weighting, ROC surface analysis
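In the fully verified case, the unbiased nonparametric VUS estimator mentioned above is simply the fraction of triples (one test value per disease class) that are correctly ordered; chance level for three classes is 1/6. A minimal sketch:

```python
from itertools import product

def vus(x1, x2, x3):
    """Nonparametric estimate of the volume under the ROC surface:
    the fraction of triples (a from class 1, b from class 2, c from
    class 3) with a < b < c. (Ties are counted as failures here.)"""
    ordered = sum(1 for a, b, c in product(x1, x2, x3) if a < b < c)
    return ordered / (len(x1) * len(x2) * len(x3))
```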
Procedia PDF Downloads 421
6195 Using Diagnostic Assessment as a Learning and Teaching Approach to Identify Learning Gaps at a Polytechnic
Authors: Vijayan Narayananayar
Abstract:
Identifying learning gaps is crucial in ensuring learners have the necessary knowledge and skills to succeed. The Learning and Teaching (L&T) approach requires tutors to identify gaps in knowledge and to improvise learning activities to close them. One approach to identifying learning gaps is through diagnostic assessment, which uses well-structured questions and answer options. This paper focuses on the use of diagnostic assessment as a learning and teaching approach in a foundational module at a polytechnic. The study used diagnostic assessment over two semesters, including the COVID and post-COVID semesters, to identify gaps in learning. The design of the diagnostic activity, the pedagogical intervention, and survey responses completed by learners were analyzed. Results showed that diagnostic assessment can be an effective tool for identifying learning gaps and designing interventions to address them. Additionally, the use of diagnostic assessment provides an opportunity for tutors to engage with learners on a one-to-one basis, tailoring teaching to individual needs. The paper also discusses the design of diagnostic questions and answer options, including the characteristics that need to be considered to achieve the goal of identifying learning gaps. The implications of using diagnostic assessment as a learning and teaching approach include bridging the gap between theory and practice and ensuring learners are equipped with the skills necessary for their future careers. This paper can be useful in helping educators and practitioners to incorporate diagnostic assessment into their L&T approach.
Keywords: assessment, learning & teaching, diagnostic assessment, analytics
Procedia PDF Downloads 118
6194 Theoretical and Experimental Investigations of Binary Systems for Hydrogen Storage
Authors: Gauthier Lefevre, Holger Kohlmann, Sebastien Saitzek, Rachel Desfeux, Adlane Sayede
Abstract:
Hydrogen is a promising energy carrier, compatible with the sustainable energy concept. In this context, solid-state hydrogen storage is the key challenge in developing a hydrogen economy. The capability of absorbing large quantities of hydrogen makes intermetallic systems of particular interest. In this study, efforts have been devoted to the theoretical investigation of binary systems under constraints. On the one hand, besides considering hydrogen storage, a reinvestigation of the crystal structures of the palladium-arsenic system shows, with experimental validation, that binary systems can still present new or unknown relevant structures. On the other hand, various binary Mg-based systems were theoretically scrutinized in order to find new alloys of interest for hydrogen storage. Taking the effect of pressure into account reveals a wide range of alternative structures, radically changing the stable compounds of the studied binary systems. Similar constraints, induced by Pulsed Laser Deposition, have been applied to binary systems, and results are presented.
Keywords: binary systems, evolutionary algorithm, first principles study, pulsed laser deposition
Procedia PDF Downloads 275
6193 One vs. Rest and Error Correcting Output Codes Principled Rebalancing Schemes for Solving Imbalanced Multiclass Problems
Authors: Alvaro Callejas-Ramos, Lorena Alvarez-Perez, Alexander Benitez-Buenache, Anibal R. Figueiras-Vidal
Abstract:
This contribution presents a promising formulation which allows one to extend the principled binary rebalancing procedures, also known as neutral re-balancing mechanisms in the sense that they do not alter the likelihood ratio.
Keywords: Bregman divergences, imbalanced multiclass classification, informed re-balancing, invariant likelihood ratio
Procedia PDF Downloads 221
6192 Comparison of the Response of TLD-100 and TLD-100H Dosimeters in Diagnostic Radiology
Authors: S. Sina, B. Zeinali, M. Karimipourfard, F. Lotfalizadeh, M. Sadeghi, E. Zamani, M. Zehtabian, R. Faghihi
Abstract:
Proper dosimetry is very essential in diagnostic radiology. The goal of this study is to verify the applicability of LiF:Mg,Cu,P (TLD-100H) in obtaining the entrance skin dose (ESD) of patients undergoing diagnostic radiology. The results of dosimetry performed with TLD-100H were compared with those obtained with TLD-100, which is a common dosimeter in diagnostic radiology. The results show a close agreement between the doses measured by the two dosimeters. According to the results of this study, the TLD-100H dosimeters have higher sensitivities (i.e., signal (nC)/dose) than TLD-100. Therefore, it is suggested that TLD-100H dosimeters are effective for dosimetry in low-dose fields.
Keywords: entrance skin dose, TLD, diagnostic radiology, dosimeter
Procedia PDF Downloads 478
6191 Aggregation of Fractal Aggregates Inside Fractal Cages in Irreversible Diffusion Limited Cluster Aggregation Binary Systems
Authors: Zakiya Shireen, Sujin B. Babu
Abstract:
Irreversible diffusion-limited cluster aggregation (DLCA) of binary sticky spheres was simulated by modifying Brownian Cluster Dynamics (BCD). We randomly distribute N spheres in a 3D box of size L; the volume fraction is given by Φtot = (π/6)N/L³. We identify NA and NB spheres as species A and B in our system, both having identical size. In these systems, both A and B particles undergo Brownian motion. Irreversible bond formation happens only between intra-species particles, and inter-species particles interact only through hard-core repulsions. As we perform the simulation using BCD, we start to observe binary gels. In our study, we have observed that species B always percolates (cluster size equal to L), as expected for the monomeric case, whereas species A does not percolate below a critical ratio, which is different for different volume fractions. We will also show that the accessible volume of the system increases when compared to the monomeric case, which means that species A is aggregating inside the cage created by B. We have also observed that for moderate Φtot the system undergoes a transition from the flocculation regime to the percolation regime, indicated by a change in fractal dimension from 1.8 to 2.5. For a smaller ratio of A, species A stays in the flocculation regime even though B has already crossed over to the percolation regime. Thus, we observe two fractal dimensions in the same system.
Keywords: BCD, fractals, percolation, sticky spheres
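The box set-up above fixes the sphere count from the target volume fraction via Φtot = (π/6)N/L³ (the π/6 prefactor corresponds to unit-diameter spheres). A minimal helper for going back and forth:

```python
from math import pi

def volume_fraction(N, L):
    """Phi_tot = (pi/6) * N / L**3 for N unit-diameter spheres
    in a cubic box of side L."""
    return (pi / 6) * N / L**3

def spheres_for(phi, L):
    """Number of spheres needed to reach a target volume fraction phi."""
    return round(phi * L**3 * 6 / pi)
```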
Procedia PDF Downloads 286
6190 Cardiothoracic Ratio in Postmortem Computed Tomography: A Tool for the Diagnosis of Cardiomegaly
Authors: Alex Eldo Simon, Abhishek Yadav
Abstract:
This study aimed to evaluate the utility of postmortem computed tomography (CT) and heart weight measurements in the assessment of cardiomegaly in cases of sudden death of cardiac origin by comparing the results of these two diagnostic methods. The study retrospectively analyzed postmortem computed tomography (PMCT) data from 54 cases of sudden natural death and compared the findings with those of the autopsy. The study involved measuring the cardiothoracic ratio (CTR) from coronal CT images and determining the actual cardiac weight by weighing the heart during the autopsy. The inclusion criteria for the study were cases of sudden death suspected to be caused by cardiac pathology, while the exclusion criteria included death due to unnatural causes such as trauma or poisoning, diagnosed natural causes of death related to organs other than the heart, and cases of decomposition. Sensitivity, specificity, and diagnostic accuracy were calculated, and to evaluate the accuracy of using the CTR to detect an enlarged heart, the study generated receiver operating characteristic (ROC) curves. The CTR is a radiological tool used to assess cardiomegaly by measuring the maximum cardiac diameter in relation to the maximum transverse diameter of the chest wall. The clinically used criterion for the CTR has been modified from 0.50 to 0.57 for use in postmortem settings, where abnormalities can be detected by comparing CTR values to this threshold. A CTR value of 0.57 or higher is suggestive of hypertrophy but not conclusive. Similarly, heart weight is measured during the traditional autopsy, and a cardiac weight greater than 450 grams is defined as hypertrophy. Of the 54 cases evaluated, 22 (40.7%) had a CTR above 0.50 and up to 0.57, and 12 cases (22.2%) had a CTR greater than 0.57, which was defined as hypertrophy. The mean CTR was calculated as 0.52 ± 0.06.
Among the 54 cases evaluated, the weight of the heart was measured, and the mean was calculated as 369.4 ± 99.9 grams. Of the 54 cases, 12 were found to have hypertrophy as defined by PMCT, while only 9 cases were identified with hypertrophy in the traditional autopsy. The sensitivity of the hypertrophy test was found to be 55.56% (95% CI: 26.66, 81.12), the specificity was 84.44% (95% CI: 71.22, 92.25), and the diagnostic accuracy was 79.63% (95% CI: 67.1, 88.23). A limitation of the study was the low sample size of only 54 cases, which may limit the generalizability of the findings. The comparison of the cardiothoracic ratio with heart weight in this study suggests that PMCT may serve as a screening tool for medico-legal autopsies when performed by forensic pathologists. However, it should be noted that the low sensitivity of the test (55.56%) may limit its diagnostic accuracy, and therefore further studies with larger sample sizes and more diverse populations are needed to validate these findings.
Keywords: PMCT, virtopsy, CTR, cardiothoracic ratio
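The reported percentages follow from an underlying 2×2 table. The counts used below (TP=5, FP=7, FN=4, TN=38) are inferred from the abstract's figures (9 autopsy-positive, 12 PMCT-positive, 54 total) rather than stated explicitly, so treat them as an assumption; the formulas themselves are standard.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and accuracy from a 2x2 contingency table
    (test = PMCT CTR >= 0.57, reference = autopsy heart weight > 450 g)."""
    return {
        "sensitivity": tp / (tp + fn),          # true positive rate
        "specificity": tn / (tn + fp),          # true negative rate
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }
```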
Procedia PDF Downloads 84
6189 Effect of Carbon-Free Fly Ash and Ground Granulated Blast-Furnace Slag on Compressive Strength of Mortar under Different Curing Conditions
Authors: Abdul Khaliq Amiri, Shigeyuki Date
Abstract:
This study investigates the effect of using carbon-free fly ash (CfFA) and ground granulated blast-furnace slag (GGBFS) on the compressive strength of mortar. The CfFA used in this investigation is a high-quality fly ash with a carbon content of 1.0% or less. In this study, three types of blends with a 30% water-binder ratio (w/b) were prepared: a control blend, binary blends, and ternary blends. The control blend contained only Ordinary Portland Cement (OPC); in the binary and ternary blends, OPC was partially replaced with CfFA and GGBFS at different substitution rates. Mortar specimens were cured for 1 day, 7 days, and 28 days under two curing conditions: steam curing and water curing. The steam-cured specimens were exposed to two different pre-curing times (1.5 h and 2.5 h) and one steam curing duration (6 h) at 45 °C. The test results showed that water-cured specimens exhibited higher compressive strength than steam-cured specimens at later ages. An increase in CfFA and GGBFS contents caused a decrease in the compressive strength of the mortar. Ternary mixes exhibited better compressive strength than binary mixes containing CfFA at the same replacement ratio of mineral admixtures.
Keywords: carbon-free fly ash, compressive strength, ground granulated blast-furnace slag, steam curing, water curing
Procedia PDF Downloads 143
6188 Comparative Analysis of Dissimilarity Detection between Binary Images Based on Equivalency and Non-Equivalency of Image Inversion
Authors: Adnan A. Y. Mustafa
Abstract:
Image matching is a fundamental problem that arises frequently in many aspects of robot and computer vision. It can become a time-consuming process when matching images against a database consisting of hundreds of images, especially if the images are large. One approach to reducing the time complexity of the matching process is to reduce the search space in a pre-matching stage by quickly removing dissimilar images. The Probabilistic Matching Model for Binary Images (PMMBI) showed that dissimilarity detection between binary images can be accomplished quickly by random pixel mapping and is size invariant. The model is based on the gamma binary similarity distance, which recognizes an image and its inverse as containing the same scene and hence considers them to be the same image. However, in many applications an image and its inverse are not treated as the same, but rather as dissimilar. In this paper, we present a comparative analysis of dissimilarity detection between PMMBI based on the gamma binary similarity distance and a modified PMMBI model based on a similarity distance that does distinguish between an image and its inverse as being dissimilar.
Keywords: binary image, dissimilarity detection, probabilistic matching model for binary images, image mapping
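The random-pixel-mapping idea behind the pre-matching stage can be illustrated as follows: sample a handful of random positions, estimate the mismatch fraction, and reject the pair early if it exceeds a threshold. This is an illustrative sketch only, not the PMMBI gamma distance itself (which additionally treats an image and its inverse as identical); the sample size and threshold are assumptions.

```python
import random

def quick_dissimilar(img_a, img_b, samples=64, threshold=0.25, seed=0):
    """Estimate the pixel mismatch fraction of two equally sized binary
    images from random samples; True means 'reject as dissimilar'."""
    rng = random.Random(seed)
    H, W = len(img_a), len(img_a[0])
    mismatches = 0
    for _ in range(samples):
        i, j = rng.randrange(H), rng.randrange(W)
        if img_a[i][j] != img_b[i][j]:
            mismatches += 1
    return mismatches / samples > threshold
```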
Procedia PDF Downloads 156
6187 Diagnostic Performance of Mean Platelet Volume in the Diagnosis of Acute Myocardial Infarction: A Meta-Analysis
Authors: Kathrina Aseanne Acapulco-Gomez, Shayne Julieane Morales, Tzar Francis Verame
Abstract:
Mean platelet volume (MPV) is the most accurate measure of platelet size and is routinely reported by most automated hematology analyzers. Several studies have shown associations between MPV and cardiovascular risks and outcomes. Although its measurement may provide useful data, MPV is a diagnostic tool yet to be included in routine clinical decision making. The aim of this systematic review and meta-analysis is to determine summary estimates of the diagnostic accuracy of mean platelet volume for the diagnosis of myocardial infarction among adult patients with angina and/or its equivalents, in terms of sensitivity, specificity, diagnostic odds ratio, and likelihood ratios, and to determine the difference in mean MPV values between those with MI and non-MI controls. The primary search was done through the electronic databases PubMed, Cochrane Review CENTRAL, HERDIN (Health Research and Development Information Network), Google Scholar, the Philippine Journal of Pathology, and the Philippine College of Physicians Philippine Journal of Internal Medicine. The reference lists of original reports were also searched. Cross-sectional, cohort, and case-control articles studying the diagnostic performance of mean platelet volume in the diagnosis of acute myocardial infarction in adult patients were included. Studies were included if: (1) the CBC was taken upon presentation to the ER or upon admission (within 24 hours of symptom onset); (2) myocardial infarction was diagnosed with serum markers, ECG, or according to guidelines accepted by the cardiology societies (American Heart Association (AHA), American College of Cardiology (ACC), European Society of Cardiology (ESC)); and (3) outcomes were measured as a significant difference and/or sensitivity and specificity. The authors independently screened all potential studies identified by the search for inclusion.
Eligible studies were appraised using well-defined criteria, and any disagreement between the reviewers was resolved through discussion and consensus. The overall mean MPV of those with MI (9.702 fl; 95% CI 9.07 – 10.33) was higher than that of the non-MI control group (8.85 fl; 95% CI 8.23 – 9.46). Interpretation of the calculated t-value of 2.0827 showed a significant difference between the mean MPV values of those with MI and those of the non-MI controls. The summary sensitivity (Se) and specificity (Sp) for MPV were 0.66 (95% CI 0.59 – 0.73) and 0.60 (95% CI 0.43 – 0.75), respectively. The pooled diagnostic odds ratio (DOR) was 2.92 (95% CI 1.90 – 4.50). The positive likelihood ratio of MPV in the diagnosis of myocardial infarction was 1.65 (95% CI 1.20 – 22.27), and the negative likelihood ratio was 0.56 (95% CI 0.50 – 0.64). The intended role for MPV in the diagnostic pathway of myocardial infarction would perhaps be best as a triage tool. With a DOR of 2.92, MPV values can discriminate between those who have MI and those without. For a patient with angina presenting with an elevated MPV, the pretest odds of MI are multiplied by 1.65. Thus, the decision to treat a patient with angina or its equivalents as a case of MI could be supported by an elevated MPV value.
Keywords: mean platelet volume, MPV, myocardial infarction, angina, chest pain
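The likelihood ratios and the diagnostic odds ratio follow from sensitivity and specificity by standard identities; a short sketch (the function name is ours) approximately reproduces the reported figures from Se = 0.66 and Sp = 0.60:

```python
def accuracy_metrics(sensitivity, specificity):
    """Standard diagnostic-accuracy identities:
    LR+ = Se / (1 - Sp), LR- = (1 - Se) / Sp, DOR = LR+ / LR-."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg, lr_pos / lr_neg

lr_pos, lr_neg, dor = accuracy_metrics(0.66, 0.60)
# lr_pos ~ 1.65; lr_neg ~ 0.57 and dor ~ 2.91 sit close to the
# reported 0.56 and 2.92, which were pooled across studies rather
# than derived from the summary Se/Sp.
```

The small discrepancies are expected, since the paper pooled each metric separately instead of computing them from the summary operating point.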
Procedia PDF Downloads 90
6186 Intelligent Diagnostic System of the Onboard Measuring Devices
Authors: Kyaw Zin Htut
Abstract:
In this article, the synthesis of an efficient intelligent diagnostic system for onboard aircraft measuring devices is described. The development of the diagnostic technology is considered based on the model errors of the gyro instruments used to measure the parameters of the aircraft. The synthesis of the intelligent diagnostic system is illustrated by the problem of assessing and forecasting the errors of gyroscope devices onboard an aircraft. The system detects faults in the aircraft measuring devices and analyzes the measuring equipment to improve the efficiency of its operation.
Keywords: diagnostic, dynamic system, errors of gyro instruments, model errors, assessment, prognosis
Procedia PDF Downloads 402
6185 Offline High Voltage Diagnostic Test Findings on 15MVA Generator of Basochhu Hydropower Plant
Authors: Suprit Pradhan, Tshering Yangzom
Abstract:
Even with the availability of modern online insulation diagnostic technologies such as partial discharge monitoring, measurements such as Dissipation Factor (tanδ), DC high voltage insulation current, Polarization Index (PI), and insulation resistance are still widely used as diagnostic tools to assess the condition of stator insulation in hydropower plants. To evaluate the condition of the stator winding insulation in one of the generators that has been in operation since 1999, diagnostic tests were performed on the stator bars of the 15 MVA generators of Basochhu Hydropower Plant. This paper presents a diagnostic study of the data gathered from measurements performed in 2015 and 2016 as part of regular maintenance, since no proper ageing data had been maintained since commissioning. Measurement results of Dissipation Factor, DC high potential tests, and Polarization Index are discussed with regard to their effectiveness in assessing the ageing condition of the stator insulation. After a brief review of the theoretical background, the strengths of each diagnostic method in detecting symptoms of insulation deterioration are identified. The results observed at Basochhu Hydropower Plant lead to the conclusion that Polarization Index and DC high voltage insulation current measurements are best suited to detecting humidity and contamination problems, while Dissipation Factor measurement is a robust indicator of long-term ageing caused by oxidative degradation.
Keywords: dissipation factor (tanδ), polarization index (PI), DC high voltage insulation current, insulation resistance (IR), tan delta tip-up, dielectric absorption ratio
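The index-type quantities named above are simple ratios of timed insulation-resistance readings. A minimal sketch with hypothetical megohm values (the numbers are illustrative, following the usual IEEE Std 43-style definitions, not plant data):

```python
def polarization_index(ir_1min, ir_10min):
    """PI: insulation resistance after 10 min of applied DC voltage
    divided by the 1-minute reading (IEEE Std 43-style definition)."""
    return ir_10min / ir_1min

def dielectric_absorption_ratio(ir_30s, ir_60s):
    """DAR: the 60-second insulation resistance over the 30-second one."""
    return ir_60s / ir_30s

# hypothetical megohm readings, not Basochhu data
pi = polarization_index(ir_1min=500.0, ir_10min=1600.0)        # 3.2
dar = dielectric_absorption_ratio(ir_30s=450.0, ir_60s=585.0)  # 1.3
```

A PI around 2 or above is commonly read as dry, sound insulation, while values near 1 suggest moisture or contamination, which is consistent with the finding above that PI responds well to humidity problems.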
Procedia PDF Downloads 317
6184 Quick Similarity Measurement of Binary Images via Probabilistic Pixel Mapping
Authors: Adnan A. Y. Mustafa
Abstract:
In this paper, we present a quick technique to measure the similarity between binary images. The technique is based on a probabilistic mapping approach and is fast because only a minute percentage of the image pixels, not the whole image, need to be compared to measure the similarity. We exploit the power of the Probabilistic Matching Model for Binary Images (PMMBI) to arrive at an estimate of the similarity. We show that the estimate is a good approximation of the actual value, and that its quality can be improved further with more image mappings. Furthermore, the technique is image size invariant: the similarity between big images can be measured as fast as that for small images. Examples of trials conducted on real images are presented.
Keywords: big images, binary images, image matching, image similarity
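The sampling idea can be sketched as follows: estimate the fraction of agreeing pixels from a small random sample of positions rather than a full scan, so the cost depends on the sample size, not the image size. This is a simplified illustration of the approach, not the exact PMMBI estimator; the name and sample counts are assumptions.

```python
import random

def estimate_similarity(img_a, img_b, n_samples, seed=0):
    """Estimate the fraction of agreeing pixels between two equal-size
    binary images (nested 0/1 lists) from a random sample of positions,
    so the cost depends on n_samples, not the image size."""
    rng = random.Random(seed)
    rows, cols = len(img_a), len(img_a[0])
    agree = 0
    for _ in range(n_samples):
        r, c = rng.randrange(rows), rng.randrange(cols)
        agree += img_a[r][c] == img_b[r][c]
    return agree / n_samples
```

On a 100 × 100 pair whose exact similarity is 0.70, a sample of a few thousand positions typically lands within a couple of percent of the true value, and enlarging the sample tightens the estimate further.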
Procedia PDF Downloads 202
6183 Motion of an Infinitesimal Particle in Binary Stellar Systems: Kepler-34, Kepler-35, Kepler-16, Kepler-413
Authors: Rajib Mia, Badam Singh Kushvah
Abstract:
The present research was motivated by the recent discovery of binary star systems. In this paper, we use the restricted three-body problem in binary stellar systems, considering the photogravitational effects of both stars. The aim of this study is to investigate the motion of an infinitesimal mass in the vicinity of the Lagrangian points. The stability and periodic orbits of the collinear points, and the stability and trajectories of the triangular points, are studied in the stellar binary systems Kepler-34, Kepler-35, Kepler-413, and Kepler-16. A detailed comparison is made among the periodic orbits and trajectories.
Keywords: exoplanetary systems, Lagrangian points, periodic orbit, restricted three-body problem, stability
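For orientation, the classical (non-photogravitational) circular restricted three-body problem already fixes where the triangular points sit and whether they are linearly stable, given only the mass parameter. A sketch using approximate published masses for Kepler-16; the values are illustrative inputs, and the photogravitational terms studied in the paper are ignored here:

```python
import math

# Approximate stellar masses (solar masses) for Kepler-16; illustrative
# inputs only, with photogravitational effects ignored.
m1, m2 = 0.69, 0.20

mu = m2 / (m1 + m2)                       # mass parameter of the CRTBP
L4 = (0.5 - mu, math.sqrt(3) / 2)         # triangular point, rotating-frame units
mu_crit = 0.5 * (1 - math.sqrt(69) / 9)   # Routh critical value, ~0.03852
l4_linearly_stable = mu < mu_crit         # classical linear-stability test
```

With mu ≈ 0.22, well above the Routh value, the classical triangular points of such a binary are linearly unstable, which is one reason detailed trajectory studies and radiation-pressure corrections are of interest in these systems.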
Procedia PDF Downloads 437
6182 Spectral Analysis Approaches for Simultaneous Determination of Binary Mixtures with Overlapping Spectra: An Application on Pseudoephedrine Sulphate and Loratadine
Authors: Sara El-Hanboushy, Hayam Lotfy, Yasmin Fayez, Engy Shokry, Mohammed Abdelkawy
Abstract:
Simple, specific, accurate, and precise spectrophotometric methods are developed and validated for the simultaneous determination of pseudoephedrine sulphate (PSE) and loratadine (LOR) in a combined dosage form based on spectral analysis techniques. PSE in the binary mixture can be analyzed either by using its resolved zero-order absorption spectrum at its λmax of 256.8 nm after subtraction of the LOR spectrum, or in the presence of the LOR spectrum by the absorption correction method at 256.8 nm, the dual wavelength (DWL) method at 254 nm and 273 nm, the induced dual wavelength (IDWL) method at 256 nm and 272 nm, and the ratio difference (RD) method at 256 nm and 262 nm. LOR in the mixture can be analyzed directly at 280 nm without any interference from the PSE spectrum, or at 250 nm using its recovered zero-order absorption spectrum obtained by constant multiplication (CM). In addition, PSE and LOR in their mixture can be determined simultaneously by the induced amplitude modulation method (IAM) coupled with amplitude multiplication (PM).
Keywords: dual wavelength (DW), induced amplitude modulation method (IAM) coupled with amplitude multiplication (PM), loratadine, pseudoephedrine sulphate, ratio difference (RD)
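The ratio difference (RD) step can be demonstrated on synthetic spectra: dividing the mixture spectrum by the divisor analyte's unit-concentration spectrum turns the divisor's contribution into a constant, so the amplitude difference at two wavelengths depends only on the other analyte. The Gaussian band shapes, concentrations, and wavelengths below are invented for the demo, not the measured PSE/LOR spectra:

```python
import math

def gauss(wl, center, width, height):
    """Synthetic Gaussian absorption band."""
    return height * math.exp(-((wl - center) / width) ** 2)

def eA(wl):   # invented unit-concentration spectrum of analyte A
    return gauss(wl, 256.0, 8.0, 1.0)

def eB(wl):   # invented unit-concentration spectrum of analyte B (divisor)
    return gauss(wl, 275.0, 10.0, 1.0)

def ratio_difference(spectrum, divisor, wl1, wl2):
    """Amplitude difference of the ratio spectrum at two wavelengths;
    the divisor analyte contributes a constant that cancels out."""
    return spectrum(wl1) / divisor(wl1) - spectrum(wl2) / divisor(wl2)

cA, cB = 0.8, 1.3                        # "unknown" mixture concentrations

def mixture(wl):
    return cA * eA(wl) + cB * eB(wl)     # Beer-Lambert additivity

wl1, wl2 = 256.0, 262.0
slope = ratio_difference(eA, eB, wl1, wl2)                   # pure-A calibration
cA_found = ratio_difference(mixture, eB, wl1, wl2) / slope   # recovers cA
```

Because the divisor term becomes the constant cB everywhere in the ratio spectrum, the two-wavelength difference removes it exactly, and the recovered concentration equals the one used to build the mixture.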
Procedia PDF Downloads 326
6181 Binary Metal Oxide Catalysts for Low-Temperature Catalytic Oxidation of HCHO in Air
Authors: Hanjie Xie, Raphael Semiat, Ziyi Zhong
Abstract:
It is well known that many oxidation reactions in nature are closely related to the origin of life and to life activities. One feature of these natural reactions is that they proceed under mild conditions, employing molecular oxygen (O₂) in the air as the oxidant and enzymes as catalysts. Catalysis is also essential to human life, as many chemical and pharmaceutical industrial processes rely on catalysts. However, most heterogeneous catalytic reactions must be run at high operating temperatures and pressures. It is not strange that, in recent years, research interest has been redirected to green catalysis, e.g., running catalytic reactions under relatively mild conditions as much as possible, which requires green solvents, green oxidants such as O₂ (particularly air), and novel catalysts. This work reports efficient binary Fe-Mn metal oxide catalysts for the low-temperature oxidation of formaldehyde (HCHO), a toxic pollutant in the air, particularly in indoor environments. We prepared a series of nanosized Fe-Mn oxide catalysts and found that at a molar ratio of Fe/Mn = 1:1, the catalyst exhibited the highest catalytic activity. At room temperature, we achieved the complete oxidation of HCHO on this catalyst for 20 h at a high GHSV of 150 L g⁻¹ h⁻¹. After a systematic investigation of the catalyst structure and the reaction, we identified the reaction intermediates, including dioxymethylene, formate, and carbonate. The oxygen vacancies and the derived active oxygen species contribute to this low-temperature catalytic activity. These findings deepen the understanding of the catalysis of binary Fe-Mn metal oxide catalysts.
Keywords: oxygen vacancy, catalytic oxidation, binary transition oxide, formaldehyde
Procedia PDF Downloads 135
6180 Diagnostic Assessment for Mastery Learning of Engineering Students with a Bayesian Network Model
Authors: Zhidong Zhang, Yingchen Yang
Abstract:
In this study, a diagnostic assessment model for mastery of engineering learning was established based on a group of undergraduate students enrolled in an engineering course. A diagnostic assessment model can examine both students' learning processes and their achievement results. One unique characteristic is that the model can recognize errors and anything blocking students in their learning processes. The feedback helps students learn how to solve their learning problems with alternative strategies, and helps the instructor find alternative pedagogical strategies for the instructional design. Dynamics, the course studied here, is a core course shared by several engineering programs and is very challenging for engineering students, so knowledge acquisition and problem-solving skills are crucial for student success. Developing an effective and valid assessment model for student learning is therefore of great importance, and diagnostic assessment is such a model, providing effective feedback for both students and the instructor in the mastery of engineering learning.
Keywords: diagnostic assessment, mastery learning, engineering, Bayesian network model, learning processes
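A Bayesian-network-style diagnosis can be illustrated at its smallest scale: a single mastery node updated from scored item responses via Bayes' rule with slip and guess probabilities. This is a generic sketch of the mechanism, not the model of the paper; all parameter values are invented.

```python
def update_mastery(prior, correct, slip=0.1, guess=0.2):
    """Bayes update of P(mastery) from one scored item response.
    slip = P(wrong | mastery); guess = P(correct | no mastery).
    All parameter values are invented for illustration."""
    if correct:
        num = prior * (1 - slip)
        den = num + (1 - prior) * guess
    else:
        num = prior * slip
        den = num + (1 - prior) * (1 - guess)
    return num / den

p = 0.5                                  # neutral prior on mastery
for outcome in (True, True, False, True):
    p = update_mastery(p, outcome)       # p ends around 0.92
```

A full diagnostic network would attach one such node per skill and condition items on several skills at once, but the same posterior update drives the feedback in either case.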
Procedia PDF Downloads 155
6179 Binary Programming for Manufacturing Material and Manufacturing Process Selection Using Genetic Algorithms
Authors: Saleem Z. Ramadan
Abstract:
The material selection problem is concerned with determining the right material for a certain product to optimize certain performance indices of that product, such as mass, energy density, and power-to-weight ratio. This paper is concerned with optimizing the selection of the manufacturing process along with the material used in the product, under performance-index and availability constraints. The material selection problem is formulated using binary programming and solved by a genetic algorithm. The objective function of the model minimizes the total manufacturing cost subject to performance-index and material and manufacturing-process availability constraints.
Keywords: optimization, material selection, process selection, genetic algorithm
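A minimal sketch of the formulation: a binary chromosome with a one-hot segment per decision (material, process), a fitness that adds penalty terms for infeasible encodings and violated performance constraints, and a simple elitist GA loop. All costs, scores, and GA settings below are invented for illustration, not taken from the paper.

```python
import random

# Invented option data: (unit cost, performance score) per choice.
materials = {"steel": (3.0, 7), "aluminium": (5.0, 9), "polymer": (2.0, 4)}
processes = {"casting": (4.0, 6), "machining": (6.0, 9), "moulding": (1.5, 5)}
MIN_PERFORMANCE = 13          # combined score the product must reach

mat_names, proc_names = list(materials), list(processes)

def decode(bits):
    """A valid chromosome has exactly one 1 in each one-hot segment."""
    m, p = bits[:len(mat_names)], bits[len(mat_names):]
    if sum(m) != 1 or sum(p) != 1:
        return None
    return mat_names[m.index(1)], proc_names[p.index(1)]

def fitness(bits):                       # lower is better
    choice = decode(bits)
    if choice is None:
        return 1e9                       # heavily penalise invalid encodings
    mat, proc = choice
    cost = materials[mat][0] + processes[proc][0]
    perf = materials[mat][1] + processes[proc][1]
    return cost if perf >= MIN_PERFORMANCE else cost + 100.0

def ga(pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    n = len(mat_names) + len(proc_names)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]  # elitist truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]    # one-point crossover
            if rng.random() < 0.2:       # single-bit mutation
                i = rng.randrange(n)
                child[i] ^= 1
            children.append(child)
        pop = survivors + children
    best = min(pop, key=fitness)
    return decode(best), fitness(best)
```

With these invented numbers, brute force over the nine pairings shows the cheapest feasible choice is aluminium with moulding at a cost of 6.5, which the GA typically recovers within a few dozen generations.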
Procedia PDF Downloads 423