Search results for: measurement and verification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3101

2861 Measurement of Reverse Flow Generated at Cold Exit of Vortex Tube

Authors: Mohd Hazwan bin Yusof, Hiroshi Katanoda

Abstract:

In order to clarify the structure of the cold flow discharged from the vortex tube (VT), the pressure of the cold flow was measured, and a simple flow visualization technique using a 0.75 mm-diameter needle and oily paint was applied to study the reverse flow at the cold exit. The results clearly show that regions of negative and positive pressure exist over certain ranges of inlet pressure and cold fraction, and that reverse flow is observed in the negative-pressure region.

Keywords: flow visualization, pressure measurement, reverse flow, vortex tube

Procedia PDF Downloads 486
2860 Measurement Errors and Misclassifications in Covariates in Logistic Regression: Bayesian Adjustment of Main and Interaction Effects and the Sample Size Implications

Authors: Shahadut Hossain

Abstract:

Measurement errors in continuous covariates and/or misclassifications in categorical covariates are common in epidemiological studies. Regression analysis that ignores such mismeasurements seriously biases the estimated main and interaction effects of covariates on the outcome of interest. Thus, adjustments for such mismeasurements are necessary. In this research, we propose a Bayesian parametric framework for eliminating the deleterious impacts of covariate mismeasurements in logistic regression. The proposed adjustment method is unified and can thus be applied to any generalized linear or non-linear regression model. Furthermore, adjustment for covariate mismeasurements requires validation data, usually in the form of either gold-standard measurements or replicates of the mismeasured covariates on a subset of the study population. Initial investigation shows that the adequacy of such adjustment depends on the sizes of the main and validation samples, especially when the prevalences of the categorical covariates are low. Thus, we investigate the impact of main and validation sample sizes on the adjusted estimates, and provide a general guideline about these sample sizes based on simulation studies.
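
As a minimal illustration of the problem being corrected (not the authors' Bayesian implementation; the sample size, error variance, and true slope below are assumed for demonstration), a short Python simulation shows how a naively fitted logistic regression attenuates the slope of a mismeasured covariate:

```python
# Simulate a binary outcome driven by a true covariate, then fit logistic
# regression on the true covariate and on a noisy surrogate of it.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
x_true = rng.normal(0.0, 1.0, n)            # true covariate
x_obs = x_true + rng.normal(0.0, 0.8, n)    # surrogate with measurement error
p = 1.0 / (1.0 + np.exp(-(-1.0 + 1.0 * x_true)))  # true intercept -1, slope 1
y = rng.binomial(1, p)

for label, x in [("true covariate", x_true), ("mismeasured covariate", x_obs)]:
    fit = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
    # The mismeasured fit shows the attenuation the Bayesian adjustment targets.
    print(f"{label}: slope estimate = {fit.params[1]:.3f}")
```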

Keywords: measurement errors, misclassification, mismeasurement, validation sample, Bayesian adjustment

Procedia PDF Downloads 384
2859 Opacity Synthesis with Orwellian Observers

Authors: Moez Yeddes

Abstract:

The property of opacity is widely used in the formal verification of security in computer systems and protocols. Opacity is a general language-theoretic framework in which several security properties of a system can be expressed. A secret behaviour of a system is opaque if a passive attacker can never deduce its occurrence from observation of the system. Instead of considering static observability, where the set of observable events is fixed off-line, or dynamic observability, where the set of observable events changes over time depending on the history of the trace, we introduce Orwellian partial observability, where unobservable events are not revealed as long as no downgrading event occurs in the future of the trace. Orwellian partial observability is needed to model intransitive information flow; this kind of observability is known as the ipurge function. In previous work, we showed that verifying whether a regular secret is opaque for a regular language L w.r.t. an Orwellian projection is PSPACE-complete, while the problem has been proved undecidable for a regular language L w.r.t. a general Orwellian observation function. In this paper, we address two problems of opacification of a regular secret ϕ for a regular language L w.r.t. an Orwellian projection: given L and a secret ϕ ∈ L, the first problem consists in computing some minimal regular super-language M of L, if it exists, such that ϕ is opaque for M; the second consists in computing the supremal sub-language M′ of L such that ϕ is opaque for M′. We derive both language-theoretic characterizations and algorithms to solve these two dual problems.
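
To make the opacity condition concrete, here is a minimal Python sketch over finite trace sets using a simple static projection (not the paper's Orwellian observation function or its synthesis algorithms; traces and event names are invented for illustration):

```python
# A secret trace is "covered" if some non-secret trace produces the same
# observation, so a passive attacker cannot deduce that the secret occurred.
def project(trace, observable):
    """Observation of a trace: keep only the observable events."""
    return tuple(e for e in trace if e in observable)

def is_opaque(language, secret, observable):
    """True if every secret trace is observationally covered by a non-secret one."""
    non_secret_views = {project(t, observable) for t in language if t not in secret}
    return all(project(t, observable) in non_secret_views for t in secret)

L = {("a", "h", "b"), ("a", "b")}   # h is the secret-revealing event
secret = {("a", "h", "b")}
print(is_opaque(L, secret, observable={"a", "b"}))  # True: both traces look like (a, b)
```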

Keywords: security policies, opacity, formal verification, Orwellian observation

Procedia PDF Downloads 201
2858 Vortex Separator for More Accurate Air Dry-Bulb Temperature Measurement

Authors: Ahmed N. Shmroukh, I. M. S. Taha, A. M. Abdel-Ghany, M. Attalla

Abstract:

Fog systems for cooling and humidification are still in limited use, although they require less initial cost than other cooling systems such as pad-and-fan systems. The main reason for this limited use is the poor control of fog systems, which leads to undesirable relative humidity and air temperature inside the cooled or humidified space. Any accurate control system essentially needs the air dry-bulb temperature as an input parameter; therefore, the air dry-bulb temperature in the space needs to be measured accurately. The scope of the present work is the separation of fog droplets from the air in a fogged space so that the air dry-bulb temperature can be measured accurately. The separation is done in a small device inside which the sensor of the temperature measuring instrument is positioned; a vortex separator was designed and used for this purpose. A reference device was used for measuring the air temperature without separation, and a comparative study was performed to identify the device that leads to the most accurate measurement of air dry-bulb temperature. The results showed that the proposed devices shifted the measured air dry-bulb temperature in the correct direction relative to the free junction, with the vortex device performing best: it increased the temperature reading over that of the free junction by around 2 to 6°C for different fog on-off durations.

Keywords: fog systems, measuring air dry bulb temperature, temperature measurement, vortex separator

Procedia PDF Downloads 264
2857 Counting People Utilizing Space-Time Imagery

Authors: Ahmed Elmarhomy, K. Terada

Abstract:

An automated method for counting passersby using virtual vertical measurement lines is proposed. The space-time image represents the human regions, which are extracted using a segmentation process. Different color spaces were used to perform the template matching, and a proper template match determines the direction and speed of passing people. Distinguishing between one and two passersby was investigated using the correlation between passerby speed and the human-pixel area. Finally, the effectiveness of the presented method was experimentally verified.
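
The template-matching step can be sketched with OpenCV as follows (an illustrative fragment, not the authors' pipeline; the file names and the 0.8 similarity threshold are assumptions):

```python
# Match a passerby template against a space-time image and report the
# best-scoring location; detection is thresholded on the match score.
import cv2

space_time = cv2.imread("space_time.png")        # hypothetical space-time image
template = cv2.imread("passerby_template.png")   # hypothetical passerby template
scores = cv2.matchTemplate(space_time, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(scores)
if max_val > 0.8:                                # assumed similarity threshold
    print("passerby matched at", max_loc, "score", round(max_val, 3))
```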

Keywords: counting people, measurement line, space-time image, segmentation, template matching

Procedia PDF Downloads 424
2856 Comparative Safety Performance Evaluation of Profiled Deck Composite Slab from the Use of Slope-Intercept and Partial Shear Methods

Authors: Izian Abd. Karim, Kachalla Mohammed, Nora Farah Abd Aznieta Aziz, Law Teik Hua

Abstract:

The economic use and ease of construction of profiled deck composite slabs are marred by the complex and uneconomic strength verification required for serviceability and general safety considerations. Factors such as shear span length, deck geometry and mechanical friction greatly influence the longitudinal shear strength, which determines the ultimate strength of a profiled deck composite slab. Among the methods available for its determination, Eurocode 4 provides two: the partial shear method and the slope-intercept (m-k) method. Owing to the complexity of the shear behaviour of profiled deck composite slabs, however, these two methods yield different and conflicting load-carrying capacities. Coupled with the time and cost constraints associated with strength verification, this is a source of growing concern, and the issue is critical. Treating some of the known shear-strength influencing factors as random variables, the load-carrying-capacity violations of profiled deck composite slabs obtained from the two Eurocode 4 methods are determined using a reliability approach and comparatively studied. The study reveals that safety values from the m-k method compare favourably with those from the partial shear method.
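
The reliability idea can be sketched as follows; note this uses crude Monte Carlo rather than the paper's first-order reliability method, and the resistance/load distributions are assumed placeholders, not the paper's limit-state model:

```python
# Estimate the probability that load exceeds resistance, g = R - S < 0,
# and convert it to a reliability index via the standard normal quantile.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 200_000
resistance = rng.lognormal(mean=np.log(120.0), sigma=0.10, size=n)  # kN, assumed
load = rng.normal(loc=80.0, scale=12.0, size=n)                     # kN, assumed

g = resistance - load            # limit-state function: failure when g < 0
pf = np.mean(g < 0.0)            # Monte Carlo estimate of failure probability
beta = -norm.ppf(pf)             # corresponding reliability index
print(f"Pf = {pf:.2e}, reliability index beta = {beta:.2f}")
```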

Keywords: composite slab, first order reliability method, longitudinal shear, partial shear connection, slope-intercept

Procedia PDF Downloads 329
2855 Near Infrared Spectrometry to Determine the Quality of Milk, Experimental Design Setup and Chemometrics: Review

Authors: Meghana Shankara, Priyadarshini Natarajan

Abstract:

Infrared (IR) spectroscopy has revolutionized the way we look at materials around us. Unraveling the patterns in the molecular spectra of materials to analyze their composition and properties has been one of the most interesting challenges in modern science. Applications of IR spectrometry are numerous in fields such as pharmaceuticals, health, food and nutrition, oils, agriculture, construction, polymers, beverages and fabrics, limited only by the curiosity of the people. Near Infrared (NIR) spectrometry is applied robustly to the analysis of solid and liquid substances because it is non-destructive. In this paper, we review the application of NIR spectrometry to milk quality analysis and present the modes of measurement applied in the NIRS measurement setup, Design of Experiment (DoE), and the classification/quantification algorithms used for milk composition prediction (Fat%, Protein%, Lactose%, Solids Not Fat (SNF%)), along with different approaches for adulterant identification. We also discuss the important NIR ranges for the chosen milk parameters. The performance metrics used in the comparison of the various chemometric approaches include Root Mean Square Error (RMSE), R^2, slope, offset, sensitivity, specificity and accuracy.
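
A typical chemometric workflow of the kind reviewed here, partial least squares regression scored by RMSE and R^2, can be sketched in a few lines (synthetic spectra; all sizes and the 10-component choice are illustrative assumptions):

```python
# Fit PLS regression of a milk parameter (e.g. Fat%) on NIR spectra and
# report the two performance metrics named in the review.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 700))                        # 200 samples x 700 wavelengths
y = 0.5 * X[:, 100] + rng.normal(scale=0.1, size=200)  # synthetic reference values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=10).fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()
print("RMSE:", np.sqrt(mean_squared_error(y_te, y_hat)))
print("R2:  ", r2_score(y_te, y_hat))
```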

Keywords: chemometrics, design of experiment, milk quality analysis, NIRS measurement modes

Procedia PDF Downloads 240
2854 Evaluating Contextually Targeted Advertising with Attention Measurement

Authors: John Hawkins, Graham Burton

Abstract:

Contextual targeting is a common advertising strategy that places marketing messages in media locations expected to be aligned with the target audience. There are several major challenges to contextual targeting: the ideal categorisation scheme needs to be known, as well as the most appropriate subsections of that scheme for a given campaign or creative. In addition, campaign reach is typically limited when targeting becomes narrow, so a balance must be struck between requirements. Finally, refinement of the process is limited by evaluation methods that are either rapid but non-specific (click-through rates) or reliable but slow and costly (conversions or brand recall studies). In this study, we evaluate the use of attention measurement as a technique for understanding the performance of targeting on specific contextual topics. We perform the analysis using a large-scale dataset of impressions categorised using the IAB V2.0 taxonomy. We evaluate multiple levels of the categorisation hierarchy, using categories at different positions within an initial creative-specific ranking. The results illustrate that attention time is an effective signal for the performance of a specific creative within a specific context, and that performance is sustained across a ranking of categories from one period to another.

Keywords: contextual targeting, digital advertising, attention measurement, marketing performance

Procedia PDF Downloads 79
2853 Orbit Determination Modeling with Graphical Demonstration

Authors: Assem M. F. Sallam, Ah. El-S. Makled

Abstract:

In this paper, a software application that can be used swiftly across different preliminary orbit determination methods is implemented, verified, and graphically demonstrated. A passive orbit determination method is used in this study to determine the location of a satellite or a flying body; it is called passive because it depends on observation alone, without the use of any aids (radio or laser) installed on the satellite. In order to understand how these methods work and how accurate their output is when compared with available verification data, the built models clarify the different inputs used with each method. Output from the different orbit determination methods (Gibbs, Lambert, and Gauss) is compared and verified against data obtained from the Satellite Tool Kit (STK) application. A modified model including all of the orbit determination methods with the same input is introduced to investigate the different models' outputs (orbital parameters) for the same input (azimuth, elevation, and time). The simulation software is implemented in MATLAB, and a Graphical User Interface (GUI) application named OrDet is produced using the GUI of MATLAB. It accepts all the available inputs and outputs the current Classical Orbital Elements (COE) of the satellite under observation. The produced COE are then propagated for a complete revolution and plotted in a 3-D view. The modified model uses an adapter to pass the same input parameters to the preliminary orbit determination methods under study. Results from all orbit determination methods yield exactly the same COE output, which shows the equality of concept in determining a satellite's location despite the different numerical methods.
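
As a sketch of what one of the compared methods computes, a minimal Gibbs solver is shown below; it recovers the velocity at the middle of three position observations (this is the textbook Gibbs formulation, not the OrDet code, and the position vectors are illustrative):

```python
# Gibbs preliminary orbit determination: velocity at the middle of three
# coplanar position vectors, from which the COE can then be derived.
import numpy as np

MU = 398600.4418  # km^3/s^2, Earth's gravitational parameter

def gibbs(r1, r2, r3, mu=MU):
    """Return the velocity vector (km/s) at the middle observation r2."""
    z12, z23, z31 = np.cross(r1, r2), np.cross(r2, r3), np.cross(r3, r1)
    m1, m2, m3 = np.linalg.norm(r1), np.linalg.norm(r2), np.linalg.norm(r3)
    n = m1 * z23 + m2 * z31 + m3 * z12
    d = z12 + z23 + z31
    s = (m2 - m3) * r1 + (m3 - m1) * r2 + (m1 - m2) * r3
    lg = np.sqrt(mu / (np.linalg.norm(n) * np.linalg.norm(d)))
    return lg * (np.cross(d, r2) / m2 + s)

r1 = np.array([-294.32, 4265.10, 5986.70])   # km, illustrative coplanar positions
r2 = np.array([-1365.50, 3637.60, 6346.80])
r3 = np.array([-2940.30, 2473.70, 6555.80])
print("v2 =", gibbs(r1, r2, r3), "km/s")
```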

Keywords: orbit determination, STK, Matlab-GUI, satellite tracking

Procedia PDF Downloads 241
2852 Development of an Atmospheric Radioxenon Detection System for Nuclear Explosion Monitoring

Authors: V. Thomas, O. Delaune, W. Hennig, S. Hoover

Abstract:

Measurement of the radioactive isotopes of atmospheric xenon is used to detect, locate and identify confined nuclear tests under the Comprehensive Nuclear-Test-Ban Treaty (CTBT). In this context, the French Alternative Energies and Atomic Energy Commission (CEA) has developed a fixed device, the SPALAX process, to continuously measure the concentration of these fission products. During its atmospheric transport, the radioactive xenon undergoes significant dilution between the source point and the measurement station. Given the distances between the fixed stations located all over the globe, the typical volume activities measured are near 1 mBq m⁻³. To avoid the constraints induced by atmospheric dilution, the development of a mobile detection system is in progress; this system will allow on-site measurements in order to confirm or refute a suspicious measurement detected by a fixed station. Furthermore, this system will use the beta/gamma coincidence measurement technique in order to drastically reduce the environmental background (which masks such activities). The detector prototype consists of a gas cell surrounded by two large silicon wafers, coupled with two square NaI(Tl) detectors. The gas cell has a sample volume of 30 cm³, and the silicon wafers are 500 µm thick with an active surface area of 3600 mm². In order to minimize leakage current, each wafer has been segmented into four independent silicon pixels. The cell is sandwiched between two low-background NaI(Tl) detectors (70x70x40 mm³ crystals). The expected Minimal Detectable Concentration (MDC) for each radioxenon isotope is of the order of 1-10 mBq m⁻³. Three 4-channel digital acquisition modules (Pixie-NET) are used to process all the signals, with time synchronization ensured by a dedicated PTP network using the IEEE 1588 Precision Time Protocol. We present this system from its simulation to the laboratory tests.
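
The beta/gamma coincidence idea reduces to matching event timestamps from the silicon and NaI(Tl) channels within a short window. A minimal sketch follows (synthetic timestamps; the 1 ms window is an assumed parameter, not the instrument's setting):

```python
# For each beta event, find the nearest gamma event and count it as
# coincident if the time difference falls inside the coincidence window.
import numpy as np

rng = np.random.default_rng(0)
beta_t = np.sort(rng.uniform(0.0, 1.0, 200))    # beta event times, s (synthetic)
gamma_t = np.sort(rng.uniform(0.0, 1.0, 180))   # gamma event times, s (synthetic)
window = 1e-3                                   # coincidence window, s (assumed)

j = np.clip(np.searchsorted(gamma_t, beta_t), 1, len(gamma_t) - 1)
nearest = np.minimum(np.abs(gamma_t[j] - beta_t), np.abs(gamma_t[j - 1] - beta_t))
print(f"{int(np.sum(nearest < window))} coincident pairs out of {len(beta_t)} beta events")
```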

Keywords: beta/gamma coincidence technique, low level measurement, radioxenon, silicon pixels

Procedia PDF Downloads 104
2851 Comparative Evaluation of EBT3 Film Dosimetry Using Flatbed Scanner, Densitometer and Spectrophotometer Methods and Its Applications in Radiotherapy

Authors: K. Khaerunnisa, D. Ryangga, S. A. Pawiro

Abstract:

Over the past few decades, film dosimetry has become a tool used in various radiotherapy modalities, either for clinical quality assurance (QA) or for dose verification. The response of the film to irradiation is usually expressed as optical density (OD) or net optical density (netOD). Since the film's response to radiation is not linear, the use of film as a dosimeter must go through a calibration process. This study aimed to compare the calibration-curve response functions of three measurement methods: a flatbed scanner, a point densitometer, and a spectrophotometer. For every response function, a radiochromic film calibration curve was generated for each method, followed by accuracy, precision and sensitivity analysis. netOD is obtained by measuring the change in the optical density of the film between before and after irradiation: with the film scanner, ImageJ is used to extract the pixel values of the film on the red channel of the three channels (RGB); with the point densitometer, the change in OD before and after irradiation is calculated directly; and with the spectrophotometer, the change in absorbance before and after irradiation is calculated. The results showed that all three calibration methods gave readings with a netOD precision below 3% for an uncertainty value of 1σ (one sigma). The sensitivities of the three methods show the same trend in film response to radiation, but differ in magnitude. The accuracies of the three methods are below 3% for doses above 100 cGy and 200 cGy, but below 100 cGy the accuracy was found to be above 3% when using the point densitometer and the spectrophotometer. When the three methods were used for clinical implementation, the results showed accuracy and precision below 2% for the scanner and the spectrophotometer, and above 3% for the point densitometer.
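
The netOD computation for the scanner method can be sketched as follows (illustrative only; file names are hypothetical, and the red-channel averaging stands in for the ImageJ step described above):

```python
# netOD = log10(PV_before / PV_after), using mean red-channel pixel values
# of the film scanned before and after irradiation.
import numpy as np
from PIL import Image

def mean_red(path):
    """Mean red-channel pixel value of a scanned film image."""
    return np.asarray(Image.open(path).convert("RGB"))[:, :, 0].mean()

pv_before = mean_red("film_unexposed.tif")   # hypothetical scan files
pv_after = mean_red("film_exposed.tif")
net_od = np.log10(pv_before / pv_after)
print(f"netOD = {net_od:.4f}")
```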

Keywords: calibration methods, EBT3 film dosimetry, flatbed scanner, densitometer, spectrophotometer

Procedia PDF Downloads 101
2850 Labour Productivity Measurement and Control Standards for Hotels

Authors: Kristine Joy Simpao

Abstract:

Improving labour productivity is one of the most enthralling and challenging aspects of managing hotel and restaurant businesses. The demand to secure consistent productivity has become an increasingly pivotal role of managers seeking to survive and sustain the business. Besides making the business profitable, they are bound to make every resource productive and effective towards achieving company goals while maximizing the value of the organization. This paper examines what productivity means to the services industry, in particular to the hotel industry. This is underpinned by an investigation of the extent to which respondent hotels practice the labour productivity aspect in the areas of materials management, human resource management and leadership management, computing labour productivity ratios using simple hotel productivity ratios in order to find suitable measurement and control standards for hotels, with SBMA, Olongapo City as the locale of the study. The findings show that hotel labour productivity ratings are not perfect, with some practices falling far below expectations, particularly in strategic and operational decisions for improving the performance and productivity of human resources. They further show no significant difference in ratings among the respondent types in all areas, indicating a shared perception of the weak implementation of some of the indicators in labour productivity practices. Furthermore, the computed labour productivity efficiency ratios show that the number of employees and labour productivity practices are inversely proportional. This study provides potential measurement and control standards for the enhancement of hotel labour productivity. These standards should also contain labour productivity measures customized for standard hotels in the Subic Bay Freeport Zone to assist hotel owners in increasing labour productivity while meeting company goals and objectives effectively.

Keywords: labour productivity, hotel, measurement and control, standards, efficiency ratios, practices

Procedia PDF Downloads 290
2849 Flexural Strengthening of Steel Beams Using Fiber Reinforced Polymers

Authors: Sally Hosny, Mona G. Ibrahim, N. K. Hassan

Abstract:

Fiber reinforced polymer (FRP) is one of the most environmentally friendly methods for strengthening and retrofitting steel structures. The behaviour of steel I-beams flexurally strengthened using FRP was investigated. Finite element (FE) models were developed using ANSYS® as verification cases to simulate the experimental behaviour of using FRP strips to flexurally strengthen steel I-beams. Two experimental studies were selected for verification: the first examined the effect of different thicknesses and moduli of elasticity, while the second studied the effect of applying different carbon fiber reinforced polymer (CFRP) bond lengths. The proposed FE models were in good agreement with the experimental results in terms of failure modes, load-bearing capacities and strain distribution on the CFRP strips. The verified FE models were then utilized to conduct a parametric study in which various widths (40, 50, 60, 70 and 80 mm), thicknesses (1.2, 2 and 4 mm) and lengths (1500, 1700 and 1800 mm) of CFRP were analyzed. The results clearly revealed that the load-bearing capacity was significantly increased (+7%) when the width and thickness were increased, whereas it was only slightly affected by longer CFRP strips. Moreover, glass fiber reinforced polymer (GFRP) strips of 1500 mm length, 50 mm width and thicknesses of 1.2, 2 and 4 mm were investigated; the load-bearing capacity of I-beams strengthened with GFRP is less than that with CFRP by 8% on average. Statistical analysis was conducted using Minitab®.

Keywords: FRP, strengthened steel I-beams, flexural, FEM, ANSYS

Procedia PDF Downloads 245
2848 Nutrition Strategy Using Traditional Tibetan Medicine in the Preventive Measurement

Authors: Ngawang Tsering

Abstract:

Traditional Tibetan medicine focuses primarily on promoting health and keeping diseases away through its unique practice of prescribing specific diets and lifestyles. The prevalence of chronic diseases has been rising day by day, and they kill many people in modern times, partly for lack of proper nutritional design. According to traditional Tibetan medicine, chronic diseases such as diabetes, cancer, cardiovascular diseases, respiratory diseases, and arthritis are heavily associated with an unwholesome diet and inappropriate lifestyle. Diet and lifestyle are the two main conditions of both disease and healthy life. The prevalence of chronic diseases is a challenge with massive economic impact and expensive health issues; yet it has a solution in preventive measurement through proper nutrition design based on traditional Tibetan medicine. To date, it is hard to evaluate whether a traditional Tibetan medicine nutrition strategy could play a major role in preventive measurement because of the lack of current research evidence. However, compared with modern nutrition, it has an exclusively valuable concept: a holistic approach in which diet and nutrition recommendations are based on several aspects. Traditional Tibetan medicine, one of the oldest existing ancient medical systems, known as Sowa Rigpa (the Science of Healing), highlights different aspects of dietetics and nutrition, namely geography, season, age, personality, emotion, food combination, individual metabolism, and the potency and amount of food. This article offers a critical perspective on preventive measurement against chronic diseases through nutrition design using traditional Tibetan medicine, and calls attention to the need for a deeper understanding of traditional Tibetan medicine in the modern world.

Keywords: traditional Tibetan medicine, nutrition, chronic diseases, preventive measurement, holistic approach, integrative

Procedia PDF Downloads 128
2847 A Wireless Sensor System for Continuous Monitoring of Particulate Air Pollution

Authors: A. Yawootti, P. Intra, P. Sardyoung, P. Phoosomma, R. Puttipattanasak, S. Leeragreephol, N. Tippayawong

Abstract:

The aim of this work is to design, develop and test a low-cost particulate air pollution sensor system for continuous monitoring of outdoor and indoor particulate air pollution at a lower cost than existing instruments. In this study, a technique measuring the electrostatic charge of particles through a high-efficiency particulate air filter was employed. The developed detector consists of a PM10 impactor, a particle charger, a Faraday cup electrometer, a flow meter and controller, a vacuum pump, a DC high-voltage power supply and a data processing and control unit. The developed detector was capable of measuring particulate mass concentrations ranging from 0 to 500 µg/m³, corresponding to number concentrations ranging from 10⁶ to 10¹² particles/m³, with a measurement time of less than 1 s. The measured data are relayed to the internet through a GSM connection to a public cellular network. In this development, the apparatus is powered by a 12 V, 7 Ah internal battery for continuous measurement of about 20 hours. Finally, the developed apparatus was found to be in close agreement with an imported standard instrument, and to be portable and beneficial for air pollution and particulate matter measurements.
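
The Faraday cup electrometer reading maps to a number concentration through the textbook relation N = I / (n_c · e · Q), where n_c is the mean charge per particle and Q the sampling flow. A minimal sketch with assumed values (not the paper's calibration):

```python
# Convert an electrometer current into a particle number concentration.
E = 1.602e-19       # C, elementary charge
I = 2.0e-12         # A, measured electrometer current (assumed)
n_charges = 10      # mean elementary charges per particle (assumed)
Q = 5.0 / 60_000    # m^3/s, sampling flow of 5 L/min (assumed)

N = I / (n_charges * E * Q)   # particles per m^3
print(f"number concentration = {N:.2e} particles/m^3")  # ~1.5e10, inside 1e6-1e12
```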

Keywords: particulate, air pollution, wireless communication, sensor

Procedia PDF Downloads 339
2846 Air-Coupled Ultrasonic Testing for Non-Destructive Evaluation of Various Aerospace Composite Materials by Laser Vibrometry

Authors: J. Vyas, R. Kazys, J. Sestoke

Abstract:

Air-coupled ultrasonics is a contactless ultrasonic measurement approach that has become widespread for material characterization in the aerospace industry, where the lightest possible weight is essential without compromising durability. To achieve these requirements, composite materials are widely used. This paper presents an analysis of air-coupled ultrasonics for composite materials such as CFRP (Carbon Fibre Reinforced Polymer), GLARE (Glass Fiber Metal Laminate) and honeycombs used in the design of modern aircraft. Laser vibrometry could be a key characterization tool for aerospace components. The fundamentals of air-coupled ultrasonics, including the principles, working modes and transducer arrangements used for this purpose, are also recounted in brief. The emphasis of this paper is on NDT techniques developed around ultrasonic guided-wave applications and the possibilities of using laser vibrometry on different materials for non-contact measurement of guided waves. A 3D assessment technique is described that employs a single-point laser head with automatic scanning relocation of the material to measure mechanical displacement, along with the pros and cons of the composite materials for aerospace applications with defects and delaminations.

Keywords: air-coupled ultrasonics, contactless measurement, laser interferometry, NDT, ultrasonic guided waves

Procedia PDF Downloads 212
2845 Exploring Time-Series Phosphoproteomic Datasets in the Context of Network Models

Authors: Sandeep Kaur, Jenny Vuong, Marcel Julliard, Sean O'Donoghue

Abstract:

Time-series data are useful for modelling as they enable model evaluation. However, when reconstructing models from phosphoproteomic data, non-exact methods are often utilised, as knowledge regarding the network structure, such as which kinases and phosphatases lead to the observed phosphorylation state, is incomplete. Thus, such reactions are often hypothesised, which gives rise to uncertainty. Here, we propose a framework, implemented via a web-based tool (as an extension to Minardo), which, given time-series phosphoproteomic datasets, can generate κ (kappa) models. The incompleteness and uncertainty in the generated model and reactions are clearly presented to the user visually. Furthermore, we demonstrate, via a toy EGF signalling model, the use of algorithmic verification to verify κ models. Manually formulated requirements were evaluated against the model, leading to the highlighting of the nodes causing unsatisfiability (i.e., error-causing nodes). We aim to integrate such methods into our web-based tool and demonstrate how the identified erroneous nodes can be presented to the user visually. Thus, in this research we present a framework to enable a user to explore phosphoproteomic time-series data in the context of models. The observer can visualise which reactions in the model are highly uncertain and which nodes cause incorrect simulation outputs. Such a tool enables an end-user to determine which empirical analysis to perform to reduce uncertainty in the presented model, thus enabling a better understanding of the underlying system.

Keywords: κ-models, model verification, time-series phosphoproteomic datasets, uncertainty and error visualisation

Procedia PDF Downloads 227
2844 A Study of Adaptive Fault Detection Method for GNSS Applications

Authors: Je Young Lee, Hee Sung Kim, Kwang Ho Choi, Joonhoo Lim, Sebum Chun, Hyung Keun Lee

Abstract:

The purpose of this study is to develop an efficient fault detection method for Global Navigation Satellite System (GNSS) applications based on adaptive estimation. Because they depend on radio-frequency signals, GNSS measurements are dominated by systematic errors arising in the receiver's operating environment. Thus, to utilize GNSS for aerospace or ground vehicles requiring a high level of safety, unhealthy measurements should be considered seriously. For this reason, this paper proposes an adaptive fault detection method to deal with unhealthy measurements in various harsh environments. In the proposed method, the test statistic for fault detection is generated from the estimated measurement noise. Pseudorange and carrier-phase measurement noise are obtained at the time propagations and measurement updates of Carrier-Smoothed Code (CSC) filtering, respectively. The performance of the proposed method was evaluated with field-collected GNSS measurements, to which intentional faults were added to assess the fault detection capability. The experimental results show that the proposed detection method is efficient in detecting unhealthy measurements and improves the accuracy of GNSS positioning under fault occurrence.
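
Residual-based fault detection of this kind typically compares a noise-normalized test statistic against a chi-square threshold. A minimal sketch (the residuals and covariance below are assumed; in the paper the noise estimate comes from CSC filtering):

```python
# Chi-square test on the normalized residual: flag a fault when the
# statistic exceeds the threshold set by the false-alarm probability.
import numpy as np
from scipy.stats import chi2

def fault_detected(residual, S, p_false_alarm=1e-3):
    """True if the normalized residual exceeds the chi-square threshold."""
    stat = residual @ np.linalg.solve(S, residual)
    threshold = chi2.ppf(1.0 - p_false_alarm, df=len(residual))
    return stat > threshold

r = np.array([0.8, -0.4, 2.9])          # pseudorange residuals, m (assumed)
S = np.diag([0.5**2, 0.5**2, 0.5**2])   # estimated noise covariance (assumed)
print(fault_detected(r, S))             # True: the 2.9 m residual stands out
```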

Keywords: adaptive estimation, fault detection, GNSS, residual

Procedia PDF Downloads 544
2843 Raman Spectral Fingerprints of Healthy and Cancerous Human Colorectal Tissues

Authors: Maria Karnachoriti, Ellas Spyratou, Dimitrios Lykidis, Maria Lambropoulou, Yiannis S. Raptis, Ioannis Seimenis, Efstathios P. Efstathopoulos, Athanassios G. Kontos

Abstract:

Colorectal cancer is the third most common cancer diagnosed in Europe, according to the latest incidence data provided by the World Health Organization (WHO), and early diagnosis has proved to be key in reducing cancer-related mortality. In cases where surgical intervention is required for cancer treatment, accurate discrimination between healthy and cancerous tissues is critical for the postoperative care of the patient. The current study focuses on the ex vivo handling of surgically excised colorectal specimens and the acquisition of their spectral fingerprints using Raman spectroscopy. Acquired data were analyzed in an effort to discriminate, on a microscopic scale, between healthy and malignant margins. Raman spectroscopy is a spectroscopic technique with high detection sensitivity and a spatial resolution of a few micrometers. The spectral fingerprint produced during laser-tissue interaction is unique and characterizes the biostructure and its inflammatory or cancer state. Numerous published studies have demonstrated the potential of the technique as a tool for discrimination between healthy and malignant tissues/cells, either ex vivo or in vivo. However, the handling of excised human specimens and the Raman measurement conditions remain challenging, unavoidably affecting measurement reliability and repeatability, as well as the technique's overall accuracy and sensitivity. Therefore, tissue handling has to be optimized and standardized to ensure preservation of cell integrity and hydration level. Various strategies have been implemented in the past, including the use of balanced salt solutions, small humidifiers or pump-reservoir-pipette systems. In the current study, human colorectal specimens of 10×5 mm were collected from 5 patients to date who underwent open surgery for colorectal cancer. A novel, non-toxic zinc-based fixative (Z7) was used for tissue preservation; Z7 demonstrates excellent protein preservation and protection against tissue autolysis. Micro-Raman spectra were recorded with a Renishaw inVia spectrometer from successive random 2 µm spots upon excitation at 785 nm, chosen to decrease the fluorescence background and avoid tissue photodegradation. A temperature-controlled approach was adopted to stabilize the tissue at 2 °C, thus minimizing dehydration effects and consequent focus drift during measurement. A broad spectral range, 500-3200 cm⁻¹, was covered with five consecutive full scans lasting 20 minutes in total, and the averaged spectra were used for least-squares fitting analysis of the Raman modes. Subtle Raman differences were observed between normal and cancerous colorectal tissues, mainly in the intensities of the 1556 cm⁻¹ and 1628 cm⁻¹ Raman modes, which correspond to ν(C=C) vibrations in porphyrins, as well as in the range of 2800-3000 cm⁻¹ due to CH2 stretching of lipids and CH3 stretching of proteins. The Raman spectra evaluation was supported by histological findings from twin specimens. This study demonstrates that Raman spectroscopy may constitute a promising tool for real-time verification of clear margins in colorectal cancer open surgery.
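
The least-squares fitting of an individual Raman mode can be sketched as a Lorentzian fit (synthetic data centred on the 1556 cm⁻¹ band discussed above; peak shape and parameters are illustrative, not the study's fit):

```python
# Fit a Lorentzian profile to a noisy synthetic Raman band and report the
# fitted centre and full width at half maximum.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amp, x0, gamma, offset):
    return amp * gamma**2 / ((x - x0) ** 2 + gamma**2) + offset

shift = np.linspace(1500, 1620, 241)                  # Raman shift, cm^-1
spec = lorentzian(shift, 1.0, 1556.0, 8.0, 0.1)       # synthetic band
spec += np.random.default_rng(0).normal(0, 0.02, shift.size)

popt, _ = curve_fit(lorentzian, shift, spec, p0=(1.0, 1550.0, 10.0, 0.0))
print(f"fitted centre: {popt[1]:.1f} cm^-1, FWHM: {2 * popt[2]:.1f} cm^-1")
```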

Keywords: colorectal cancer, Raman spectroscopy, malignant margins, spectral fingerprints

Procedia PDF Downloads 68
2842 Breath Ethanol Imaging System Using Real Time Biochemical Luminescence for Evaluation of Alcohol Metabolic Capacity

Authors: Xin Wang, Munkbayar Munkhjargal, Kumiko Miyajima, Takahiro Arakawa, Kohji Mitsubayashi

Abstract:

The measurement of gaseous ethanol plays an important role in the evaluation of alcohol metabolic capacity in clinical and forensic analysis. A 2-dimensional visualization system for gaseous ethanol was constructed and tested for the visualization of breath and transdermal alcohol. We demonstrated breath ethanol measurement using the developed high-sensitivity visualization system. The concentration of breath ethanol calculated from the imaging signal was significantly different between ALDH2(+) and ALDH2(-) volunteer subjects.

Keywords: breath ethanol, ethanol imaging, biochemical luminescence, alcohol metabolism

Procedia PDF Downloads 323
2841 Using a GIS-Based Method for Green Infrastructure Accessibility of Different Socio-Economic Groups in Auckland, New Zealand

Authors: Jing Ma, Xindong An

Abstract:

Green infrastructure, an important contributor to quality of life, has been a crucial element in liveability measurement. With a growing urban population demanding a more liveable environment, access to green infrastructure within walking distance should be taken into consideration. This article presents a study on the accessibility of green infrastructure in central Auckland (New Zealand), using a GIS-based network analysis tool to verify accessibility levels. It analyses the overall situation of green infrastructure and draws conclusions on the city's different levels of accessibility according to the categories and distribution of facilities, providing valuable references and guidance for future facility improvement in planning strategies.
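
The walking-distance accessibility test reduces to a shortest-path query on a street network. A minimal sketch on a toy graph follows (networkx stands in for the GIS network analysis tool; node names, edge lengths, and the 800 m threshold are assumptions):

```python
# Find which green-infrastructure nodes are reachable within a walking
# budget along network edges weighted by length in metres.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("home", "a", 300), ("a", "park", 400),     # park: 700 m walk
    ("home", "b", 500), ("b", "plaza", 600),    # plaza: 1100 m walk
], weight="length")

reachable = nx.single_source_dijkstra_path_length(
    G, "home", cutoff=800, weight="length")
green = {"park", "plaza"}
print("green infrastructure within 800 m walk:", green & reachable.keys())
```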

Keywords: quality of life, green infrastructure, GIS, accessibility

Procedia PDF Downloads 245
2840 A Vehicle Detection and Speed Measurement Algorithm Based on Magnetic Sensors

Authors: Panagiotis Gkekas, Christos Sougles, Dionysios Kehagias, Dimitrios Tzovaras

Abstract:

Cooperative intelligent transport systems (C-ITS) can greatly improve safety and efficiency in road transport by enabling communication not only between vehicles themselves but also between vehicles and infrastructure. For that reason, traffic surveillance systems on the road are of great importance. This paper focuses on the development of an on-road unit comprising several magnetic sensors for real-time vehicle detection, movement-direction determination, and speed measurement. Magnetic sensors sense and measure changes in the earth's magnetic field, and vehicles are composed of many parts with ferromagnetic properties. Depending on the sensors' sensitivity, changes in the earth's magnetic field caused by passing vehicles can be detected and analyzed in order to extract information on the properties of moving vehicles. In this paper, we present a prototype algorithm for real-time, high-accuracy vehicle detection and speed measurement, which can be implemented as a portable, low-cost solution that is non-invasive to existing infrastructure, with the potential to replace existing high-cost implementations. The paper describes the algorithm and presents results from its preliminary lab testing in a close-to-real-conditions environment. Acknowledgments: Work presented in this paper was co-financed by the European Regional Development Fund of the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship, and Innovation (call RESEARCH-CREATE-INNOVATE) under contract no. Τ1EDK-03081 (project ODOS2020).
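
Speed estimation from two longitudinally spaced magnetic sensors reduces to finding the time lag between their signatures, for example by cross-correlation. A minimal sketch with synthetic pulses (sampling rate, sensor spacing, and pulse shape are assumed, not the unit's actual parameters):

```python
# Cross-correlate the two sensor signals; the lag of the correlation peak
# gives the travel time between sensors, hence the vehicle speed.
import numpy as np

fs = 1000.0          # Hz, sampling rate (assumed)
spacing = 2.0        # m, distance between the two sensors (assumed)
t = np.arange(0, 2, 1 / fs)
sensor1 = np.exp(-((t - 0.50) / 0.02) ** 2)   # vehicle signature at sensor 1
sensor2 = np.exp(-((t - 0.62) / 0.02) ** 2)   # same signature, 0.12 s later

lag = np.argmax(np.correlate(sensor2, sensor1, mode="full")) - (len(t) - 1)
speed = spacing / (lag / fs)                  # m/s
print(f"estimated speed: {speed * 3.6:.1f} km/h")   # ~60 km/h
```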

Keywords: magnetic sensors, vehicle detection, speed measurement, traffic surveillance system

Procedia PDF Downloads 95
2839 Photovoltaic Cells Characteristics Measurement Systems

Authors: Rekioua T., Rekioua D., Aissou S., Ouhabi A.

Abstract:

The power provided by a photovoltaic array varies with solar radiation and temperature, since these parameters influence the electrical characteristic (Ipv-Vpv) of solar cells. In scientific research, there are different methods to obtain these characteristics; in this paper, we present three. The first is a simulation method using Matlab/Simulink; the second is the standard experimental voltage method; and the third uses LabVIEW software. The latter is based on an electronic circuit for testing PV modules. All details of this electronic scheme are presented, and the results obtained by the three methods are compared under different meteorological conditions. The proposed method is simple and very efficient for testing and measuring the electrical characteristic curves of photovoltaic panels.
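
The Ipv-Vpv characteristic that all three methods aim to capture can be sketched with a simplified single-diode model (series and shunt resistances neglected; all parameter values are illustrative assumptions, not the tested module's):

```python
# Trace a simplified I-V curve: I = Isc - I0 * (exp(V / (n*Vt*Ns)) - 1),
# then locate the open-circuit voltage and maximum power point.
import numpy as np

ISC = 8.2                   # A, short-circuit current (assumed)
I0 = 1e-7                   # A, diode saturation current (assumed)
N_VT = 1.2 * 0.0259 * 60    # ideality x thermal voltage x cells in series

v = np.linspace(0.0, 36.0, 400)
i = ISC - I0 * (np.exp(v / N_VT) - 1.0)
p = v * i
valid = i > 0
print(f"Voc ~ {v[np.argmin(np.abs(i))]:.1f} V, Pmax ~ {p[valid].max():.0f} W")
```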

Keywords: photovoltaic cells, measurement standards, temperature sensors, data acquisition

Procedia PDF Downloads 428
2838 Students' Errors in Translating Algebra Word Problems to Mathematical Structure

Authors: Ledeza Jordan Babiano

Abstract:

Translating statements into mathematical notation is one of the processes in word problem-solving. However, based on the literature, students still have difficulties with this skill. The purpose of this study was to investigate the translation errors students make when they translate algebraic word problems into mathematical structures, and to locate the errors via the lens of the Translation-Verification Model. This qualitative research study employed content analysis. During the data-gathering process, the students were asked to answer a six-item algebra word problem questionnaire, and their answers were analyzed by experts through blind coding using the Translation-Verification Model to determine their translation errors. After this, a focus group discussion was conducted, and the data gathered were analyzed through thematic analysis to determine the causes of the students' translation errors. It was found that the students' most prevalent translation error was the interpretation error, situated in the Attribute construct. The themes that emerged during the FGD were: (1) the procedure of translation is strategically incorrect; (2) lack of comprehension; (3) difficulty with related algebra concepts; (4) lack of spatial skills; (5) unpreparedness for independent learning; and (6) developmentally inappropriate problem content. These themes boiled down to the major concept of independent-learning preparedness in solving mathematical problems, which has subcomponents including contextual and conceptual factors in translation. Consequently, the results provide implications for instructors and professors of Mathematics to innovate their teaching pedagogies and strategies to address translation gaps among students.

Keywords: mathematical structure, algebra word problems, translation, errors

Procedia PDF Downloads 22
2837 Study of Error Analysis and Sources of Uncertainty in the Measurement of Residual Stresses by the X-Ray Diffraction

Authors: E. T. Carvalho Filho, J. T. N. Medeiros, L. G. Martinez

Abstract:

Residual stresses are self-equilibrating stresses that act on the microstructure of a rigid body without the application of an external load. They are elastic stresses and can be induced by mechanical, thermal and chemical processes causing a deformation gradient in the crystal lattice, favoring premature failure in mechanical components. The search for measurements with good reliability has been of great importance for the manufacturing industries. Several methods are able to quantify these stresses according to physical principles and the mechanical behavior of the material. The X-ray diffraction technique is one of the most sensitive techniques for detecting small variations of the crystal lattice, since the X-ray beam interacts with the interplanar distance. Being very sensitive, the technique is also susceptible to variations in measurements, requiring a study of the factors that influence the final result. Instrumental and operational factors, form deviations of the samples and the analysis geometry are some of the variables that need to be considered and analyzed in order to obtain the true measurement. The aim of this work is to analyze the sources of error inherent to the residual stress measurement process by the X-ray diffraction technique, making an interlaboratory comparison to verify the reproducibility of the measurements. In this work, two specimens were machined, differing from each other in surface finish: grinding and polishing. Additionally, iron powder with a particle size of less than 45 µm was selected as a reference (as recommended by the ASTM E915 standard) for the tests. To verify the deviations caused by the equipment, the specimens were kept in position and, under the same analysis conditions, seven measurements were carried out at 11 ψ tilts. To verify sample positioning errors, seven measurements were performed, repositioning the sample before each measurement. To check geometry errors, the measurements were repeated for both the Bragg-Brentano and parallel-beam geometries. To verify the reproducibility of the method, the measurements were performed in two different laboratories on different equipment. The results were statistically analyzed and the errors quantified.
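
XRD residual-stress analysis of this kind commonly uses the sin²ψ method: the strain measured at several ψ tilts is regressed on sin²ψ, and the slope yields the stress. A minimal sketch with synthetic data (generic steel elastic constants; the intercept term of the full relation is omitted for brevity):

```python
# Recover stress from the slope of d/d0 vs sin^2(psi):
# slope = (1 + nu) / E * sigma  =>  sigma = slope * E / (1 + nu).
import numpy as np

E, NU, D0 = 210e3, 0.30, 1.1702          # MPa, -, unstressed d-spacing (assumed)
psi = np.deg2rad(np.linspace(0, 45, 11)) # the 11 psi tilts
sigma_true = 250.0                       # MPa, assumed stress
d = D0 * (1 + (1 + NU) / E * sigma_true * np.sin(psi) ** 2)  # synthetic spacings

slope = np.polyfit(np.sin(psi) ** 2, d / D0, 1)[0]
sigma = slope * E / (1 + NU)
print(f"recovered stress: {sigma:.0f} MPa")
```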

Keywords: residual stress, x-ray diffraction, repeatability, reproducibility, error analysis

Procedia PDF Downloads 139
2836 Tele-Monitoring and Logging of Patient Health Parameters Using Zigbee

Authors: Kirubasankar, Sanjeevkumar, Aravindh Nagappan

Abstract:

This paper addresses a system for monitoring patients using biomedical sensors and displaying the data at a remote location. The main shortcomings of present health monitoring devices are the lack of remote monitoring and of logging for future evaluation; typical instruments used for health parameter measurement provide only basic information regarding health status. This paper identifies a set of design principles to address these challenges. The system includes continuous measurement of health parameters such as heart rate, electrocardiogram, SpO2 level and body temperature. The accumulated sensor data are relayed to a processing device using a ZigBee transceiver and viewed through cloud services.

Keywords: bio-medical sensors, monitoring, logging, cloud service

Procedia PDF Downloads 486
2835 Multi-Focus Image Fusion Using SFM and Wavelet Packet

Authors: Somkait Udomhunsakul

Abstract:

In this paper, a multi-focus image fusion method using Spatial Frequency Measurements (SFM) and the wavelet packet transform is proposed. In the proposed fusion approach, the two source images are first transformed and decomposed into sixteen subbands using the wavelet packet transform. Next, each subband is partitioned into sub-blocks, and the clearer region of each block is identified using the Spatial Frequency Measurement (SFM). Finally, the fused image is reconstructed by performing the inverse wavelet transform. The experimental results show that the proposed method outperforms traditional SFM-based methods in terms of both objective and subjective assessments.
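
The block-selection criterion can be sketched directly: the spatial frequency SF = sqrt(RF² + CF²) is computed from row and column first differences, and the block with the higher SF is taken as the clearer one (illustrative blocks below; the full method applies this per wavelet-packet subband):

```python
# Compare the spatial frequency of two candidate blocks and keep the one
# with more local variation, i.e. the sharper (better-focused) block.
import numpy as np

def spatial_frequency(block):
    """SF = hypot(row frequency, column frequency) from first differences."""
    rf = np.sqrt(np.mean(np.diff(block, axis=1) ** 2))
    cf = np.sqrt(np.mean(np.diff(block, axis=0) ** 2))
    return np.hypot(rf, cf)

rng = np.random.default_rng(0)
sharp = rng.normal(size=(8, 8))            # high local variation
blurred = np.full((8, 8), sharp.mean())    # flat, defocused-like block
chosen = "sharp" if spatial_frequency(sharp) > spatial_frequency(blurred) else "blurred"
print("clearer block:", chosen)
```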

Keywords: multi-focus image fusion, wavelet packet, spatial frequency measurement

Procedia PDF Downloads 450
2834 Impact of Board Characteristics on Financial Performance: A Study of Manufacturing Sector of Pakistan

Authors: Saad Bin Nasir

Abstract:

This research examines the role of corporate governance (CG) practices in a firm's financial performance. The population of the research is the manufacturing sector of Pakistan. Corporate governance practices such as board size, board independence and CEO/chairman duality are taken as independent variables, while return on assets and return on equity are taken as dependent variables measuring the firm's performance. A panel data regression model is used to estimate the impact of CG on firm performance.

Keywords: corporate governance, board size, board independence, leadership

Procedia PDF Downloads 489
2833 In silico Analysis of a Causative Mutation in Cadherin-23 Gene Identified in an Omani Family with Hearing Loss

Authors: Mohammed N. Al Kindi, Mazin Al Khabouri, Khalsa Al Lamki, Tommasso Pappuci, Giovani Romeo, Nadia Al Wardy

Abstract:

Hereditary hearing loss is a heterogeneous group of complex disorders, with an overall incidence of one in every five hundred newborns, presenting in syndromic and non-syndromic forms. Cadherin-related 23 (CDH23) is one of the listed deafness-causative genes. CDH23 is expressed in the stereocilia of hair cells and in retinal photoreceptor cells. Defective CDH23 has been associated mostly with prelingual severe-to-profound sensorineural hearing loss (SNHL), in either syndromic (USH1D) or non-syndromic (DFNB12) form. An Omani family diagnosed clinically with severe-to-profound sensorineural hearing loss was genetically analysed by whole exome sequencing. A novel homozygous missense variant, c.A7451C (p.D2484A), in exon 53 of CDH23 was detected. One hundred and thirty control samples were analysed, all of which were negative for the detected variant. The variant was analysed in silico for pathogenicity verification using several mutation prediction software tools, proved to be a pathogenic mutation, and is reported for the first time in Oman and worldwide. It is concluded that in silico mutation prediction analysis might be used as a useful molecular diagnostic tool benefiting both genetic counseling and mutation verification. The aspartic acid to alanine substitution at position 2484 might be the main disease-causing mutation that damages CDH23 function, and it could be used as a genetic hearing loss marker for this particular Omani family.

Keywords: CDH23, p.D2484A, in silico, Oman

Procedia PDF Downloads 187
2832 Towards a Measurement-Based E-Government Portals Maturity Model

Authors: Abdoullah Fath-Allah, Laila Cheikhi, Rafa E. Al-Qutaish, Ali Idri

Abstract:

The emerging e-government concept transforms the way citizens deal with their governments: citizens can execute the intended services online, anytime and anywhere. This results in great benefits for both governments (a reduced number of officers) and citizens (more flexibility and time saving). Therefore, building a maturity model to assess e-government portals becomes desirable to help in the improvement process of such portals. This paper proposes an e-government maturity model based on measuring the presence of best practices. The main benefit of such a maturity model is that it provides a way to rank an e-government portal based on the best practices used, together with a set of recommendations for advancing to the higher stages of the maturity model.

Keywords: best practices, e-government portal, maturity model, quality model

Procedia PDF Downloads 303