Search results for: elliptic curve cryptography
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1197


837 Effect of Microstructure and Texture of Magnesium Alloy Due to Addition of Pb

Authors: Yebeen Ji, Jimin Yun, Kwonhoo Kim

Abstract:

Magnesium alloys have been limited in industrial applications due to their limited slip systems and high plastic anisotropy. It is known that specific textures form during processing (rolling, etc.), and these textures cause poor formability. To solve these problems, many researchers have studied controlling texture by adding rare-earth elements; however, their high cost limits their use, and alternatives are therefore needed. Although Pb addition does not directly improve magnesium properties, it is known to suppress the diffusion of other alloying elements and reduce grain boundary energy. These characteristics are similar to those of rare-earth additions, and similar texture behavior is therefore expected, but research on this is insufficient. This study therefore investigates the development of texture and microstructure after adding Pb to magnesium. AZ61 alloy and Mg-15wt%Pb alloy were compared and analyzed to determine the effect of adding solute elements. The alloys were hot rolled and annealed to form a single phase and an initial texture. Specimens were then set to contract and elongate parallel to the rolling plane and the rolling direction and subjected to high-temperature plane strain compression at 723 K and 0.05/s. Microstructural analysis and texture measurements were performed by SEM-EBSD. The peak stress in the true stress-strain curve after compression was higher in AZ61, but the shape of the flow curve was similar for both alloys. Continuous dynamic recrystallization was confirmed to occur in both alloys during compression. The basal texture developed parallel to the compressed surface, and the pole density was lower in the Mg-15wt%Pb alloy. This change in behavior arises because, when Pb is added, the orientation distribution of the recrystallized grains is more random than that of the parent grains.

Keywords: Mg, texture, Pb, DRX

Procedia PDF Downloads 49
836 Active Part of the Burnishing Tool Effect on the Physico-Geometric Aspect of the Superficial Layer of 100C6 and 16NC6 Steels

Authors: Tarek Litim, Ouahiba Taamallah

Abstract:

Burnishing is a mechanical surface treatment that combines several beneficial effects on the two steel grades studied. Applying burnishing with a ball or a tip yields better roughness than turning. In addition, it consolidates the surface layers through work hardening. The optimal effects are closely tied to the treatment parameters and to the active part of the device. With a 78% improvement in roughness, burnishing can be regarded as a finishing operation in the machining range. With a 44% gain in consolidation rate, the treatment is an effective process for material consolidation. These effects depend on several factors; V, f, P, r, and i have the most significant effects on both roughness and hardness. Ball or tip burnishing consolidates the surface layers of both 100C6 and 16NC6 steels by work hardening. For each steel grade and mechanical treatment, the rational tensile curve was drawn, and Ludwik's law was used to fit the work hardening curve; a material hardening law was established for both grades. For 100C6 steel, the results show a work hardening coefficient of 0.513 and a consolidation rate of 44% relative to the surface layers processed by turning. For 16NC6 steel, the work hardening coefficient is about 0.29. Hardness tests characterize the burnished depth well; the layer affected by work hardening can reach 0.4 mm. Simulation of the tests is of great importance for resolving details at the local scale of the material. Conventional tensile curves provide a satisfactory indication of the toughness of the 100C6 and 16NC6 materials, and simulation of the tensile curves revealed good agreement between experimental and simulated results for both steels.

Keywords: 100C6 steel, 16NC6 steel, burnishing, work hardening, roughness, hardness

Procedia PDF Downloads 168
835 Transcriptome Analysis Reveals Role of Long Non-Coding RNA NEAT1 in Dengue Patients

Authors: Abhaydeep Pandey, Shweta Shukla, Saptamita Goswami, Bhaswati Bandyopadhyay, Vishnampettai Ramachandran, Sudhanshu Vrati, Arup Banerjee

Abstract:

Background: Long non-coding RNAs (lncRNAs) are important regulators of gene expression and play important roles in viral replication and disease progression. The role of lncRNA genes in the pathogenesis of Dengue virus infection is currently unknown. Methods: To gain additional insight, we used unbiased RNA sequencing followed by in silico analysis to identify differentially expressed lncRNAs and genes associated with dengue disease progression. We focused on the lncRNA NEAT1 (Nuclear Paraspeckle Assembly Transcript 1), as it was found to be differentially expressed in PBMC of dengue-infected patients. Results: Compared to dengue infection (DI), the expression of NEAT1 was significantly down-regulated as patients developed complications. Moreover, pairwise analysis of follow-up patients confirmed that suppression of NEAT1 expression was associated with a rapid fall in platelet count in dengue-infected patients. Severe dengue patients (DS) (n=18; platelet count < 20K), when recovered from infection, showed high NEAT1 expression, as observed in healthy donors. By co-expression network analysis and subsequent validation, we found that expression of the coding gene IFI27 was significantly up-regulated in severe dengue cases and negatively correlated with NEAT1 expression. To discriminate DI from severe dengue, a receiver operating characteristic (ROC) curve was calculated. It revealed a sensitivity and specificity of 100% (95% CI: 85.69-97.22) and an area under the curve (AUC) of 0.97 for NEAT1. Conclusions: Altogether, our observations demonstrate that monitoring NEAT1 and IFI27 expression in dengue patients could be useful in understanding dengue virus-induced disease progression; both genes may be involved in pathophysiological processes.
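The AUC reported above summarizes how well NEAT1 levels separate the two patient groups. As a minimal sketch of the underlying statistic (with made-up expression values, not the study's data), the AUC equals the rank-based probability that a randomly chosen severe-dengue sample scores lower than a randomly chosen non-severe sample:

```python
def roc_auc(non_severe, severe):
    """Rank-based AUC (equivalent to the Mann-Whitney U statistic):
    the probability that a randomly chosen severe-dengue sample has a
    lower NEAT1 level than a randomly chosen non-severe sample."""
    wins = 0.0
    for h in non_severe:
        for s in severe:
            if s < h:          # severe sample lower, as expected for NEAT1
                wins += 1.0
            elif s == h:       # ties count half
                wins += 0.5
    return wins / (len(non_severe) * len(severe))

# Illustrative (made-up) relative expression values:
di = [5.1, 4.8, 6.0, 5.5]   # dengue infection, higher NEAT1
ds = [1.2, 0.9, 2.0, 1.5]   # severe dengue, suppressed NEAT1
print(roc_auc(di, ds))      # 1.0 for perfectly separated groups
```

Perfect separation gives an AUC of 1.0; complete overlap gives 0.5, the chance level.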

Keywords: dengue, lncRNA, NEAT1, transcriptome

Procedia PDF Downloads 310
834 Mathematical Model to Quantify the Phenomenon of Democracy

Authors: Mechlouch Ridha Fethi

Abstract:

This paper presents a recent mathematical model in political science concerning democracy. The model is represented by a logarithmic equation linking the Relative Index of Democracy (RID) to the Participation Ratio (PR). First, the meanings of the model's parameters are presented, and the variation curve of RID as a function of PR, with its critical regions, is discussed. Second, the model is applied to a virtual group, where we show that it can be applied by gender. Third, it is observed that the model can be extended to different models of democracy and may be of use to international organizations such as the UN for assessing the state of democracy.
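The abstract does not reproduce the equation itself, so the following is a purely illustrative sketch of one plausible logarithmic form linking RID to PR; the coefficients and the functional details are placeholders, not the paper's fitted model:

```python
import math

def relative_index_of_democracy(pr, a=1.0, b=0.0):
    """Hypothetical logarithmic link RID = a*ln(PR) + b, with PR the
    Participation Ratio in (0, 1]. Coefficients a and b are
    illustrative placeholders only."""
    if not 0.0 < pr <= 1.0:
        raise ValueError("PR must lie in (0, 1]")
    return a * math.log(pr) + b

# Under this form RID rises monotonically with participation:
print(relative_index_of_democracy(0.5) < relative_index_of_democracy(0.9))  # True
```

Any logarithmic link of this shape is steep at low participation and flattens near full participation, which is the kind of critical-region behavior the abstract alludes to.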

Keywords: democracy, mathematics, modeling, quantification

Procedia PDF Downloads 368
833 [Keynote] Implementation of Quality Control Procedures in Radiotherapy CT Simulator

Authors: B. Petrović, L. Rutonjski, M. Baucal, M. Teodorović, O. Čudić, B. Basarić

Abstract:

Purpose/Objective: Radiotherapy treatment planning requires a CT simulator in order to acquire CT images. The overall performance of the CT simulator determines the quality of the radiotherapy treatment plan and, ultimately, the outcome of treatment for every single patient. It is therefore strongly advised by international recommendations to set up quality control procedures for every machine involved in the radiotherapy treatment planning process, including the CT scanner/simulator. The overall process requires a number of tests, performed on a daily, weekly, monthly, or yearly basis, depending on the feature tested. Materials/Methods: Two phantoms were used: a dedicated CIRS 062QA phantom and the QA phantom supplied with the CT simulator. The examined CT simulator was a Siemens Somatom Definition AS Open, dedicated to radiation therapy treatment planning. The CT simulator has built-in software that enables fast and simple evaluation of CT QA parameters using the phantom provided with the simulator. In addition, the recommendations contain further tests, which were done with the CIRS phantom, and legislation on ionizing radiation protection requires CT testing at defined intervals. Taking into account the requirements of the law, the built-in tests of the CT simulator, and international recommendations, the institutional QC programme for the CT simulator was defined and implemented. Results: The CT simulator parameters evaluated in the study were: CT number accuracy, field uniformity, the complete CT-to-ED conversion curve, spatial and contrast resolution, image noise, slice thickness, and patient table stability. The following limits were established and implemented: CT number accuracy within +/- 5 HU of the value at commissioning; field uniformity within +/- 10 HU in selected ROIs; the complete CT-to-ED curve for each tube voltage must comply with the curve obtained at commissioning, with deviations of no more than 5%; spatial and contrast resolution tests must comply with the tests obtained at commissioning, otherwise the machine requires service; the image noise test result must fall within 20% of the base value; slice thickness must meet manufacturer specifications; and patient table stability under longitudinal transfer of a loaded table must not exceed 2 mm of vertical deviation. Conclusion: The implemented QA tests gave an overall basic understanding of the CT simulator's functionality and its clinical effectiveness in radiation treatment planning. The legal requirement on the clinic is to set up its own QA programme with minimum testing, but it remains the user's decision whether additional testing recommended by international organizations will be implemented to improve the overall quality of the radiation treatment planning procedure, since the quality of the CT images used for planning influences the delineation of the tumor, the calculation accuracy of the treatment planning system, and finally the delivery of radiation treatment to the patient.
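The tolerances listed above are simple pass/fail comparisons against commissioning baselines. As an illustrative sketch (the baseline and measured values below are invented, and the table of limits simply restates the numbers quoted in the abstract), such a check could look like:

```python
# Tolerances as quoted in the abstract; keys are illustrative names.
TOLERANCES = {
    "ct_number_hu": 5.0,    # within +/- 5 HU of commissioning value
    "uniformity_hu": 10.0,  # within +/- 10 HU across selected ROIs
    "ct_to_ed_pct": 5.0,    # <= 5% deviation from commissioning curve
    "noise_pct": 20.0,      # <= 20% difference from base value
    "table_sag_mm": 2.0,    # <= 2 mm vertical deviation under load
}

def qc_pass(test, baseline, measured):
    """Return True when a measurement is within the stated tolerance.
    Tests ending in _pct are relative limits; the rest are absolute."""
    tol = TOLERANCES[test]
    if test.endswith("_pct"):
        return abs(measured - baseline) / abs(baseline) * 100.0 <= tol
    return abs(measured - baseline) <= tol

print(qc_pass("ct_number_hu", 0.0, 3.2))  # True: 3.2 HU is within +/- 5 HU
print(qc_pass("noise_pct", 10.0, 13.0))   # False: 30% exceeds the 20% limit
```

A daily QC script would loop such checks over the measured phantom values and flag any failure for service.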

Keywords: CT simulator, radiotherapy, quality control, QA programme

Procedia PDF Downloads 532
832 Riemannian Geometries of Visual Space

Authors: Jacek Turski

Abstract:

The visual space geometries are constructed in the framework of Riemannian geometry from simulated iso-disparity conics in the horizontal visual plane of a binocular system with asymmetric eyes (AEs). For eyes fixating at the abathic distance, which depends on the AE parameters, the iso-disparity conics are frontal straight lines in physical space. For all other fixations, the iso-disparity conics consist of families of ellipses or hyperbolas, depending on both the AE parameters and the bifoveal fixation. However, the iso-disparity conic arcs are perceived in the gaze direction as frontal lines and are referred to as visual geodesics. Thus, the geometries of physical and visual spaces differ. A simple postulate that combines the simulated iso-disparity conics with the basic anatomy of the human visual system gives the relative depth for fixation at the abathic distance, which establishes the Riemannian metric tensor. The resulting geodesics are incomplete in the gaze direction and therefore give finite distances to the horizon that depend on the AE parameters. Moreover, the curvature vanishes in this eye posture, so that visual space is flat. For all other fixations, only the sign of the curvature can be inferred from the global behavior of the simulated iso-disparity conics: the curvature is positive for elliptic iso-disparity curves and negative for hyperbolic iso-disparity curves.

Keywords: asymmetric eye model, iso-disparity conics, metric tensor, geodesics, curvature

Procedia PDF Downloads 145
831 Modified RSA in Mobile Communication

Authors: Nagaratna Rajur, J. D. Mallapur, Y. B. Kirankumar

Abstract:

Security in mobile communication differs from that of the internet or telecommunication because of the poor user interface and limited processing capacity of mobile devices, combined with complex network protocols. This poses a challenge: the security system must use little memory and run at low computational cost. Security involves all the activities undertaken to protect the value and ongoing usability of assets and the integrity and continuity of operations. An effective network security strategy requires identifying threats and then choosing the most effective set of tools to combat them. Cryptography is a simple and efficient way to provide security in communication. RSA is an asymmetric key algorithm that is highly reliable and widely used in internet communication. However, it has not been efficiently implemented in mobile communication due to its computational complexity and large memory utilization. The proposed algorithm modifies standard RSA to make it useful in mobile communication by reducing its computational complexity and memory utilization.
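For reference, the following is a textbook sketch of standard (unmodified) RSA with deliberately tiny primes, to make the asymmetric-key idea and its cost concrete; it is not the authors' M-RSA variant, and real deployments use primes of 1024+ bits together with padding schemes such as OAEP:

```python
from math import gcd

def rsa_keygen(p, q, e=65537):
    """Generate a toy RSA key pair from primes p and q."""
    n = p * q
    phi = (p - 1) * (q - 1)
    assert gcd(e, phi) == 1            # e must be invertible mod phi
    d = pow(e, -1, phi)                # private exponent: e*d = 1 (mod phi)
    return (e, n), (d, n)              # (public key, private key)

def rsa_apply(m, key):
    """Encrypt or decrypt by modular exponentiation, m^k mod n."""
    k, n = key
    return pow(m, k, n)

public, private = rsa_keygen(61, 53, e=17)   # toy 12-bit modulus n = 3233
cipher = rsa_apply(42, public)
print(rsa_apply(cipher, private))            # 42
```

The repeated modular exponentiations with very large operands are exactly the computational and memory burden the abstract targets for reduction on mobile devices.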

Keywords: M-RSA, sensor networks, sensor applications, security

Procedia PDF Downloads 342
830 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy

Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu

Abstract:

The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on reliability analysis of onboard sensors to evaluate their performance in terms of location accuracy over time. The analysis utilizes field failure data and employs the Weibull distribution to determine reliability and, in turn, to understand improvements or degradations over time. It begins by scrutinizing the location accuracy error, i.e., the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and identifying failure data with reference to time. A significant challenge in this study is to analyze thoroughly the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is used to determine whether the data exhibit an infant stage or have transitioned into the operational phase; the shape parameter beta plays a crucial role in identifying this stage. Determining the exact start of the operational phase and the end of the infant stage poses another challenge, as residual infant mortality or wear-out must be eliminated from the model, since it can significantly increase the total failure rate. To address this, the well-established statistical Laplace test is applied to infer the behavior of the sensors and to ascertain accurately the duration of the different phases in the lifetime and the time required for stabilization. This approach also helps determine whether the bathtub curve model, which accounts for the different phases in a product's lifetime, is appropriate for the data, and whether the thresholds for the infant period and wear-out phase are accurately estimated, by validating the data in individual phases with Weibull distribution curve-fitting analysis.
Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insight into the reliability of individual sensors with regard to location accuracy over the required period, but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, identifying the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to determine accurately the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insight into reliability during the operational phase and the onset of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor performance with regard to location accuracy, contributing to enhanced accuracy in satellite-based applications.
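The two statistical ingredients named above can be sketched compactly; the failure times below are invented for illustration, and this is only the standard form of each statistic, not the study's fitted model:

```python
import math

def weibull_reliability(t, beta, eta):
    """Two-parameter Weibull reliability R(t) = exp(-(t/eta)**beta).
    The shape parameter beta diagnoses the phase: beta < 1 suggests
    infant mortality, beta ~ 1 a constant failure rate, beta > 1 wear-out."""
    return math.exp(-((t / eta) ** beta))

def laplace_test(failure_times, observation_end):
    """Laplace trend test for failure times observed on [0, T].
    U < 0 indicates decreasing failure intensity (e.g. the end of an
    infant-mortality phase); U > 0 indicates an increasing one (wear-out)."""
    n = len(failure_times)
    T = observation_end
    return (sum(failure_times) / n - T / 2) / (T * math.sqrt(1 / (12 * n)))

# Failures clustered early in a 100-day window give a negative statistic:
early_failures = [2.0, 5.0, 9.0, 14.0, 30.0]
print(laplace_test(early_failures, 100.0) < 0)       # True
print(weibull_reliability(50.0, beta=0.7, eta=200.0))
```

In the methodology above, the Laplace test would first segment the life data into phases, and the Weibull fit would then be applied within each phase.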

Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis

Procedia PDF Downloads 65
829 Analytical Solution of the Boundary Value Problem of Delaminated Doubly-Curved Composite Shells

Authors: András Szekrényes

Abstract:

Delamination is one of the major failure modes in laminated composite structures. Delamination tips are mostly captured by spatial numerical models in order to predict crack growth. This paper presents mechanical models of delaminated composite shells based on shallow shell theories. The mechanical fields are based on a third-order displacement field in terms of the through-thickness coordinate of the laminated shell. The undelaminated and delaminated parts are captured by separate models, and the continuity and boundary conditions are formulated in a general way, yielding a large boundary value problem. The system of differential equations is solved by the state space method for an elliptic delaminated shell with simply supported edges. Comparison of the proposed model with a numerical model indicates that the primary indicator of the model is the deflection, and the secondary indicator is the widthwise distribution of the energy release rate. The model is promising and suitable for determining accurately the J-integral distribution along the delamination front. Based on the proposed model, it is also possible to develop finite elements able to replace the computationally expensive spatial models of delaminated structures.

Keywords: J-integral, Lévy method, third-order shell theory, state space solution

Procedia PDF Downloads 131
828 Bypassing Docker Transport Layer Security Using Remote Code Execution

Authors: Michael J. Hahn

Abstract:

Docker is a powerful tool used by many companies such as PayPal, MetLife, Expedia, Visa, and many others. Docker works by bundling multiple applications, binaries, and libraries together on top of an operating system image called a container. The container runs on a Docker engine that in turn runs on top of a standard operating system. This centralization saves a lot of system resources. In this paper, we will be demonstrating how to bypass Transport Layer Security and execute remote code within Docker containers built on a base image of Alpine Linux version 3.7.0 through the use of .apk files due to flaws in the Alpine Linux package management program. This exploit renders any applications built using Docker with a base image of Alpine Linux vulnerable to unwanted outside forces.

Keywords: cloud, cryptography, Docker, Linux, security

Procedia PDF Downloads 198
827 Urinary Exosome miR-30c-5p as a Biomarker for Early-Stage Clear Cell Renal Cell Carcinoma

Authors: Shangqing Song, Bin Xu, Yajun Cheng, Zhong Wang

Abstract:

miRNAs derived from exosomes in body fluids such as urine are regarded as potential biomarkers for the diagnosis and prognosis of various human cancers, since mature miRNAs are stably preserved by exosomes. However, their potential value in the diagnosis and prognosis of clear cell renal cell carcinoma (ccRCC) remains unclear. In the present study, differentially expressed miRNAs from urinary exosomes were identified by next-generation sequencing (NGS) technology; 16 differentially expressed miRNAs were identified between ccRCC patients and healthy donors. To explore a specific diagnostic biomarker of ccRCC, we validated these urinary exosomes by qRT-PCR in 70 early-stage renal cancer patients, 30 healthy people, and patients with other urinary system cancers, including 30 early-stage prostate cancer patients and 30 early-stage bladder cancer patients. The results showed that urinary exosomal miR-30c-5p could be stably amplified, and its expression did not differ significantly between the other urinary system cancers and healthy controls; however, the expression level of miR-30c-5p in the urinary exosomes of ccRCC patients was lower than in healthy people, and the receiver operating characteristic (ROC) curve showed an area under the curve (AUC) of 0.8192 (95% confidence interval 0.7388-0.8996, P = 0.0000). In addition, up-regulating miR-30c-5p expression could inhibit renal cell carcinoma cell growth. Lastly, HSPA5 was found to be a direct target gene of miR-30c-5p, and HSPA5 depletion reversed the promotion of ccRCC growth caused by a miR-30c-5p inhibitor. In conclusion, this study demonstrated that urinary exosomal miR-30c-5p is readily accessible as a diagnostic biomarker of early-stage ccRCC, and miR-30c-5p might modulate the expression of HSPA5, which correlates with the progression of ccRCC.

Keywords: clear cell renal cell carcinoma, exosome, HSPA5, miR-30c-5p

Procedia PDF Downloads 267
826 Palyno-Morphological Characteristics of Gymnosperm Flora of Pakistan and Its Taxonomic Implications with Light Microscope and Scanning Electron Microscopy Methods

Authors: Raees Khan, Sheikh Z. Ul Abidin, Abdul S. Mumtaz, Jie Liu

Abstract:

The present study assesses the gymnosperm pollen flora of Pakistan using light microscopy (LM) and scanning electron microscopy (SEM) for its taxonomic significance in the identification of gymnosperms. Pollen of 35 gymnosperm species (12 genera and five families) was collected from various distributional sites of gymnosperms in Pakistan. LM and SEM were used to investigate the palyno-morphological characteristics. Five pollen types were observed (inaperturate, monolete, monoporate, vesiculate-bisaccate, and polyplicate). In equatorial view, seven pollen shapes were observed: ten species were sub-angular, nine triangular, six perprolate, three rhomboidal, three semi-angular, two rectangular, and two prolate. Five shapes were observed in polar view: ten species were spheroidal, nine angular, eight interlobate, six circular, and two elliptic. Eighteen species have rugulate and 17 species foveolate ornamentation; eighteen species have verrucate and 17 gemmate sculpturing. The data were analysed through cluster analysis. The study showed that these palyno-morphological features have significant value in the classification and identification of gymnosperms. Based on these features, a taxonomic key was proposed for the accurate and fast identification of gymnosperms in Pakistan.

Keywords: gymnosperms, palynology, Pakistan, taxonomy

Procedia PDF Downloads 221
825 Evaluation of Hepatic Metabolite Changes for Differentiation Between Non-Alcoholic Steatohepatitis and Simple Hepatic Steatosis Using Long Echo-Time Proton Magnetic Resonance Spectroscopy

Authors: Tae-Hoon Kim, Kwon-Ha Yoon, Hong Young Jun, Ki-Jong Kim, Young Hwan Lee, Myeung Su Lee, Keum Ha Choi, Ki Jung Yun, Eun Young Cho, Yong-Yeon Jeong, Chung-Hwan Jun

Abstract:

Purpose: To assess changes in hepatic metabolites for differentiation between non-alcoholic steatohepatitis (NASH) and simple steatosis using proton magnetic resonance spectroscopy (1H-MRS) in both humans and an animal model. Methods: The local institutional review board approved this study, and subjects gave written informed consent. 1H-MRS measurements were performed on a localized voxel of the liver using a point-resolved spectroscopy (PRESS) sequence, and the hepatic metabolites alanine (Ala), lactate/triglyceride (Lac/TG), and TG were analyzed in NASH, simple steatosis, and control groups. Group differences were tested with ANOVA and Tukey's post-hoc tests, and diagnostic accuracy was tested by calculating the area under the receiver operating characteristic (ROC) curve. Associations between metabolite concentrations and pathologic grades or non-alcoholic fatty liver disease (NAFLD) activity scores were assessed by Pearson's correlation. Results: Patients with NASH showed elevated Ala (p < 0.001), Lac/TG (p < 0.001), and TG (p < 0.05) concentrations compared with patients who had simple steatosis and healthy controls. NASH patients had higher levels of Ala (mean±SEM, 52.5±8.3 vs 2.0±0.9; p < 0.001) and Lac/TG (824.0±168.2 vs 394.1±89.8; p < 0.05) than patients with simple steatosis. The area under the ROC curve to distinguish NASH from simple steatosis was 1.00 (95% confidence interval: 1.00, 1.00) with Ala and 0.782 (95% confidence interval: 0.61, 0.96) with Lac/TG. Ala and Lac/TG levels correlated well with steatosis grade, lobular inflammation, and NAFLD activity scores. The metabolic changes observed in humans were reproduced in a mouse model induced by streptozotocin injection and a high-fat diet. Conclusion: 1H-MRS would be useful for differentiating patients with NASH from those with simple hepatic steatosis.

Keywords: non-alcoholic fatty liver disease, non-alcoholic steatohepatitis, 1H MR spectroscopy, hepatic metabolites

Procedia PDF Downloads 325
824 Biases in Numerically Invariant Joint Signatures

Authors: Reza Aghayan

Abstract:

This paper illustrates that numerically invariant joint signatures suffer from biases in the resulting signatures. We classify the arising biases as Bias Type 1 and Bias Type 2 and show how they can be removed.

Keywords: Euclidean and affine geometries, differential invariant signature curves, numerically invariant joint signatures, numerical analysis, numerical bias, curve analysis

Procedia PDF Downloads 597
823 Canada's "Flattened Curve": A Geospatial Temporal Analysis of Canada's Amelioration of the SARS-CoV-2 Pandemic through Coordinated Government Intervention

Authors: John Ahluwalia

Abstract:

As an affluent first-world nation, Canada took swift and comprehensive action during the outbreak of the SARS-CoV-2 (COVID-19) pandemic compared to other countries in the same socio-economic cohort. The United States stumbled over obstacles most developed nations have faced, which has led to significantly more per capita cases and deaths. The initial outbreaks of COVID-19 occurred in the US and Canada within days of each other and posed similarly catastrophic threats to public health, the economy, and governmental stability. On a macro level, events that take place in the US have a direct impact on Canada: the two countries tend to enter and exit economic recessions at approximately the same time, they are each other's largest trading partners, and their currencies are inexorably linked. Why, then, has Canada not shared the fate of the US (and many other nations) that have seen much worse outcomes in the COVID-19 pandemic? Variables intrinsic to Canada's national infrastructure have been instrumental in the country's efforts to flatten the curve of COVID-19 cases and deaths. Canada's coordinated multi-level governmental effort has allowed it to create and enforce policies related to COVID-19 at both the national and provincial levels. Canada's policy of universal healthcare is another variable: healthcare and public health measures are enforced at the provincial level, and it is within each province's jurisdiction to set standards for public safety based on scientific evidence. Rather than introducing confusion and competition for resources such as PPE and vaccines, Canada's multi-level chain of government authority has provided consistent policies supporting national public health and local delivery of medical care. This paper will demonstrate that these coordinated provincial and federal efforts have been the linchpin of Canada's relative success in containing the deadly spread of the COVID-19 virus.

Keywords: COVID-19, Canada, GIS, temporal analysis, ESRI

Procedia PDF Downloads 147
822 Coffee Consumption and Glucose Metabolism: a Systematic Review of Clinical Trials

Authors: Caio E. G. Reis, Jose G. Dórea, Teresa H. M. da Costa

Abstract:

Objective: Epidemiological data show an inverse association between coffee consumption and the risk of type 2 diabetes mellitus. However, the clinical effects of coffee consumption on glucose metabolism biomarkers remain controversial. This paper therefore reviews clinical trials that evaluated the effects of coffee consumption on glucose metabolism. Research Design and Methods: We identified studies published up to December 2014 by searching electronic databases and reference lists. We included randomized clinical trials in which the intervention group received caffeinated and/or decaffeinated coffee, the control group received water or placebo, and biomarkers of glucose metabolism were measured. The Jadad score was applied to evaluate the quality of the studies, and studies scoring ≥ 3 points were considered for the analyses. Results: Seven clinical trials (237 subjects in total) were analyzed, involving healthy, overweight, and diabetic adults. The studies were divided into short-term (1 to 3 h) and long-term (2 to 16 weeks) interventions. The short-term studies showed that caffeinated coffee consumption may increase the area under the curve of the glucose response, while in the long-term studies caffeinated coffee may improve glycemic metabolism by reducing the glucose curve and increasing the insulin response. These results suggest that the benefits of coffee consumption occur in the long term, as reflected in the reduced risk of type 2 diabetes mellitus seen in epidemiological studies. Nevertheless, until the relationship between long-term coffee consumption and type 2 diabetes mellitus is better understood and the mechanisms involved are identified, it is premature to make claims about coffee preventing type 2 diabetes mellitus. Conclusion: The findings suggest that caffeinated coffee may impair glucose metabolism in the short term, but in the long term the studies indicate a reduced risk of type 2 diabetes mellitus. More clinical trials with comparable methodology are needed to unravel this paradox.
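The "area under the curve of the glucose response" used by the short-term trials is conventionally computed with the trapezoidal rule over the sampling times of a tolerance test. A minimal sketch, with invented glucose values rather than data from the reviewed trials:

```python
def trapezoid_auc(times, glucose):
    """Area under the glucose response curve by the trapezoidal rule,
    the standard summary statistic for oral glucose tolerance tests.
    times in minutes, glucose in mmol/L -> AUC in mmol/L * min."""
    auc = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        auc += dt * (glucose[i] + glucose[i - 1]) / 2.0
    return auc

# Illustrative (made-up) responses over 120 min:
t = [0, 30, 60, 90, 120]
water = [5.0, 7.5, 6.5, 5.8, 5.2]    # control
coffee = [5.0, 8.4, 7.3, 6.3, 5.5]   # caffeinated coffee condition
print(trapezoid_auc(t, coffee) > trapezoid_auc(t, water))  # True
```

A larger AUC after the coffee condition is exactly the short-term impairment the review describes.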

Keywords: coffee, diabetes mellitus type 2, glucose, insulin

Procedia PDF Downloads 466
821 Estimation of Fragility Curves Using Proposed Ground Motion Selection and Scaling Procedure

Authors: Esra Zengin, Sinan Akkar

Abstract:

Reliable and accurate prediction of nonlinear structural response requires specification of appropriate earthquake ground motions for use in nonlinear time history analysis. Current research has mainly focused on the selection and manipulation of real earthquake records, which can be seen as the most critical step in performance-based seismic design and assessment of structures. Using amplitude-scaled ground motions that match a target spectrum is a commonly used technique for estimating nonlinear structural response. Representative ground motion ensembles are selected to match a target spectrum such as a scenario-based spectrum derived from ground motion prediction equations, the Uniform Hazard Spectrum (UHS), the Conditional Mean Spectrum (CMS), or the Conditional Spectrum (CS). Different sets of criteria exist among these methodologies for selecting and scaling ground motions, all with the objective of obtaining a robust estimate of structural performance. This study presents a ground motion selection and scaling procedure that considers the spectral variability at the target demand together with the level of ground motion dispersion. The proposed methodology provides a set of ground motions whose response spectra match the target median and corresponding variance within a specified period interval. An efficient and simple algorithm is used to assemble the ground motion sets. The scaling stage is based on minimizing the error between the scaled median and the target spectrum while preserving the dispersion of the earthquake shaking along the period interval. The impact of spectral variability on the nonlinear response distribution is investigated at the level of inelastic single-degree-of-freedom systems. To see the effect of different selection and scaling methodologies on fragility curve estimates, results are compared with those obtained by a CMS-based scaling methodology.
The variability in fragility curves due to the consideration of dispersion in the ground motion selection process is also examined.
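In its simplest form, the scaling stage described above reduces to choosing one amplitude factor that minimizes the misfit between the scaled median spectrum and the target spectrum over the period interval. The sketch below shows one common way to do this, a closed-form least-squares fit in log-spectral space; the function name and the illustrative spectral ordinates are assumptions, not the authors' actual algorithm.

```python
import math

def scale_factor(median_spectrum, target_spectrum):
    # Closed-form least-squares fit in log-spectral space:
    # minimizing sum_T (log s + log Sa_med(T) - log Sa_tgt(T))^2
    # gives log s = mean(log Sa_tgt - log Sa_med).
    logs = [math.log(t) - math.log(m)
            for m, t in zip(median_spectrum, target_spectrum)]
    return math.exp(sum(logs) / len(logs))

# Hypothetical spectral ordinates (in g) over the period interval of interest
median_sa = [0.80, 0.60, 0.40, 0.25]  # median spectrum of the selected record set
target_sa = [1.00, 0.75, 0.50, 0.30]  # target spectrum
s = scale_factor(median_sa, target_sa)
scaled_median = [s * v for v in median_sa]
```

Because a single factor is applied to every record in the set, the record-to-record dispersion about the median is preserved, which is the property the proposed procedure relies on.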

Keywords: ground motion selection, scaling, uncertainty, fragility curve

Procedia PDF Downloads 583
820 Evaluation of the Antibacterial Effects of Turmeric Oleoresin, Capsicum Oleoresin and Garlic Essential Oil against Salmonella enterica Typhimurium

Authors: Jun Hyung Lee, Robin B. Guevarra, Jin Ho Cho, Bo-Ra Kim, Jiwon Shin, Doo Wan Kim, Young Hwa Kim, Minho Song, Hyeun Bum Kim

Abstract:

Salmonella is one of the most important swine pathogens, causing acute or chronic digestive diseases such as enteritis. The acute form of enteritis is common in young pigs of 2-4 months of age. Salmonellosis in swine places a huge economic burden on the swine industry by reducing production. Therefore, swine industries should strive to decrease salmonellosis in pigs in order to reduce economic losses. Thus, we tested three types of natural plant extracts (PEs) to evaluate their antibacterial effects against Salmonella enterica Typhimurium isolated from a piglet with salmonellosis. The three PEs, turmeric oleoresin (containing 79-85% curcumin), capsicum oleoresin (containing 40-40.1% capsaicin), and garlic essential oil (100% natural garlic), were tested using the direct contact agar diffusion test, minimum inhibitory concentration test, growth curve assay, and heat stability test. The tests were conducted with PEs at concentrations of 2.5%, 5%, and 10%. For the heat stability test, PEs at 10% concentration were incubated at 4, 20, 40, 60, 80, and 100 °C for 1 hour; the direct contact agar diffusion test was then applied. 0.5N HCl and 1X PBS were used as the positive and negative controls, respectively. All experiments were performed in duplicate. In the direct contact agar diffusion test, garlic essential oil at 2.5%, 5%, and 10% concentrations showed inhibition zones of 1.5 cm, 2.7 cm, and 2.8 cm in diameter, compared with a 3.5 cm diameter for 0.5N HCl. The minimum inhibitory concentration of garlic essential oil was 2.5%. The growth curve assay showed that garlic essential oil significantly inhibited Salmonella growth after 4 hours, and it retained this inhibitory ability after heat treatment at each temperature. In contrast, turmeric and capsicum oleoresins did not significantly inhibit Salmonella growth in any of the tests.
Although further in-vivo tests will be needed to verify the effects of garlic essential oil for salmonellosis prevention in piglets, our results show that garlic essential oil could be used as a potential natural agent to prevent salmonellosis in swine.

Keywords: garlic essential oil, pig, salmonellosis, Salmonella enterica

Procedia PDF Downloads 173
819 Evaluation of the Antibacterial Effects of Turmeric Oleoresin, Capsicum Oleoresin and Garlic Essential Oil against Shiga Toxin-Producing Escherichia coli

Authors: Jun Hyung Lee, Robin B. Guevarra, Jin Ho Cho, Bo-Ra Kim, Jiwon Shin, Doo Wan Kim, Young Hwa Kim, Minho Song, Hyeun Bum Kim

Abstract:

Colibacillosis is one of the major health problems in young piglets, ultimately resulting in their death, and it is one of the most important economic burdens for the swine industry. Therefore, swine industries need to prevent colibacillosis in piglets in order to reduce economic losses. Thus, we tested three types of natural plant extracts (PEs) to evaluate their antibacterial effects against Shiga toxin-producing Escherichia coli (STEC) isolated from a piglet. The three PEs, turmeric oleoresin (containing 79-85% curcumin), capsicum oleoresin (containing 40-40.1% capsaicin), and garlic essential oil (100% natural garlic), were tested using the direct contact agar diffusion test, minimum inhibitory concentration test, growth curve assay, and heat stability test. The tests were conducted with PEs at concentrations of 2.5%, 5%, and 10%. For the heat stability test, PEs at 10% concentration were incubated at 4, 20, 40, 60, 80, and 100 °C for 1 hour; the direct contact agar diffusion test was then applied. 0.5N HCl and 1X PBS were used as the positive and negative controls, respectively. All experiments were performed in duplicate. In the direct contact agar diffusion test, garlic essential oil at 2.5%, 5%, and 10% concentrations showed inhibition zones of 1.1 cm, 3.0 cm, and 3.6 cm in diameter, compared with a 3.5 cm diameter for 0.5N HCl. The minimum inhibitory concentration of garlic essential oil was 2.5%. The growth curve assay showed that garlic essential oil significantly inhibited STEC growth after 4 hours, and it retained this inhibitory ability after heat treatment at each temperature. In contrast, turmeric and capsicum oleoresins did not significantly inhibit STEC growth in any of the tests.
Although further tests using piglets will be required to evaluate the effects of garlic essential oil for colibacillosis prevention, our results show that garlic essential oil could be used as a potential natural agent to prevent colibacillosis in swine.

Keywords: garlic essential oil, pig, Colibacillosis, Escherichia coli

Procedia PDF Downloads 258
818 Three-Dimensional Numerical Analysis of the Harmfulness of Defects in Oil Pipes

Authors: B. Medjadji, L. Aminallah, B. Serier, M. Benlebna

Abstract:

In this study, the three-dimensional finite element method is used to calculate the J-integral for a semi-elliptical crack in a pipe subjected to internal pressure. The stress-strain curve of the pipe was determined experimentally. The J-integral was calculated at two points on the crack front (Φ = 0 and Φ = π/2). The effect of the crack configuration on the J-integral is analysed. The results show that an external longitudinal crack in a pipe is the most dangerous. They also show that increasing the applied pressure causes a remarkable increase in the J-integral. The effect of the crack depth becomes important when the ratio between the crack depth and the pipe wall thickness (a/t) tends to 1.

Keywords: J-integral, pipeline, crack, FEM

Procedia PDF Downloads 409
817 Robust Image Design Based Steganographic System

Authors: Sadiq J. Abou-Loukh, Hanan M. Habbi

Abstract:

This paper presents a steganographic system that hides the transmitted information without arousing suspicion and illustrates how the level of secrecy can be increased by using cryptographic techniques. The proposed system first encrypts the image file with a one-time pad key and then encrypts the message to be hidden, so that encryption is followed by image embedding. A new image file is then created from the original image using a four-triangles operation, and the new image is processed by one of two image processing techniques: thresholding or differential predictive coding (DPC). Afterwards, encryption and decryption keys are generated by a functional key generator; each generated key is used only once. The encrypted text is hidden in the locations not used for image processing and key generation. The system has a high embedding rate (0.1875 characters/pixel) for true color images (24-bit depth).

Keywords: encryption, thresholding, differential predictive coding, four triangles operation

Procedia PDF Downloads 493
816 Observation of Critical Sliding Velocity

Authors: Visar Baxhuku, Halil Demolli, Alishukri Shkodra

Abstract:

This paper presents the monitoring of vehicle movement, namely the speed developed by vehicles while negotiating a given curve. The basic geometry of the curve was measured in order to calculate the critical sliding velocity. During the research, the speeds developed by passenger vehicles were measured under real road-surface conditions: a dry road with average damage. After establishing these values, an analysis of movement safety in the curve was performed.
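For a flat (unsuperelevated) curve, the critical sliding velocity follows from balancing lateral friction against the centripetal demand. The sketch below uses this standard relation; the radius and friction coefficient are illustrative assumptions, not the paper's measured geometry, and a superelevated curve would require the extended formula.

```python
import math

def critical_sliding_velocity(radius_m, friction_coeff, g=9.81):
    # Flat-curve limit: lateral friction alone supplies the centripetal
    # force, so sliding begins when v exceeds sqrt(mu * g * R).
    return math.sqrt(friction_coeff * g * radius_m)

# Assumed values: 100 m curve radius, mu = 0.7 for a dry surface with
# average damage (illustrative only)
v_crit = critical_sliding_velocity(100.0, 0.7)  # m/s
v_crit_kmh = v_crit * 3.6
```

Comparing measured vehicle speeds in the curve against this limit is one simple way to assess the safety margin the abstract refers to.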

Keywords: critical sliding velocity, moving velocity, curve, passenger vehicles

Procedia PDF Downloads 420
815 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today’s era, data security is an important concern and one of the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble data so that an intruder will not be able to retrieve it, whereas steganography hides the data in some cover file so that the very presence of communication is concealed. This paper presents the implementation of the Rivest-Shamir-Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. Both algorithms were coded in MATLAB, and it is observed that the combined techniques perform better than the individual techniques. The risk of unauthorized access is mitigated to a certain extent by these techniques, which could be used in banks, intelligence agencies (e.g., RAW), and other settings where highly confidential data are transferred. Finally, the two techniques are also compared in tabular form.
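To make the combination concrete, the sketch below hides ciphertext bytes in the least significant bits of image pixel values, the most common embedding step in such encrypt-then-hide pipelines. It is a generic LSB scheme, not the authors' MATLAB implementation; the function names and toy cover data are assumptions.

```python
def embed_lsb(pixels, message):
    # Hide message bytes in the least significant bits of pixel values.
    # pixels: flat list of 0-255 channel values (e.g. 24-bit RGB unrolled).
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit  # overwrite only the LSB
    return stego

def extract_lsb(pixels, n_bytes):
    # Recover n_bytes hidden bytes from the pixel LSBs.
    out = []
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (pixels[b * 8 + i] & 1)
        out.append(byte)
    return bytes(out)

cover = [120, 121, 122, 123] * 16  # toy 64-value cover "image"
secret = b"Hi"                     # in practice, RSA- or DES-encrypted bytes
stego = embed_lsb(cover, secret)
recovered = extract_lsb(stego, len(secret))
```

Because each pixel value changes by at most 1, the stego image is visually indistinguishable from the cover, while the payload (here the ciphertext) remains exactly recoverable.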

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 290
814 Using the Cluster Computing to Improve the Computational Speed of the Modular Exponentiation in RSA Cryptography System

Authors: Te-Jen Chang, Ping-Sheng Huang, Shan-Ten Cheng, Chih-Lin Lin, I-Hui Pan, Tsung-Hsien Lin

Abstract:

The RSA system is a great contribution to encryption and decryption. It is based on modular exponentiation and requires arithmetic on very large numbers, which places a heavy burden on the CPU. To increase computational speed, in addition to improving the underlying algorithms, such as the binary method, the sliding window method, and the addition chain method, a cluster of computers can be used. The cluster system is composed of laboratory computers on which MPICH2 is installed. The parallel procedures for modular exponentiation are obtained by combining the sliding window method with the addition chain method. This significantly reduces the computation time of modular exponentiation for operands of more than 512 bits, and even more than 1024 bits.
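The serial core of the sliding window method named above can be sketched as follows. This is a textbook left-to-right variant, not the authors' parallel MPICH2 implementation; the function name and default window size are assumptions.

```python
def sliding_window_modexp(base, exp, mod, w=4):
    # Left-to-right sliding-window modular exponentiation.
    if mod == 1:
        return 0
    # Precompute odd powers base^1, base^3, ..., base^(2^w - 1) mod mod.
    b2 = (base * base) % mod
    table = {1: base % mod}
    for k in range(3, 1 << w, 2):
        table[k] = (table[k - 2] * b2) % mod

    bits = bin(exp)[2:]
    result = 1
    i = 0
    while i < len(bits):
        if bits[i] == '0':
            result = (result * result) % mod  # a lone zero bit: just square
            i += 1
        else:
            # Take the longest window of at most w bits that ends in a 1,
            # so the window value is always odd and present in the table.
            j = min(i + w, len(bits))
            while bits[j - 1] == '0':
                j -= 1
            window = int(bits[i:j], 2)
            for _ in range(j - i):
                result = (result * result) % mod
            result = (result * table[window]) % mod
            i = j
    return result
```

Each window of up to w bits trades a small table of precomputed odd powers for fewer multiplications than the plain binary method, which is the saving the cluster version then parallelizes further.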

Keywords: cluster system, modular exponentiation, sliding window, addition chain

Procedia PDF Downloads 522
813 Transforming Healthcare Data Privacy: Integrating Blockchain with Zero-Knowledge Proofs and Cryptographic Security

Authors: Kenneth Harper

Abstract:

Blockchain technology presents solutions for managing healthcare data, addressing critical challenges in privacy, integrity, and access. This paper explores how privacy-preserving technologies such as zero-knowledge proofs (ZKPs) and homomorphic encryption (HE) enhance decentralized healthcare platforms by enabling secure computations and patient data protection. We examine the mathematical foundations of these methods, their practical applications, and how they meet the evolving demands of healthcare data security. Using real-world examples, this research highlights industry-leading implementations and offers a roadmap for future applications in secure, decentralized healthcare ecosystems.

Keywords: blockchain, cryptography, data privacy, decentralized data management, differential privacy, healthcare, healthcare data security, homomorphic encryption, privacy-preserving technologies, secure computations, zero-knowledge proofs

Procedia PDF Downloads 18
812 Geospatial Analysis for Predicting Sinkhole Susceptibility in Greene County, Missouri

Authors: Shishay Kidanu, Abdullah Alhaj

Abstract:

Sinkholes in the karst terrain of Greene County, Missouri, pose significant geohazards, imposing challenges on construction and infrastructure development, with potential threats to lives and property. To address these issues, understanding the influencing factors and modeling sinkhole susceptibility are crucial for effective mitigation through strategic changes in land use planning and practices. This study utilizes geographic information system (GIS) software to collect and process diverse data, including topographic, geologic, hydrogeologic, and anthropogenic information. Nine key sinkhole-influencing factors, ranging from slope characteristics to proximity to geological structures, were analyzed. The Frequency Ratio method establishes relationships between the attribute classes of these factors and sinkhole events, deriving class weights that indicate their relative importance. Weighted integration of these factors is accomplished using the Analytic Hierarchy Process (AHP) and the Weighted Linear Combination (WLC) method in a GIS environment, resulting in a comprehensive sinkhole susceptibility index (SSI) model for the study area. Using the Jenks natural breaks classification method, the SSI values are categorized into five distinct sinkhole susceptibility zones: very low, low, moderate, high, and very high. Validation of the model, conducted through the Area Under the Curve (AUC) and Sinkhole Density Index (SDI) methods, demonstrates a robust correlation with the sinkhole inventory data. The prediction rate curve yields an AUC value of 74%, indicating 74% validation accuracy. The SDI result further supports the success of the sinkhole susceptibility model. This model offers reliable predictions for the future distribution of sinkholes, providing valuable insights for planners and engineers in the formulation of development plans and land-use strategies.
Its application extends to enhancing preparedness and minimizing the impact of sinkhole-related geohazards on both infrastructure and the community.
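For one raster cell, the weighted integration step described above combines the frequency-ratio weight of each factor's class with the factor's AHP weight. The sketch below shows this Weighted Linear Combination for three of the nine factors; all factor names, class labels, and weights are illustrative assumptions, not values from the study.

```python
def wlc_susceptibility(cell, class_weights, factor_weights):
    # SSI for one raster cell as a weighted linear combination:
    # SSI = sum over factors of AHP_weight * frequency_ratio(class at cell).
    return sum(factor_weights[f] * class_weights[f][cell[f]]
               for f in factor_weights)

# Illustrative (made-up) AHP weights for three of the nine factors
factor_weights = {"slope": 0.5, "geology": 0.3, "depth_to_bedrock": 0.2}
# Illustrative (made-up) frequency-ratio weights per attribute class
class_weights = {
    "slope": {"0-2%": 1.8, "2-8%": 1.0, ">8%": 0.4},
    "geology": {"limestone": 2.1, "shale": 0.5},
    "depth_to_bedrock": {"<10m": 1.6, ">=10m": 0.7},
}
cell = {"slope": "0-2%", "geology": "limestone", "depth_to_bedrock": "<10m"}
ssi = wlc_susceptibility(cell, class_weights, factor_weights)
```

Applying this to every cell yields the continuous SSI raster, which is then split into the five susceptibility zones with the Jenks classifier.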

Keywords: sinkhole, GIS, analytical hierarchy process, frequency ratio, susceptibility, Missouri

Procedia PDF Downloads 74
811 Plastic Behavior of Steel Frames Using Different Concentric Bracing Configurations

Authors: Madan Chandra Maurya, A. R. Dar

Abstract:

Among all natural calamities, earthquakes are the most devastating: even if the losses due to all other calamities were added together, they would still be far less than the losses due to earthquakes. We must therefore be ready to face such situations, which is only possible if we make our structures earthquake resistant. A review of structural damage to braced frame systems after several major earthquakes, including recent ones, has identified both anticipated and unanticipated damage, prompting many engineers and researchers around the world to consider new approaches to improve the behavior of braced frame systems. Extensive experimental studies over the last forty years on conventional buckling brace components and several braced frame specimens are briefly reviewed, highlighting that the number of studies on full-scale concentric braced frames is still limited. This study therefore centers on the plastic behavior of steel braced frame systems. Two different analytical approaches are used to predict the behavior and strength of an unbraced frame. The first is incremental elasto-plastic analysis, a plastic approach that gives the complete load-deflection history of the structure until collapse; it is based on the plastic hinge concept for fully plastic cross sections in a structure under increasing proportional loading. The incremental elasto-plastic hinge-by-hinge method is used here because of its simplicity in providing the complete load-deformation history of a two-storey unbraced scaled model. Experiments were then conducted on two-storey scaled building models with and without bracing to obtain the experimental load-deformation curves.
This study, titled Plastic Behavior of Steel Frames Using Different Concentric Bracing Configurations, aims to improve the already practiced traditional systems and to check the behavior and usefulness of a new configuration with respect to an X-braced reference model, i.e., how its plastic behavior differs from the X-braced system. Laboratory tests involved determining the plastic behavior of these models (with and without bracing) in terms of load-deformation curves. Thus, the aim is to improve the lateral displacement resistance capacity by using a new concentric brace configuration that differs from the conventional concentric brace. The experimental and manual results (from the plastic approach) were compared with each other and with a nonlinear static (pushover) analysis performed in ETABS, to see how closely the earlier results follow the pushover curve and up to what limit. The test results show that all three approaches behave similarly up to the yield point and confirm the applicability of elasto-plastic (hinge-by-hinge) analysis for predicting plastic behavior. Finally, the outcomes of the three approaches show that the new configuration chosen for this study behaves between the plane frame (without bracing, the reference frame) and the conventional X-braced frame.

Keywords: elasto-plastic analysis, concentric steel braced frame, pushover analysis, ETABS

Procedia PDF Downloads 229
810 The Reliability Analysis of Concrete Chimneys Due to Random Vortex Shedding

Authors: Saba Rahman, Arvind K. Jain, S. D. Bharti, T. K. Datta

Abstract:

Chimneys are generally tall and slender structures with circular cross-sections, which makes them highly prone to wind forces. Wind exerts pressure on the wall of a chimney, producing unwanted forces, and vortex-induced oscillation is one such excitation that can lead to failure. Vortex-induced oscillation of chimneys is therefore of great concern to researchers and practitioners, since many failures of chimneys due to vortex shedding have occurred in the past. As a consequence, extensive research has taken place on the subject over decades. Many laboratory experiments have been performed to verify the theoretical models proposed to predict vortex-induced forces, including aero-elastic effects, but comparatively few prototype measurements have been recorded to verify them. For this reason, theoretical models developed from laboratory data are used to analyze chimneys for vortex-induced forces, which calls for a reliability analysis of the predicted responses due to the vortex shedding phenomenon. Although a considerable literature exists on the vortex-induced oscillation of chimneys, including code provisions, the reliability analysis of chimneys against failure caused by vortex shedding is scanty. In the present study, a reliability analysis of chimneys against vortex shedding failure is presented, assuming the uncertainty in the vortex shedding phenomenon to be significantly greater than the other uncertainties, which are therefore ignored. The vortex shedding is modeled as a stationary random process represented by a power spectral density function (PSDF). It is assumed that the vortex shedding forces are perfectly correlated and act over the top one-third height of the chimney. The PSDF of the tip displacement of the chimney is obtained by performing a frequency-domain spectral analysis using a matrix approach.
For this purpose, both the chimney and the random wind forces are discretized at a number of points along the height of the chimney. The method of analysis duly accounts for aero-elastic effects. The double-barrier threshold crossing level, as proposed by Vanmarcke, is used to determine the probability of crossing different threshold levels of the tip displacement of the chimney. Assuming the annual distribution of the mean wind velocity to be a Gumbel type-I distribution, the fragility curve denoting the variation of the annual probability of threshold crossing against different threshold levels of the tip displacement is determined, and the reliability estimate is derived from it. A 210 m tall concrete chimney with a base diameter of 35 m, a top diameter of 21 m, and a wall thickness of 0.3 m is taken as an illustrative example. The terrain condition is assumed to correspond to a city center. The expression for the PSDF of the vortex shedding force is taken from Vickery and Basu. The results of the study show that the threshold crossing reliability of the tip displacement is significantly influenced by the assumed structural damping and the Gumbel distribution parameters. Further, for small structural damping, the aero-elastic effect greatly influences the reliability estimate.
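The wind-hazard side of the fragility computation can be sketched with the Gumbel type-I distribution mentioned above. The parameters and critical wind speed below are hypothetical, not from the paper, and a full fragility curve would additionally require the conditional threshold-crossing probability obtained from the spectral analysis.

```python
import math

def gumbel_cdf(v, mu, beta):
    # Gumbel type-I CDF of the annual maximum mean wind speed
    return math.exp(-math.exp(-(v - mu) / beta))

def annual_exceedance(v_crit, mu, beta):
    # Annual probability that the mean wind speed exceeds v_crit
    return 1.0 - gumbel_cdf(v_crit, mu, beta)

# Hypothetical location and scale parameters (m/s), for illustration only
mu, beta = 22.0, 3.5
p_exceed = annual_exceedance(30.0, mu, beta)
```

Integrating such exceedance probabilities against the conditional crossing probability at each wind speed, threshold by threshold, traces out the fragility curve from which the reliability estimate is read off.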

Keywords: chimney, fragility curve, reliability analysis, vortex-induced vibration

Procedia PDF Downloads 160
809 Polymer Mediated Interaction between Grafted Nanosheets

Authors: Supriya Gupta, Paresh Chokshi

Abstract:

Polymer-particle interactions can be effectively utilized to produce composites with physicochemical properties superior to those of the neat polymer. Incorporating fillers with dimensions comparable to the polymer chain size produces composites with extraordinary properties owing to their very high surface-to-volume ratio. Dispersion of the nanoparticles is achieved by inducing steric repulsion, realized by grafting the particles with polymeric chains. A comprehensive understanding of the interparticle interaction between these functionalized nanoparticles plays an important role in the synthesis of a stable polymer nanocomposite. Focusing on the incorporation of clay sheets in a polymer matrix, we theoretically construct the polymer-mediated interparticle potential for two nanosheets grafted with polymeric chains. Self-consistent field theory (SCFT) is employed to obtain the inhomogeneous composition field at equilibrium. Unlike continuum models, SCFT is built from a microscopic description, taking into account the molecular interactions contributed by both intra- and inter-chain potentials. We present SCFT calculations of the interaction potential curve for two grafted nanosheets immersed in a matrix of polymeric chains of chemistry dissimilar to that of the grafted chains. The interaction potential is repulsive at short separations and shows depletion attraction at moderate separations, induced by high grafting density. The strength of the attraction well can be tuned by altering the compatibility between the grafted and the mobile chains. Further, we construct the interaction potential between two nanosheets grafted with diblock copolymers, with one of the blocks chemically identical to the free polymeric chains.
The interplay between the enthalpic interaction of the dissimilar species and the entropy of the free chains gives rise to rich behavior in the interaction potential curves obtained for the two cases of free chains chemically similar to either the grafted block or the free block of the grafted diblock chains.

Keywords: clay nanosheets, polymer brush, polymer nanocomposites, self-consistent field theory

Procedia PDF Downloads 252
808 Proposals of Exposure Limits for Infrasound From Wind Turbines

Authors: M. Pawlaczyk-Łuszczyńska, T. Wszołek, A. Dudarewicz, P. Małecki, M. Kłaczyński, A. Bortkiewicz

Abstract:

Human tolerance to infrasound is defined by the hearing threshold: infrasound that cannot be heard (or felt) is not annoying and is not thought to have any other adverse health effects. Recent research has largely confirmed earlier findings. ISO 7196:1995 recommends the G-weighting characteristic for the assessment of infrasound, and there is a strong correlation between G-weighted SPL and annoyance perception. The aim of this study was to propose exposure limits for infrasound from wind turbines. To date, only a few countries have set limits for infrasound; these limits are usually no higher than 85-92 dBG, and none of them are specific to wind turbines. Over the years, a number of studies have been carried out to determine hearing thresholds below 20 Hz. It has been recognized that 10% of young people are able to perceive 10 Hz at around 90 dB, and that the difference in median hearing thresholds between young adults aged around 20 years and older adults aged over 60 years is around 10 dB, irrespective of frequency. This shows that people up to about 60 years of age retain good hearing in the low frequency range, while their sensitivity to higher frequencies is often significantly reduced. In terms of exposure limits, the average hearing threshold corresponds to a tone with a G-weighted SPL of about 96 dBG, whereas infrasound at Lp,G levels below 85-90 dBG is usually inaudible. The individual hearing threshold can therefore be 10-15 dB lower than the average threshold, so the recommended limits for environmental infrasound could be 75 dBG or 80 dBG. It is worth noting that the G86 curve has been taken as the threshold of auditory perception of infrasound reached by 90-95% of the population, so the G75 and G80 curves can be taken as criterion curves for wind turbine infrasound.
Finally, two assessment methods and corresponding exposure limit values are proposed for wind turbine infrasound: Method I, based on G-weighted sound pressure level measurements, and Method II, based on frequency analysis in 1/3-octave bands in the 4-20 Hz range. Separate limit values are set for outdoor living areas in the open countryside (Area A) and for noise-sensitive areas (Area B). For Method I, infrasound limit values of 80 dBG (Area A) and 75 dBG (Area B) are proposed, while for Method II, the criterion curves G80 and G75 are chosen (for Areas A and B, respectively).
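Method I amounts to an energetic summation of 1/3-octave band levels after applying the G-weighting corrections. A minimal sketch follows; the band levels and G-weights below are illustrative placeholders (the actual corrections are tabulated in ISO 7196:1995), and the 75 dBG comparison mirrors the proposed Area B limit.

```python
import math

def g_weighted_spl(band_levels, g_weights):
    # Energetically sum 1/3-octave band SPLs after applying G-weighting:
    # L_G = 10 * log10( sum_i 10^((L_i + G_i) / 10) )
    total = sum(10 ** ((L + G) / 10.0)
                for L, G in zip(band_levels, g_weights))
    return 10.0 * math.log10(total)

# Illustrative band levels (dB) for the 4-20 Hz 1/3-octave bands, and
# placeholder G-weights -- NOT the ISO 7196 table values
levels = [70.0, 72.0, 75.0, 74.0, 71.0, 68.0, 60.0]
weights = [-20.0, -12.0, -4.0, 0.0, 4.0, 8.0, 9.0]
lg = g_weighted_spl(levels, weights)
exceeds_limit_area_b = lg > 75.0  # compare against the proposed Method I limit
```

Method II would instead compare each unweighted band level against the corresponding ordinate of the chosen criterion curve (G75 or G80) band by band.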

Keywords: infrasound, exposure limit, hearing thresholds, wind turbines

Procedia PDF Downloads 83