Search results for: Advanced Encryption Standard (AES)
3667 A Computational Analysis of Flow and Acoustics around a Car Wing Mirror
Authors: Aidan J. Bowes, Reaz Hasan
Abstract:
The automotive industry is continually aiming to improve the aerodynamics of car body design. This may be for a variety of beneficial reasons, such as increasing speed or fuel efficiency by reducing drag. Recently, however, there has been greater focus on the wind noise produced while driving. Designers in this industry seek a combination of both simplicity of approach and overall effectiveness. This, combined with the growing availability of commercial CFD (Computational Fluid Dynamics) packages, is likely to lead to an increase in the use of RANS (Reynolds Averaged Navier-Stokes) based CFD methods, as these methods are often simpler than other CFD approaches and place a lower demand on time and computing power. In this investigation, the effectiveness of turbulent flow and acoustic noise prediction using RANS-based methods has been assessed for different wing mirror geometries. Three RANS-based models were used: standard k-ε, realizable k-ε, and k-ω SST. The merits and limitations of these methods are then discussed by comparing with both experimental and numerical results found in the literature. In general, flow prediction is fairly comparable to more complex LES (Large Eddy Simulation) based methods, in particular for the k-ω SST model. However, acoustic noise prediction still leaves room for improvement using RANS-based methods.
Keywords: acoustics, aerodynamics, RANS models, turbulent flow
Procedia PDF Downloads 451
3666 Evaluation of Orthodontic Patients’ Dental Visits and Problems During the COVID-19 Pandemic in Sari Dental School in 2021
Authors: Mobina Bagherianlemraski, Parastoo Namdar
Abstract:
Background: The ongoing coronavirus disease has affected most countries. This virus has high transmission power. Due to the closure of most dental clinics, millions of orthodontic patients missed their appointments during the COVID-19 pandemic. Methods: A questionnaire was developed and sent to patients receiving orthodontic treatment at a public or private clinic. Results: A total of 200 responses were analyzed, comprising 153 women (76.5%) and 47 men (23.5%). The mean and standard deviation of their age was 18.92±7.23 years, with an age range of 8 to 40 years. One hundred eighty-nine patients (94.5%) had fixed appliances, and 11 patients (5.5%) had removable appliances. Of all participants, 35% (70) missed their appointments. The most and least common reasons for missing appointments were concern about the spread of COVID-19, with 28 cases (40%), and the closure of the clinic, with 15 cases (21.4%). Of the 53 patients who contacted their orthodontists, 86.8% (46) communicated via office phone and 5.7% (3) through social media. Conclusion: This study determined that the coronavirus pandemic and quarantine have had an important impact on orthodontic treatments. The greatest concern of orthodontic patients was an increase in treatment duration. Patients who used fixed appliances reported missing dental appointments more than others. Therefore, during the COVID-19 pandemic, orthodontists should prepare patients to solve problems linked to orthodontic appliances when possible.
Keywords: orthodontic patients, coronavirus pandemic, appointments, COVID-19
Procedia PDF Downloads 144
3665 Grassland Phenology in Different Eco-Geographic Regions over the Tibetan Plateau
Authors: Jiahua Zhang, Qing Chang, Fengmei Yao
Abstract:
Studying the response of vegetation phenology to climate change at different temporal and spatial scales is important for understanding and predicting future terrestrial ecosystem dynamics and the adaptation of ecosystems to global change. In this study, the Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) dataset and climate data were used to analyze the dynamics of grassland phenology as well as their correlation with climatic factors in different eco-geographic regions and elevation units across the Tibetan Plateau. The results showed that during 2003–2012, the start of the grassland greening season (SOS) appeared later, while the end of the growing season (EOS) appeared earlier, following the plateau’s precipitation and heat gradients from southeast to northwest. The multi-year mean value of SOS differed between eco-geographic regions and was significantly affected by average elevation and regional average precipitation during spring. Regional mean differences in EOS were mainly regulated by mean temperature during autumn. Trends in SOS in the central and eastern eco-geographic regions were coupled to the mean temperature during spring, advancing by about 7 d/°C. However, in the two southwestern eco-geographic regions, SOS was delayed significantly due to the impact of spring precipitation. The results also showed that SOS occurred later with increasing elevation, as expected, with a delay rate of 0.66 d/100 m. For 2003–2012, SOS showed an advancing trend in low-elevation areas but a delayed trend in high-elevation areas, while EOS was delayed in low-elevation areas but advanced in high-elevation areas. Grassland SOS and EOS changes may be influenced by a variety of other environmental factors in each eco-geographic region.
Keywords: grassland, phenology, MODIS, eco-geographic regions, elevation, climatic factors, Tibetan Plateau
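The elevation effect on SOS reported in this abstract (a delay of about 0.66 d per 100 m) is the kind of rate obtained as an ordinary least-squares slope of green-up date against elevation. A minimal sketch of that computation, using hypothetical elevation/SOS values constructed to match the reported rate (not the study's actual data):

```python
def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical elevations (m) and SOS dates (day of year), built to embody
# a 0.66-day delay per 100 m of elevation gain
elev = [3000, 3500, 4000, 4500, 5000]
sos = [120.0, 123.3, 126.6, 129.9, 133.2]
delay_per_100m = slope(elev, sos) * 100  # ≈ 0.66 days per 100 m
```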
Procedia PDF Downloads 323
3664 Show Products or Show Endorsers: Immersive Visual Experience in Fashion Advertisements on Instagram
Authors: H. Haryati, A. Nor Azura
Abstract:
Over the turn of the century, the advertising landscape has evolved significantly from print media to digital media. In line with the shift toward the advanced science and technology of the Fifth Industrial Revolution (IR5.0), which is dramatically reshaping societies, technological endeavors have increased exponentially, influencing user interaction through online advertising that intentionally leads to buying behavior. Users are more accustomed to interactive content that responds to their actions; thus, the immersive experience has become a new form of engagement for centennials. The purpose of this paper is to investigate pleasure and arousal as the fundamental elements of consumer emotions and affective responses to marketing stimuli. A quasi-experiment procedure will be adopted in the research, involving 40 undergraduate students in Nilai, Malaysia. This study employed a 2 (celebrity endorser vs. social media influencer) x 2 (high vs. low visual complexity) factorial between-subjects design. Participants will be exposed to a printed version depicting a fashion product endorsed by a celebrity and by social media influencers, presented at high and low levels of visual complexity. A questionnaire distributed during the lab-test session will be used to capture honest, real feedback and responses to the latest Instagram design and engagement. The research therefore aims to define the immersive experience on Instagram and the interaction between pleasure and arousal. An advertisement that evokes pleasure and arousal is likely to get more attention from the target audience. This is one of the few studies comparing endorsers in Instagram advertising. This research also extends existing knowledge about immersive visual complexity in the context of social media advertising.
Keywords: immersive visual experience, Instagram, pleasure, arousal
Procedia PDF Downloads 188
3663 Recent Developments in the Application of Deep Learning to Stock Market Prediction
Authors: Shraddha Jain Sharma, Ratnalata Gupta
Abstract:
Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult, since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with its prominent features, and the significant problems or issue domains that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to predict correctly based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high-accuracy prediction, with a focus on the most important characteristics.
Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume
Procedia PDF Downloads 95
3662 Etymological Studies and Their Role in Consolidating the Identity of the Cultural Heritage: Terminology Related to Traditional Dagger Making in the Sultanate of Oman as a Model
Authors: Muhammed Muvaffak Alhasan, Ali Alriyami, Ali Almanei
Abstract:
Despite the extreme importance of etymological studies in documenting the linguistic heritage and showing its roots and connections in the classical language, etymological dictionaries are still rare in the Arab library in general. Etymology is the science that investigates how vocabulary is produced and reproduced by exploring the origin of words and the phonetic and semantic changes that have occurred in them over time, trying to reconstruct an identity card for each word showing its origin and the path it took through time until it reached its current state. This research seeks to make an etymological study of the terminology used in traditional dagger making in the Sultanate of Oman through the following steps: 1. Collecting the terms relating to traditional dagger making and recording them in order to document and preserve them. 2. Arranging them alphabetically in order to facilitate searching and dealing with them. 3. Setting up a historical identification card for each word by applying an etymological study that shows the source from which it descended, its links with the standard language, and the phonological and semantic changes it underwent until it reached its current form.
Keywords: cultural heritage, etymology, Omani dagger, Oman
Procedia PDF Downloads 84
3661 The Role of Cyfra 21-1 in Diagnosing Non-Small Cell Lung Cancer (NSCLC)
Authors: H. J. T. Kevin Mozes, Dyah Purnamasari
Abstract:
Background: Lung cancer is the fourth most common cancer in Indonesia. 85% of all lung cancer cases are non-small cell lung cancer (NSCLC). The indistinct signs and symptoms of NSCLC sometimes lead to misdiagnosis. The gold standard for the diagnosis of NSCLC is histopathological biopsy, which is invasive. Cyfra 21-1 is a tumor marker, which can be found in the intermediate filament protein structure of the epithelium. The accuracy of Cyfra 21-1 in diagnosing NSCLC is not yet well established, so this report was made to address that question. Methods: Literature searching was done using online databases; ProQuest and PubMed are the databases used in this report. Literature selection was then done by applying inclusion and exclusion criteria. The selected literature was appraised using the criteria of validity, importance, and applicability. Results: Of the six journals appraised, five are valid. The sensitivity reported across the five studies ranges from 50–84.5%, while the specificity ranges from 87.8–94.4%. The positive likelihood ratio across the appraised literature ranges from 5.09–10.54, which is categorized as intermediate-high. Conclusion: Serum Cyfra 21-1 is a sensitive and very specific tumor marker for the diagnosis of non-small cell lung cancer (NSCLC).
Keywords: cyfra 21-1, diagnosis, non-small cell lung cancer, NSCLC, tumor marker
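The likelihood ratios quoted in this abstract (5.09–10.54) follow from sensitivity and specificity as LR+ = sensitivity / (1 − specificity). A minimal sketch of the calculation; the example inputs are illustrative values within the reported ranges, not figures from a specific appraised study:

```python
def positive_lr(sensitivity, specificity):
    """Positive likelihood ratio: how much a positive test result
    raises the odds of disease. Inputs are proportions in [0, 1]."""
    return sensitivity / (1.0 - specificity)

# Illustrative values inside the ranges reported in the abstract
# (sensitivity 50-84.5%, specificity 87.8-94.4%)
lr_low = positive_lr(0.50, 0.90)    # a low-sensitivity scenario
lr_high = positive_lr(0.845, 0.92)  # a high-sensitivity scenario
```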
Procedia PDF Downloads 232
3660 Remote Biomonitoring of Mothers and Newborns for Temperature Surveillance Using a Smart Wearable Sensor: Techno-Feasibility Study and Clinical Trial in Southern India
Authors: Prem K. Mony, Bharadwaj Amrutur, Prashanth Thankachan, Swarnarekha Bhat, Suman Rao, Maryann Washington, Annamma Thomas, N. Sheela, Hiteshwar Rao, Sumi Antony
Abstract:
The disease burden among mothers and newborns is caused mostly by a handful of avoidable conditions occurring around the time of childbirth and within the first month following delivery. Real-time monitoring of vital parameters of mothers and neonates offers a potential opportunity to impact access as well as the quality of care in vulnerable populations. We describe the design, development, and testing of an innovative wearable device for remote biomonitoring (RBM) of body temperatures in mothers and neonates in a hospital in southern India. The architecture consists of: [1] a low-cost, wearable sensor tag; [2] a gateway device for a ‘real-time’ communication link; [3] piggy-backing on a commercial GSM communication network; and [4] an algorithm-based data analytics system. Requirements for the device were: long battery life of up to 28 days (with a sampling frequency of 5/hr); robustness; IP 68 hermetic sealing; and human-centric design. We undertook pre-clinical laboratory testing followed by clinical trial phases I & IIa for evaluation of safety and efficacy in the following sequence: seven healthy adult volunteers; 18 healthy mothers; and three sets of babies – 3 healthy babies; 10 stable babies in the Neonatal Intensive Care Unit (NICU); and 1 baby with hypoxic ischaemic encephalopathy (HIE). The pebble-design sensor, about the thickness of 3 coins and weighing about 8 g, was secured onto the abdomen for the baby and over the upper arm for adults. In the laboratory setting, the response time of the sensor device to attain thermal equilibrium with the surroundings was 4 minutes, vis-a-vis 3 minutes observed with a precision-grade digital thermometer used as a reference standard. The accuracy was ±0.1°C of the reference standard within the temperature range of 25-40°C. The adult volunteers, aged 20 to 45 years, contributed a total of 345 hours of readings over a 7-day period, and the postnatal mothers provided a total of 403 paired readings.
The mean skin temperatures measured in the adults by the sensor were about 2°C lower than the axillary temperature readings (sensor = 34.1 vs digital = 36.1); this difference was statistically significant (t-test = 13.8; p < 0.001). The healthy neonates provided a total of 39 paired readings; the mean difference in temperature was 0.13°C (sensor = 36.9 vs digital = 36.7; p = 0.2). The neonates in the NICU provided a total of 130 paired readings. Their mean skin temperature measured by the sensor was 0.6°C lower than that measured by the radiant warmer probe (sensor = 35.9 vs warmer probe = 36.5; p < 0.001). The neonate with HIE provided a total of 25 paired readings, with the mean sensor reading being no different from the radiant warmer probe reading (sensor = 33.5 vs warmer probe = 33.5; p = 0.8). No major adverse events were noted in either the adults or the neonates; four adult volunteers reported mild sweating under the device/arm band, and one volunteer developed a mild skin allergy. This proof-of-concept study shows that real-time monitoring of temperatures is technically feasible and that this innovation appears to be promising in terms of both safety and accuracy (with appropriate calibration) for improved maternal and neonatal health.
Keywords: public health, remote biomonitoring, temperature surveillance, wearable sensors, mothers and newborns
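The sensor-versus-reference comparisons in this abstract (e.g. t = 13.8, p < 0.001 for the adult readings) rest on a paired t-test over matched readings. A minimal sketch of the statistic, computed on synthetic reading pairs invented for illustration (not the trial data):

```python
import math

def paired_t(x, y):
    """Paired t-statistic for two sets of matched readings:
    mean of the differences divided by its standard error."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)  # sample variance of differences
    return mean / math.sqrt(var / n)

# Synthetic sensor vs. digital-thermometer pairs (illustrative only,
# loosely mimicking the ~2°C offset reported for adults)
sensor = [34.0, 34.2, 33.9, 34.1, 34.3, 34.0]
digital = [36.1, 36.0, 36.2, 36.1, 35.9, 36.0]
t_stat = paired_t(sensor, digital)  # large negative t: sensor reads lower
```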
Procedia PDF Downloads 213
3659 Institutional Capacity of Health Care Institutes for Diagnosis and Management of Common Genetic Diseases – A Study from a North Coastal District of Andhra Pradesh, India
Authors: Koteswara Rao Pagolu, Raghava Rao Tamanam
Abstract:
In India, genetic disease is a disregarded service element in the community health-protection system. This study aims to gauge the accessibility of services for treating genetic disorders and to evaluate the practices of prevention and management services in the district health system. A cross-sectional survey of selected health facilities in the government health sector was conducted, covering 15 primary health centers (PHCs), 4 community health centers (CHCs), 1 district government hospital (DGH), and 3 referral hospitals (RHs). From these, the existing manpower, comprising 130 medical officers (MOs), 254 supporting staff, 409 nursing staff (NS), and 45 lab technicians (LTs), was examined. From the private health sector, 25 corporate hospitals (CHs), 3 medical colleges (MCs), and 25 diagnostic laboratories (DLs) were selected for the survey, and from these, 316 MOs, 995 NS, and 254 LTs were also reviewed. The findings show that adequate staff were in place at more than 70% of health centers, but none of the staff had received any operational training in genetic disease management. Most of the DHs had rudimentary infrastructural and diagnostic facilities. However, the greater part of the CHCs and PHCs had inadequate diagnostic facilities related to genetic disease management. Biochemical, molecular, and cytogenetic services were not available at PHCs and CHCs. DHs, RHs, and all selected medical colleges were found to offer basic biochemical genetics services at the time of the survey. The district health care infrastructure in India has a shortage of basic services for genetic disorders. With some policy resolutions and facility strengthening, it is possible to provide advanced services for genetic disorders in the district health system.
Keywords: district health system, genetic disorder, infrastructural amenities, management practices
Procedia PDF Downloads 185
3658 A Study on Golden Ratio (φ) and Its Implications on Seismic Design Using ETABS
Authors: Vishal A. S. Salelkar, Sumitra S. Kandolkar
Abstract:
The golden ratio (φ), or golden mean or golden section as it is often referred to, is a proportion or a mean often used by architects when conceiving the aesthetics of a structure. The golden ratio is an irrational number that can be roughly rounded to 1.618 and is derived from the quadratic equation x² − x − 1 = 0. The use of the golden ratio can be observed throughout history, as far back as the ancient Egyptians, and its use later peaked during the Greek golden age. This design technique remains very much prevalent; at present, architects around the world prefer it as one of the primary techniques for deciding aesthetics. In this study, an analysis has been performed to investigate whether the use of the golden ratio while planning a structure has any effect on the seismic behavior of the structure. The structure is modeled and analyzed in ETABS (by Computers and Structures, Inc.) for seismic requirements equivalent to Zone III (Region: Goa, India) as per Indian Standard code IS-1893. The results were compared to those of an identical structure modeled along the lines of normal design philosophy, without using golden ratio tools. The results were then compared for story shear, story drift, and story displacement readings. An improvement in performance, although slight, was observed. Similar improvements were also observed in subsequent iterations performed using time-acceleration data of previous major earthquakes matched to Zone III as per IS-1893.
Keywords: ETABS, golden ratio, seismic design, structural behavior
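The positive root of x² − x − 1 = 0 mentioned in this abstract can be obtained directly from the quadratic formula, giving φ = (1 + √5)/2 ≈ 1.618. A quick check of the value and its defining self-similarity property:

```python
import math

# Positive root of x^2 - x - 1 = 0 via the quadratic formula
phi = (1 + math.sqrt(5)) / 2
print(round(phi, 3))  # 1.618

# phi satisfies phi^2 = phi + 1, the self-similar proportion of the golden section
assert abs(phi ** 2 - (phi + 1)) < 1e-12
```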
Procedia PDF Downloads 187
3657 Causes of Terrorism: Perceptions of University Students of Teacher Training Institutions
Authors: Saghir Ahmad, Abid Hussain Ch, Misbah Malik, Ayesha Batool
Abstract:
Terrorism is a phenomenon in which a dreadful situation is created by a group of people who view themselves as oppressed by society. It is the unlawful use of force or violence by a person or an organized group against people or property, with the intention of intimidating or coercing societies or governments, often for ideological or political reasons. Terrorism is as old as humankind. The main aim of the study was to find out the causes of terrorism through the perceptions of university students at teacher training institutions. This study was quantitative in nature. The survey method was used to collect data. A sample of two hundred and sixty-seven students was selected from public universities. A five-point Likert scale was used to collect data. Mean, standard deviation, independent-sample t-test, and one-way ANOVA were applied to analyze the data. The major findings of the study indicated that students perceived the main causes of terrorism to be poverty, foreign interference, a wrong concept of Islamization, and social injustice. It is also concluded that most students think that drone attacks promote terrorist activities. Education is key to eliminating terrorism. There is a need to educate people, especially youngsters, to bring peace to the world.
Keywords: dreadful circumstance, governments, power, students, terrorism
Procedia PDF Downloads 556
3656 Design and Implementation of a Remote-Control Robot Controlled by a ZigBee Wireless Network
Authors: Sinan Alsaadi, Mustafa Merdan
Abstract:
Communication and access systems can be built with many methods in today’s world, using such standards as Wi-Fi, WiMAX, Bluetooth, GPS, and GPRS. Devices that use these standards also consume system resources in direct proportion to their transmission speed. However, large-scale data communication is not always needed. In such cases, a technology that uses system resources as little as possible and supports smart network topologies has been needed in order to enable the transmission of small data packets and provide control for such devices. IEEE issued the 802.15.4 standard to address this need, enabling the ZigBee protocol, which takes this standard as its basis, and the production of devices that support it. In our project, this communication protocol was preferred. The aim of this study is to provide immediate data transmission from our robot in the field within the scope of the project. In addition, communicating with the robot through the ZigBee protocol has also been an aim. The basis is obtaining the desired data from the region where the robot is located while sitting at the computer. An Arduino Uno R3 microcontroller provides the control mechanism, with an L298 shield as the motor driver.
Keywords: ZigBee, wireless network, remote monitoring, smart home, agricultural industry
Procedia PDF Downloads 281
3655 Consumer Ethnocentrism: A Dynamic Literature Review from 1987-2015
Authors: Thi Phuong Chi Nguyen
Abstract:
Although consumer ethnocentrism has been widely studied in academic research since 1987, it is still somehow considered a new and unknown concept in marketing theory and practice. By analyzing the content, three mainstreams of consumer ethnocentrism research were found: economic, management, and marketing approaches. The present study indicates that the link between consumer ethnocentrism and consumer behaviours varies across countries. Consumers in developing countries might be both patriotic about their home countries and curious about foreign cultures at the same time. The most important finding is the identification of three main periods in the chronological development of consumer ethnocentrism research. The first period, spanning from 1987 to 1995, was characterized by the introduction of consumer ethnocentrism concepts and scales, unidimensionality, and the adoption of the standard CETSCALE version. The second period, 1996-2005, witnessed the replication of the CETSCALE in various fields, as well as an increase in the volume of research in developing and emerging countries, the exploration of determinants, and the beginning of multidimensionality. In the third period, from 2006 to the present, all variables related to CET were synthesized within the theory of planned behavior. Consumer ethnocentrism analyses were conducted even in less-developed countries and in groups of countries within longitudinal studies. The results of this study showed many inadequacies relating to consumer ethnocentrism in the context of globalisation for further research to examine.
Keywords: CETSCALE, consumer behavior, consumer ethnocentrism, business, marketing
Procedia PDF Downloads 438
3654 Quasi-Photon Monte Carlo on Radiative Heat Transfer: An Importance Sampling and Learning Approach
Authors: Utkarsh A. Mishra, Ankit Bansal
Abstract:
At high temperature, radiative heat transfer is the dominant mode of heat transfer. It is governed by various phenomena such as photon emission, absorption, and scattering. The solution of the governing integro-differential equation of radiative transfer is a complex process, more so when the effects of the participating medium and wavelength properties are taken into consideration. Although a generic formulation of such a radiative transport problem can be modeled for a wide variety of problems with non-gray, non-diffusive surfaces, there is always a trade-off between the simplicity and the accuracy of the problem. Recently, solutions of complicated mathematical problems with statistical methods based on randomization of naturally occurring phenomena have gained significant importance. Photon bundles with discrete energy can be replicated with random numbers describing the emission, absorption, and scattering processes. Photon Monte Carlo (PMC) is a simple yet powerful technique to solve radiative transfer problems in complicated geometries with an arbitrary participating medium. The method, on the one hand, increases the accuracy of estimation and, on the other hand, increases the computational cost. The participating media – generally gases such as CO₂, CO, and H₂O – present complex emission and absorption spectra. Modeling the emission/absorption accurately with random numbers requires weighted sampling, as different sections of the spectrum carry different importance. Importance sampling (IS) was implemented to sample random photons of arbitrary wavelength, and the sampled data provided unbiased training of MC estimators for better results. A better replacement for uniform random numbers is deterministic, quasi-random sequences. Halton, Sobol, and Faure low-discrepancy sequences are used in this study.
They possess better space-filling performance than the uniform random number generator and give rise to low-variance, stable quasi-Monte Carlo (QMC) estimators with faster convergence. An optimal supervised learning scheme was further considered to reduce the computational cost of the PMC simulation. A one-dimensional plane-parallel slab problem with participating media was formulated. The history of some randomly sampled photon bundles was recorded to train an artificial neural network (ANN) back-propagation model. The flux was calculated using the standard quasi-PMC and was considered to be the training target. Results obtained with the proposed model for the one-dimensional problem are compared with the exact analytical and PMC models with the line-by-line (LBL) spectral model. The approximate variance obtained was around 3.14%. Results were analyzed with respect to time and the total flux in both cases. A significant reduction in variance as well as a faster rate of convergence was observed with the QMC method over the standard PMC method. However, the results obtained with the ANN method showed greater variance (around 25-28%) compared to the other cases. There is great scope for machine learning models to help further reduce computational cost once trained successfully. Multiple ways of selecting the input data as well as various architectures will be tried so that the concerned environment can be fully addressed by the ANN model. Better results can be achieved in this unexplored domain.
Keywords: radiative heat transfer, Monte Carlo method, pseudo-random numbers, low-discrepancy sequences, artificial neural networks
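The low-discrepancy sequences named in this abstract are generated deterministically; the Halton sequence, for instance, is built from the radical-inverse (van der Corput) function in coprime bases, which is what gives it better space-filling than pseudo-random draws. A minimal sketch of the generic textbook construction (not the authors' implementation):

```python
def halton(index, base):
    """Radical-inverse (van der Corput) value of `index` in `base`:
    the 1-D building block of the Halton sequence."""
    f, result = 1.0, 0.0
    i = index
    while i > 0:
        f /= base
        result += f * (i % base)  # next digit of index in the given base
        i //= base
    return result

# First few points of a 2-D Halton sequence (coprime bases 2 and 3);
# these fill the unit square more evenly than uniform random draws
points = [(halton(i, 2), halton(i, 3)) for i in range(1, 6)]
```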
Procedia PDF Downloads 228
3653 Evaluation of Fluidized Bed Bioreactor Process for Mmabatho Waste Water Treatment Plant
Authors: Shohreh Azizi, Wag Nel
Abstract:
The rapid population growth in South Africa has increased the need for wastewater treatment facilities. The aim of this study is to assess the potential use of a fluidized bed bioreactor (FBBR) for the Mmabatho sewage treatment plant. Samples were collected from the inlet and outlet of the reactor daily to analyze pH, chemical oxygen demand (COD), biochemical oxygen demand (BOD), and total suspended solids (TSS), as per standard method APHA 2005. The studies were undertaken at continuous laboratory scale, and analytical data were collected before and after treatment. Reductions of 87.22% in COD and 89.80% in BOD were achieved. The fluidized bed bioreactor provides BOD/COD removal as well as nutrient removal. Efforts were also made to study the impact on the biological system if the domestic wastewater becomes contaminated with industrial effluent, and the results show that the biological system can tolerate high total dissolved solids of up to 6000 mg/L as well as heavy metal concentrations of up to 4 mg/L. The data obtained through the experimental research demonstrate that the FBBR may be used (<3 h total hydraulic retention time) for secondary treatment in the Mmabatho wastewater treatment plant.
Keywords: fluidized bed bioreactor, wastewater treatment plant, biological system, high TDS, heavy metal
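The COD and BOD reductions in this abstract are removal efficiencies, computed as (influent − effluent)/influent × 100. A minimal sketch; the influent/effluent concentrations below are hypothetical values chosen to reproduce the reported 87.22% COD removal, not measurements from the plant:

```python
def removal_efficiency(influent, effluent):
    """Percent removal of a water-quality parameter across the reactor."""
    return (influent - effluent) / influent * 100.0

# Hypothetical COD concentrations (mg/L) consistent with the reported 87.22% removal
cod_removal = removal_efficiency(500.0, 63.9)  # ≈ 87.22%
```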
Procedia PDF Downloads 170
3652 Metagenomic Features of the Gut Microbiota in Metabolic Syndrome
Authors: Anna D. Kotrova, Alexandr N. Shishkin, Elena I. Ermolenko
Abstract:
The aim: To study the quantitative and qualitative ratios of colon bacteria in patients with metabolic syndrome. Materials and methods: Fecal samples from patients in two groups were identified and analyzed: the first group was formed by patients with metabolic syndrome, the second by healthy individuals. The metagenomic method was used, with analysis of 16S rRNA gene sequences. Libraries of the variable regions (V3 and V4) of the 16S rRNA gene were analyzed using a MiSeq device (Illumina). The libraries were prepared using the standard method recommended by Illumina, based on two rounds of PCR. Results: At the phylum level, in the microbiota of patients with metabolic syndrome compared to healthy individuals, the proportion of Tenericutes was reduced and the proportion of Actinobacteria was increased. At the genus level, in the group with metabolic syndrome, the proportion of Lachnospira was increased relative to the second group. Conclusion: Changes in the colon bacteria ratio in the gut microbiota of patients with metabolic syndrome were found at both the phylum and the genus level. In the metabolic syndrome group, there is a decrease in the proportion of bacteria that do not have a cell wall. To confirm the revealed microbiota features in patients with metabolic syndrome, further study with a larger number of samples is required.
Keywords: gut microbiota, metabolic syndrome, metagenomics, tenericutes
Procedia PDF Downloads 227
3651 Unlocking Justice: Exploring the Power and Challenges of DNA Analysis in the Criminal Justice System
Authors: Sandhra M. Pillai
Abstract:
This article examines the relevance, difficulties, and potential applications of DNA analysis in the criminal justice system. A potent tool for connecting suspects to crime scenes, clearing the innocent of wrongdoing, and resolving cold cases, DNA analysis has transformed forensic investigations. The scientific foundations of DNA analysis, including DNA extraction, sequencing, and statistical analysis, are covered in the article. To guarantee accurate and trustworthy findings, it also discusses the significance of quality assurance procedures, chain of custody, and DNA sample storage. DNA analysis has significantly advanced science, but it also raises substantial moral and legal issues. To safeguard individual rights and uphold public confidence, privacy concerns, possible discrimination, and abuse of DNA information must be properly addressed. The paper also emphasises the effects of the criminal justice system on people and communities while highlighting the necessity of equity, openness, and fair access to DNA testing. The essay describes the obstacles and future directions for DNA analysis. It looks at cutting-edge technologies like next-generation sequencing, which promises to make DNA analysis quicker and more affordable. To secure the appropriate and informed use of DNA evidence, it also emphasises the significance of multidisciplinary collaboration among scientists, law enforcement organisations, legal experts, and policymakers. In conclusion, DNA analysis has enormous potential for improving the course of criminal justice. We can exploit the potential of DNA technology while respecting the ideals of justice, fairness, and individual rights by navigating the ethical, legal, and societal issues and encouraging discussion and collaboration.
Keywords: DNA analysis, DNA evidence, reliability, validity, legal frame, admissibility, ethical considerations, impact, future direction, challenges
Procedia PDF Downloads 70
3650 Least Squares Solution for Linear Quadratic Gaussian Problem with Stochastic Approximation Approach
Authors: Sie Long Kek, Wah June Leong, Kok Lay Teo
Abstract:
The linear quadratic Gaussian model is a standard mathematical model for the stochastic optimal control problem. The combination of the linear quadratic estimator and the linear quadratic regulator allows the state estimation and the optimal control policy to be designed separately; this is known as the separation principle. In this paper, an efficient computational method is proposed to solve the linear quadratic Gaussian problem. In our approach, the Hamiltonian function is defined and the necessary conditions are derived. In addition, the output error is defined and a least-squares optimization problem is introduced. By determining the first-order necessary condition, the gradient of the sum of squares of the output error is established. From this point of view, the stochastic approximation approach is employed to update the optimal control policy. Once a given tolerance is reached, the iteration procedure is stopped and the optimal solution of the linear quadratic Gaussian problem is obtained. For illustration, an example of the linear quadratic Gaussian problem is studied. The result shows the efficiency of the proposed approach. In conclusion, the applicability of the proposed approach for solving the linear quadratic Gaussian problem is clearly demonstrated.
Keywords: iteration procedure, least squares solution, linear quadratic Gaussian, output error, stochastic approximation
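The gradient-based stochastic approximation update described above can be illustrated on a toy scalar system. This is a hedged sketch: the system, the decreasing gain sequence 1/(k+10), and all names are illustrative choices, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy scalar system: y_k = theta_true * u_k + noise. The unknown gain
# theta is recovered by descending the gradient of the squared output
# error with a decreasing (Robbins-Monro) step size, mirroring the
# iterative procedure in the abstract.
theta_true = 2.5
theta = 0.0
for k in range(2000):
    u = rng.normal()
    y = theta_true * u + 0.1 * rng.normal()
    e = theta * u - y            # output error
    grad = 2.0 * e * u           # gradient of e**2 w.r.t. theta
    theta -= grad / (k + 10)     # decreasing gain, sum of gains diverges

print(round(theta, 1))
```

In practice the loop would terminate early once the parameter change (or gradient) falls within a given tolerance, matching the stopping rule in the abstract.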
Procedia PDF Downloads 194
3649 Economic Expansion and Land Use Change in Thailand: An Environmental Impact Analysis Using Computable General Equilibrium Model
Authors: Supakij Saisopon
Abstract:
The process of economic development incurs spatial transformation. This spatial alteration also causes environmental impacts, leading to higher pollution. In the case of Thailand, there is still a lack of price-endogenous quantitative analysis incorporating the relationships among economic growth, land-use change, and environmental impact. Therefore, this paper aimed at developing a Computable General Equilibrium (CGE) model capable of simulating such mutual effects. The developed CGE model incorporates a nested constant elasticity of transformation (CET) structure that describes the spatial redistribution mechanism between agricultural land and urban areas. The simulation results showed that a 1% decrease in the availability of agricultural land lowers the value-added of agriculture by 0.036%. Similarly, a 1% reduction in the availability of urban areas decreases the value-added of the manufacturing and service sectors by 0.05% and 0.047%, respectively. Moreover, the outcomes indicate that expanding farming and urban areas induce higher volumes of solid waste, wastewater, and air pollution. Specifically, a 1% increase in urban area increases pollution as follows: (1) solid waste increases by 0.049%, (2) water pollution (indicated by biochemical oxygen demand, BOD) increases by 0.051%, and (3) air pollution (indicated by the volumes of CO₂, N₂O, NOₓ, CH₄, and SO₂) increases within the range of 0.045%–0.051%. In the simulation exploring a sustainable development path, a 1% increase in agricultural land-use efficiency leads to shrinking demand for agricultural land. This does not happen in urban areas: a 1% increase in urban land-use efficiency still results in increasing demand for land. Therefore, advanced clean production technology is necessary to align increasing land-use efficiency with lowered pollution density.
Keywords: CGE model, CET structure, environmental impact, land use
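The CET structure mentioned above allocates a land endowment between competing uses through a constant-elasticity-of-transformation aggregator. A minimal sketch of that aggregator follows; the share and curvature parameters and the function name are illustrative, not calibrated to the paper.

```python
def cet_aggregate(x_agri, x_urban, delta=0.6, rho=2.0, scale=1.0):
    """CET aggregator over two land uses.

    With rho > 1 the transformation frontier is concave to the origin;
    delta is the share parameter. All parameter values here are
    illustrative.
    """
    return scale * (delta * x_agri**rho + (1 - delta) * x_urban**rho) ** (1 / rho)

# With equal allocations the aggregate equals the common value
# regardless of delta, since the aggregator is homogeneous of degree 1.
print(round(cet_aggregate(50.0, 50.0), 2))  # → 50.0
```

Unbalanced allocations yield a larger aggregate value than balanced ones under rho > 1, which is what lets the model price the reallocation of land between agricultural and urban use.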
Procedia PDF Downloads 235
3648 DNpro: A Deep Learning Network Approach to Predicting Protein Stability Changes Induced by Single-Site Mutations
Authors: Xiao Zhou, Jianlin Cheng
Abstract:
A single amino acid mutation can have a significant impact on the stability of protein structure. Thus, the prediction of protein stability change induced by single site mutations is critical and useful for studying protein function and structure. Here, we presented a deep learning network with the dropout technique for predicting protein stability changes upon single amino acid substitution. While using only protein sequence as input, the overall prediction accuracy of the method on a standard benchmark is >85%, which is higher than existing sequence-based methods and is comparable to the methods that use not only protein sequence but also tertiary structure, pH value and temperature. The results demonstrate that deep learning is a promising technique for protein stability prediction. The good performance of this sequence-based method makes it a valuable tool for predicting the impact of mutations on most proteins whose experimental structures are not available. Both the downloadable software package and the user-friendly web server (DNpro) that implement the method for predicting protein stability changes induced by amino acid mutations are freely available for the community to use.
Keywords: bioinformatics, deep learning, protein stability prediction, biological data mining
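The dropout technique named above randomly zeroes hidden units during training to reduce overfitting. A generic sketch of one hidden layer with inverted dropout follows; the layer sizes, dropout rate, and names are illustrative, not DNpro's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

def dropout_forward(x, w, b, p_drop=0.5, train=True):
    """One hidden layer with ReLU and inverted dropout.

    During training each unit is dropped with probability p_drop;
    scaling the survivors by 1/(1 - p_drop) keeps the expected
    activation unchanged, so no rescaling is needed at test time.
    """
    h = np.maximum(0.0, x @ w + b)          # ReLU hidden layer
    if train:
        mask = rng.random(h.shape) >= p_drop
        h = h * mask / (1.0 - p_drop)       # inverted dropout
    return h

x = rng.normal(size=(4, 8))                  # 4 samples, 8 sequence features
w = rng.normal(size=(8, 16)) * 0.1
b = np.zeros(16)
print(dropout_forward(x, w, b).shape)        # → (4, 16)
```

At prediction time (`train=False`) the layer reduces to a plain ReLU transform, which is how a dropout-trained network is evaluated on a benchmark.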
Procedia PDF Downloads 474
3647 NABERS Indoor Environment - a Rating Tool to Benchmark the IEQ of Australian Office Commercial Buildings
Authors: Kazi Hossain
Abstract:
The National Australian Built Environment Rating System (NABERS) is the key industry standard for measuring and benchmarking the environmental performance of existing buildings in Australia. Developed and run by the New South Wales government, NABERS measures the operational efficiency of different types of buildings by using a set of tools that provide an easy-to-understand graphical rating outcome ranging from 0 to 6 stars. This set of tools also includes NABERS IE, which enables tenants or building managers to benchmark their buildings' indoor environment quality against the national market. Launched in 2009, the number of NABERS IE ratings steadily increased from 10 certified ratings in 2011 to 43 in 2013. However, there was a massive uptake of over 50 ratings in 2014 alone, bringing the number of ratings to over 100. This paper outlines the methodology used to create this tool, a statistical overview of the tool, and the driving factors that motivate building owners and managers to use this tool every year to rate their buildings.
Keywords: acoustic comfort, indoor air quality, indoor environment, NABERS, National Australian Built Environment Rating System, performance rating, rating system, thermal comfort, ventilation effectiveness, visual comfort
Procedia PDF Downloads 564
3646 Efficacy of Learning: Digital Sources versus Print
Authors: Rahimah Akbar, Abdullah Al-Hashemi, Hanan Taqi, Taiba Sadeq
Abstract:
As technology continues to develop, teaching curriculums in both schools and universities have begun adopting a more computer/digital-based approach to the transmission of knowledge and information, as opposed to the more old-fashioned use of textbooks. This gives rise to the question: Are there any differences in learning from a digital source over learning from a printed source, such as a textbook? More specifically, which medium of information results in better long-term retention? A review of the confounding factors implicated in understanding the relationship between learning from the two different mediums was done. Alongside this, a 4-week cohort study involving 76 first-year English Language female students was performed, whereby the participants were divided into two groups. Group A studied material from a paper source (referred to as the Print Medium), and Group B studied material from a digital source (Digital Medium). The dependent variables were grading of memory recall, indexed by a 4-point grading system, and total frequency of item repetition. The study was facilitated by advanced computer software called SuperMemo. Results showed that, contrary to prevailing evidence, the Digital Medium group showed no statistically significant differences in terms of the shift from Remember (Episodic) to Know (Semantic) when all confounding factors were accounted for. The shift from Random Guess and Familiar to Remember occurred faster in the Digital Medium than it did in the Print Medium.
Keywords: digital medium, print medium, long-term memory recall, episodic memory, semantic memory, SuperMemo, forgetting index, frequency of repetitions, total time spent
Procedia PDF Downloads 295
3645 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500
Authors: Mustafa Elfituri, Jonathan Cook
Abstract:
Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology to model many types of relations between independent objects. They are being actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties such as irregularity and poor locality that make their performance different from that of regular applications. Therefore, parallelizing graph algorithms is a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of various example implementations of Graph500, including a shared memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed all the factors that affect its performance in order to identify possible changes that would improve it. Results are discussed in relation to which factors contribute to performance degradation.
Keywords: graph computation, Graph500 benchmark, parallel architectures, parallel programming, workload characterization
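The traversal-driven workload described above is, in Graph500's case, a breadth-first search over a large random graph. A serial, level-synchronous sketch (illustrative only, not one of the benchmarked OpenMP/MPI implementations) shows why the data access is irregular:

```python
from collections import deque

def bfs_levels(adj, source):
    """Level-synchronous BFS, the kind of kernel Graph500 times.

    Performance is dominated by data-dependent traversal of
    neighbor lists rather than by arithmetic, which is what gives
    graph workloads their poor locality.
    """
    level = {source: 0}
    frontier = deque([source])
    while frontier:
        u = frontier.popleft()
        for v in adj[u]:          # irregular, data-dependent access
            if v not in level:
                level[v] = level[u] + 1
                frontier.append(v)
    return level

# Tiny example graph as adjacency lists (illustrative only)
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(bfs_levels(adj, 0))   # → {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}
```

Parallel versions partition the frontier across threads or ranks, which is where the irregularity turns into load imbalance and communication cost.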
Procedia PDF Downloads 153
3644 Municipal Solid Waste Generation Trend in the Metropolitan Cities of the Muslim World
Authors: Farzaneh Fakheri Raof, Abdolkhalegh vadian
Abstract:
One of the most important environmental issues in developing countries is municipal solid waste management. In this context, knowledge of the quantity and composition of solid waste provides the basic information for optimal solid waste management. Many studies have been conducted to investigate the impact of economic, social, and cultural factors on the generation trend of solid waste; however, few of these have addressed the role of religion in the matter. The present study is a field investigation of the generation trend of solid waste in Mashhad, a metropolitan city in northeastern Iran. Accordingly, religious rituals were considered as the independent variable, and the quantity and composition of municipal solid waste as the dependent variables. For this purpose, the quantity of solid waste was initially determined. Afterwards, the wastes were classified into 12 groups using the relevant standard methods. The results showed that the production rate of municipal solid waste was 1,507 tons per day. Composing 65.2% of the whole, organic materials constitute the largest share of the total municipal solid waste in Mashhad. The obtained results also revealed a positive relationship between waste generation and the months of religious ceremonies: the greatest amount of waste generated in the city was reported during Ramadan (a religious month), significantly different from other months.
Keywords: Mashhad, municipal solid waste, religious months, waste composition, organic waste
Procedia PDF Downloads 517
3643 Digital Retinal Images: Background and Damaged Areas Segmentation
Authors: Eman A. Gani, Loay E. George, Faisel G. Mohammed, Kamal H. Sager
Abstract:
Digital retinal images are well suited for automatic screening in diabetic retinopathy systems. Unfortunately, a significant percentage of these images are of poor quality, which hinders further analysis, due to many factors (such as patient movement, inadequate or non-uniform illumination, acquisition angle and retinal pigmentation). Retinal images of poor quality need to be enhanced before the extraction of features and abnormalities. Segmentation of the retinal image is essential for this purpose: it is employed to smooth and strengthen the image by separating the background and damaged areas from the overall image, resulting in retinal image enhancement and less processing time. In this paper, methods for segmenting colored retinal images are proposed to improve the quality of retinal image diagnosis. The methods generate two segmentation masks: a background segmentation mask for extracting the background area, and a poor-quality mask for removing the noisy areas from the retinal image. The standard retinal image databases DIARETDB0, DIARETDB1, STARE and DRIVE, and some images obtained from ophthalmologists, have been used to test the validity of the proposed segmentation technique. Experimental results indicate the introduced methods are effective and can lead to high segmentation accuracy.
Keywords: retinal images, fundus images, diabetic retinopathy, background segmentation, damaged areas segmentation
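A background mask of the kind described above can, in the simplest case, be produced by intensity thresholding, since the circular fundus region is much brighter than the surrounding black border. This is a generic sketch of the idea, not the paper's actual method; the threshold value and the synthetic image are illustrative.

```python
import numpy as np

def background_mask(gray, threshold=0.1):
    """Background segmentation mask for a retinal image.

    Pixels at or below the intensity threshold are treated as
    background (the dark border around the fundus disc). The
    threshold here is an illustrative assumption.
    """
    return gray <= threshold        # True where pixel is background

# Synthetic 5x5 "image": bright disc centre, dark corners (values in [0, 1])
img = np.array([[0.0, 0.0, 0.6, 0.0, 0.0],
                [0.0, 0.7, 0.8, 0.7, 0.0],
                [0.6, 0.8, 0.9, 0.8, 0.6],
                [0.0, 0.7, 0.8, 0.7, 0.0],
                [0.0, 0.0, 0.6, 0.0, 0.0]])
mask = background_mask(img)
print(int(mask.sum()))   # number of background pixels → 12
```

Excluding the masked pixels from subsequent enhancement and feature extraction is what yields the reduced processing time mentioned in the abstract.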
Procedia PDF Downloads 406
3642 Urban Sustainability and Sustainable Mobility, Lessons Learned from the Case of Chile
Authors: Jorge Urrutia-Mosquera, Luz Flórez-Calderón, Yasna Cortés
Abstract:
We assessed the state of progress in terms of urban sustainability indicators and studied the impact of current land-use conditions and the level of spatial accessibility to basic urban amenities on travel patterns and sustainable mobility in Santiago de Chile. We determined the spatial impact of urban facilities on sustainable travel patterns through statistical analysis, data visualisation, and geographically weighted regression (GWR) models. The results show a need to diversify land use in more than 60% of the communes, although in 85% of the communes accessibility to public spaces is guaranteed. The findings also suggest improving access to early education facilities, as only 26% of the communes meet the sustainability standard, which negatively impacts travel by sustainable modes. It is also observed that the level of access to urban facilities generates spatial heterogeneity in the city, which negatively affects travel patterns in terms of travel times over 60 minutes and travel by private vehicle. The results obtained allow us to identify opportunities for public policy intervention to promote and adopt sustainable mobility.
Keywords: land use, urban sustainability, travel patterns, spatial heterogeneity, GWR model, sustainable mobility
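A GWR model fits a separate, locally weighted regression at each location, with a kernel that down-weights distant observations. A minimal sketch of one such local fit with a Gaussian kernel follows; all data, the bandwidth, and the function names are illustrative assumptions, not the study's specification.

```python
import numpy as np

def gwr_coefficients(coords, X, y, point, bandwidth=3.0):
    """Locally weighted least squares at one location, the core step
    of geographically weighted regression (GWR)."""
    d = np.linalg.norm(coords - point, axis=1)
    w = np.exp(-(d / bandwidth) ** 2 / 2.0)      # Gaussian kernel weights
    W = np.diag(w)
    Xd = np.column_stack([np.ones(len(X)), X])   # add intercept column
    # Weighted normal equations: (X'WX) beta = X'Wy
    return np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)

rng = np.random.default_rng(2)
coords = rng.uniform(0, 10, size=(50, 2))        # synthetic locations
X = rng.normal(size=50)                          # synthetic predictor
y = 1.0 + 2.0 * X + 0.1 * rng.normal(size=50)    # spatially constant relation
beta = gwr_coefficients(coords, X, y, point=np.array([5.0, 5.0]))
print(np.round(beta, 1))
```

Repeating the fit over a grid of points is what reveals the spatial heterogeneity in the coefficients that the abstract reports.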
Procedia PDF Downloads 87
3641 Establishment of Virtual Fracture Clinic in Princess Royal Hospital Telford: Experience and Recommendations during the First 9 Months
Authors: Tahir Khaleeq, Patrick Lancaster, Keji Fakoya, Pedro Ferreira, Usman Ahmed
Abstract:
Introduction: Virtual fracture clinics (VFC) have been shown to be a safe and cost-effective way of managing outpatient referrals to the orthopaedic department. During the coronavirus pandemic there has been a push to reduce unnecessary patient contact whilst maintaining patient safety. Materials and Methods: A protocol was developed by the clinical team in collaboration with Advanced Physiotherapy Practitioners (APP) on how to manage common musculoskeletal presentations to A&E prior to COVID, as part of routine service development. Patients were broadly triaged into 4 categories: discharge with advice, referral to VFC, referral to face-to-face clinic, or discussion with the on-call team. The first 9 months of data were analysed to assess the types of injury seen and their outcomes. Results: In total, 2,489 patients were referred to VFC from internal and external sources. 734 patients were discharged without follow-up and 182 patients were discharged for physiotherapy review. Only 3 patients required admission. Regarding follow-up, 431 patients had a virtual follow-up, while 1,036 patients required further face-to-face follow-up. 87 patients were triaged into subspecialty clinics. 37 patients were felt to have been referred inappropriately. Discussion: BOA guidelines suggest all patients need to be reviewed within 72 hours of their orthopaedic injury. Implementation of a VFC allows this target to be achieved while at the same time reducing patient contact. Almost half the patients were discharged following VFC review, and the remaining patients were appropriately followed up. This is especially relevant in the current pandemic, where reducing unnecessary trips to hospital benefits the patient as well as making the most of the resources available.
Keywords: virtual fracture clinic, lockdown, trauma and orthopaedics, COVID-19
Procedia PDF Downloads 202
3640 Existence and Stability of Periodic Traveling Waves in a Bistable Excitable System
Authors: M. Osman Gani, M. Ferdows, Toshiyuki Ogawa
Abstract:
In this work, we propose a modified FHN-type reaction-diffusion system for a bistable excitable system, obtained by adding a scaled function derived from a given function. We study the existence and stability of periodic traveling waves (or wavetrains) for the FitzHugh-Nagumo (FHN) system and the modified one, and compare the results. The stability results for the periodic traveling waves (PTWs) indicate that most of the solutions in the fast family of PTWs are stable for the FitzHugh-Nagumo equations. Instability occurs only in waves having smaller periods, and such smaller-period waves are always unstable. The fast family with sufficiently large periods is always stable in the FHN model. We find that the oscillation of pulse widths is absent in the standard FHN model; this motivates us to study the PTWs in the proposed FHN-type reaction-diffusion system for bistable excitable media. Good agreement is found between the solutions of the traveling wave ODEs and the corresponding whole-PDE simulation.
Keywords: bistable system, Eckhaus bifurcation, excitable media, FitzHugh-Nagumo model, periodic traveling waves
Procedia PDF Downloads 188
3639 The Relationship between Competency-Based Learning and Learning Efficiency of Media Communication Students at Suan Sunandha Rajabhat University
Authors: Somtop Keawchuer
Abstract:
This research aims to study (1) the relationship between competency-based learning and the learning efficiency of new media communication students at Suan Sunandha Rajabhat University, and (2) the effect of demographic factors on the learning efficiency of students at Suan Sunandha Rajabhat University. This is quantitative research; data were collected by questionnaires distributed to a purposive sample of 1,340 new media communication students in the Faculty of Management Science at Suan Sunandha Rajabhat University. Data were analyzed with descriptive statistics, including percentage, mean and standard deviation, and inferential statistics, including t-test, ANOVA and Pearson correlation for hypothesis testing. The results showed that competency-based learning, in terms of ability to communicate, ability to think and solve problems, life skills and ability to use technology, has a significant relationship with learning efficiency in terms of the cognitive, psychomotor and affective domains at the 0.05 level, which is in harmony with the research hypotheses.
Keywords: competency-based learning, learning efficiency, new media communication students, Suan Sunandha Rajabhat University
Procedia PDF Downloads 247
3638 Bioefficacy of Novel Insecticide Flupyradifurone SL 200 against Leaf Hoppers, Aphids and Whitefly in Cotton
Authors: N. V. V. S. D. Prasad
Abstract:
Field experiments were conducted at the Regional Agricultural Research Station, Lam, Guntur, Andhra Pradesh, India for two seasons during 2011-13 to evaluate the efficacy of flupyradifurone 200 SL, a new class of insecticide in the butenolide group, against leaf hoppers, aphids and whitefly in cotton. The test insecticide flupyradifurone 200 SL was evaluated at three doses @ 150, 200 and 250 g ai/ha, along with imidacloprid 200 SL @ 20 g ai/ha, acetamiprid 20 SP @ 20 g ai/ha, thiamethoxam 25 WG @ 25 g ai/ha and monocrotophos 36 SL @ 360 g ai/ha as standards. Flupyradifurone 200 SL, even at the lower dose of 150 g ai/ha, exhibited superior efficacy against the cotton leafhopper, Amrasca devastans, compared with the neonicotinoids which are widely used for control of sucking pests in cotton. Against the cotton aphid, Aphis gossypii, flupyradifurone 200 SL @ 200 and 250 g ai/ha proved to be effective, and the lower dose @ 150 g ai/ha performed better than some of the neonicotinoids. The effect of flupyradifurone 200 SL on cotton against the whitefly, Bemisia tabaci, was evident at the higher doses of 200 and 250 g ai/ha and superior to all standard treatments; however, the lower dose was at par with the neonicotinoids. The seed cotton yield of flupyradifurone 200 SL at all the doses tested was superior to imidacloprid 200 SL @ 20 g ai/ha and acetamiprid 20 SP @ 20 g ai/ha. There was no significant difference among the insecticidal treatments with regard to natural enemies. The results clearly suggest that flupyradifurone is a new tool to combat sucking pest problems in cotton and can fit well in IRM strategies in light of widespread insecticide resistance in cotton sucking pests.
Keywords: cotton, flupyradifurone, neonicotinoids, sucking pests
Procedia PDF Downloads 196