Search results for: software receiver
3786 Beam, Column Joints Concrete in Seismic Zone
Authors: Khalifa Kherafa
Abstract:
This research project studies reinforced concrete beam–column joints subjected to seismic loads. A bibliographical study is presented to clarify the work undertaken by researchers in the field during the last three decades, and especially the results of the last two years, concerning methods for calculating the transverse reinforcement in the various joints of a structure. As an application, the forces in the columns and beams of a five-storey (R+4) building in seismic zone 3 were calculated with the finite element method using dedicated software.
Keywords: beam–column joints, cyclic loading, shearing force, damaged joint
Procedia PDF Downloads 426
3785 Fatigue of Multiscale Nanoreinforced Composites: 3D Modelling
Authors: Leon Mishnaevsky Jr., Gaoming Dai
Abstract:
3D numerical simulations of fatigue damage in multiscale fiber-reinforced polymer composites with secondary nanoclay reinforcement are carried out. Macro-micro FE models of the multiscale composites are generated automatically using Python-based software. The effect of the nanoclay reinforcement, localized in the fiber/matrix interface (fiber sizing) or distributed throughout the matrix, on the crack path, damage mechanisms, and fatigue behavior is investigated in numerical experiments.
Keywords: computational mechanics, fatigue, nanocomposites, composites
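As an illustration of the kind of automatic model generation the abstract describes, the sketch below writes a parametric fiber/matrix unit-cell input deck from Python. The abstract does not name the FE package or its keyword format, so the deck syntax and the function name `generate_unit_cell` are hypothetical placeholders.

```python
# Illustrative sketch only: parametric generation of a fiber/matrix unit-cell
# FE input deck via Python scripting. The keyword format below is generic and
# hypothetical; a real workflow would emit the target solver's own syntax.
import math

def generate_unit_cell(n_fibers=9, cell_size=1.0, fiber_radius=0.1, mesh_seed=0.02):
    """Return input-deck text for a square cell with a regular fiber grid."""
    lines = ["*CELL, SIZE=%.3f, SEED=%.3f" % (cell_size, mesh_seed)]
    per_row = int(math.sqrt(n_fibers))
    pitch = cell_size / per_row
    for i in range(per_row):
        for j in range(per_row):
            cx, cy = (i + 0.5) * pitch, (j + 0.5) * pitch
            lines.append("*FIBER, X=%.4f, Y=%.4f, R=%.4f" % (cx, cy, fiber_radius))
    # Secondary nanoclay reinforcement modelled as a modified interface layer
    lines.append("*INTERFACE, THICKNESS=%.4f, MATERIAL=SIZING_NANOCLAY"
                 % (0.05 * fiber_radius))
    return "\n".join(lines)

with open("unit_cell.inp", "w") as f:
    f.write(generate_unit_cell())
```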
Procedia PDF Downloads 607
3784 Discovering Word-Class Deficits in Persons with Aphasia
Authors: Yashaswini Channabasavegowda, Hema Nagaraj
Abstract:
Aim: The current study aims to characterize word-class deficits in terms of the noun-verb ratio in confrontation naming, picture description, and picture-word matching tasks. A total of ten persons with aphasia (PWA) and ten age-matched neurotypical individuals (NTI) were recruited for the study. The research includes both behavioural and objective measures to assess word-class deficits in PWA. Objective: The main objective of the research is to identify the word-class deficits seen in persons with aphasia using various speech-eliciting tasks. Method: The study was conducted in the L1 of the participants, Kannada. The Kannada adaptations of the Action Naming Test and the Boston Naming Test were administered to the participants, and a picture description task was carried out. The picture-word matching task was carried out using E-Prime software (version 2) to measure accuracy and reaction time in the identification of verbs and nouns. The stimuli were presented through auditory and visual modes. Data were analysed to identify errors in the naming of nouns versus verbs on the Boston Naming Test and the Action Naming Test, as well as the usage of nouns and verbs in the picture description task. Reaction time and accuracy for picture-word matching were extracted from the software. Results: PWA showed significantly different sentence structure compared to age-matched NTI. PWA also showed impairment in syntactic measures in the picture description task, with fewer correct grammatical sentences and less correct usage of verbs and nouns, and they produced a greater proportion of nouns compared to verbs. PWA had poorer accuracy and slower reaction times in the picture-word matching task compared to NTI, and accuracy was higher for nouns compared to verbs in PWA. The deficits were noticed irrespective of the cause leading to aphasia.
Keywords: nouns, verbs, aphasia, naming, description
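The core comparison described above reduces to per-group, per-word-class means of accuracy and reaction time. A minimal sketch of that analysis step is shown below; the column names and the tiny in-memory table are hypothetical stand-ins for a remapped E-Prime export.

```python
# Minimal sketch of the picture-word matching analysis: mean accuracy and
# reaction time per word class (noun vs. verb) per group. Column names and
# values are illustrative, not the study's data.
import pandas as pd

trials = pd.DataFrame({
    "group":      ["PWA", "PWA", "PWA", "NTI", "NTI", "NTI"],
    "word_class": ["noun", "verb", "noun", "noun", "verb", "verb"],
    "accuracy":   [1, 0, 1, 1, 1, 1],        # 1 = correct identification
    "rt_ms":      [1420, 1710, 1350, 820, 910, 870],
})

summary = (trials
           .groupby(["group", "word_class"])
           .agg(mean_accuracy=("accuracy", "mean"),
                mean_rt_ms=("rt_ms", "mean"))
           .reset_index())
print(summary)  # PWA expected to show lower accuracy for verbs than nouns
```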
Procedia PDF Downloads 102
3783 AI for Efficient Geothermal Exploration and Utilization
Authors: Velimir Monty Vesselinov, Trais Kliplhuis, Hope Jasperson
Abstract:
Artificial intelligence (AI) is a powerful tool in the geothermal energy sector, aiding in both exploration and utilization. Identifying promising geothermal sites can be challenging due to limited surface indicators and the need for expensive drilling to confirm subsurface resources. Geothermal reservoirs can be located deep underground and exhibit complex geological structures, making traditional exploration methods time-consuming and imprecise. AI algorithms can analyze vast datasets of geological, geophysical, and remote sensing data, including satellite imagery, seismic surveys, geochemistry, geology, etc. Machine learning algorithms can identify subtle patterns and relationships within these data, potentially revealing hidden geothermal potential in areas previously overlooked. To address these challenges, a SIML (science-informed machine learning) technology has been developed. SIML methods differ from traditional ML techniques. In both cases, the ML models are trained to predict the spatial distribution of an output (e.g., pressure, temperature, heat flux) based on a series of inputs (e.g., permeability, porosity, etc.). Traditional ML relies on deep and wide neural networks (NNs) based on simple algebraic mappings to represent complex processes. In contrast, the SIML neurons incorporate complex mappings (including constitutive relationships and physics/chemistry models). This results in ML models that have a physical meaning and satisfy physical laws and constraints. The prototype of the developed software, called GeoTGO, is accessible through the cloud. Our software prototype demonstrates how different data sources can be made available for processing, executes demonstrative SIML analyses, and presents the results in tabular and graphic form.
Keywords: science-informed machine learning, artificial intelligence, exploration, utilization, hidden geothermal
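To make the science-informed idea concrete, the sketch below couples a data misfit with a physics residual in one training loss, so the network is pushed toward predictions that satisfy a governing equation. This is a generic physics-informed sketch under simple assumptions (1-D steady heat conduction with constant conductivity), not the GeoTGO implementation, whose neuron-level constitutive mappings are richer.

```python
# Generic science-informed training sketch: loss = data misfit + physics
# residual of d2T/dx2 = 0 (1-D steady heat conduction, constant conductivity).
import torch

net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                          torch.nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

x_data = torch.tensor([[0.0], [1.0]])        # boundary observations
T_data = torch.tensor([[300.0], [350.0]])    # measured temperatures, K
x_col = torch.linspace(0, 1, 50).reshape(-1, 1).requires_grad_(True)

for step in range(2000):
    opt.zero_grad()
    loss_data = ((net(x_data) - T_data) ** 2).mean()
    T = net(x_col)
    dT = torch.autograd.grad(T.sum(), x_col, create_graph=True)[0]
    d2T = torch.autograd.grad(dT.sum(), x_col, create_graph=True)[0]
    loss_phys = (d2T ** 2).mean()            # residual of the heat equation
    (loss_data + loss_phys).backward()
    opt.step()
```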
Procedia PDF Downloads 53
3782 BFDD-S: Big Data Framework to Detect and Mitigate DDoS Attack in SDN Network
Authors: Amirreza Fazely Hamedani, Muzzamil Aziz, Philipp Wieder, Ramin Yahyapour
Abstract:
Software-defined networking has in recent years come into the sight of many network designers as a successor to traditional networking. Unlike traditional networks, where the control and data planes engage together within a single device in the network infrastructure, such as switches and routers, the two planes are kept separate in software-defined networks (SDNs). All critical decisions about packet routing are made by the network controller, and the data-plane devices forward the packets based on these decisions. This type of network is vulnerable to DDoS attacks, which degrade the overall functioning and performance of the network by continuously injecting fake flows into it. This places a substantial burden on the controller side and ultimately leads to the inaccessibility of the controller and the lack of network service for legitimate users. Thus, the protection of this novel network architecture against denial-of-service attacks is essential. In the world of cybersecurity, attacks and new threats emerge every day. It is essential to have tools capable of managing and analyzing all this new information to detect possible attacks in real time. These tools should provide a comprehensive solution to automatically detect, predict, and prevent abnormalities in the network. Big data encompasses a wide range of studies, but it mainly refers to the massive amounts of structured and unstructured data that organizations deal with on a regular basis. On the other hand, it concerns not only the volume of the data but also how data-driven information can be used to enhance decision-making processes, security, and the overall efficiency of a business. This paper presents an intelligent big data framework as a solution to handle the illegitimate traffic burden placed on the SDN network by numerous DDoS attacks. The framework entails an efficient defence and monitoring mechanism against DDoS attacks, employing state-of-the-art machine learning techniques.
Keywords: Apache Spark, Apache Kafka, big data, DDoS attack, machine learning, SDN network
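A hedged sketch of the detection stage alone follows. In the proposed framework the per-flow features would be streamed through Apache Kafka and computed in Apache Spark; here a small in-memory array stands in for that pipeline, scikit-learn stands in for the unspecified classifier, and the feature set is illustrative.

```python
# Detection-stage sketch: a supervised classifier over per-flow features.
# Features, labels, and the random-forest choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# columns: packets/s, bytes/s, flow duration (s), unique destination ports
X = np.array([
    [120,   9_000,   12.0,   3],   # benign
    [90,    7_500,   30.0,   2],   # benign
    [9_500, 400_000,  0.4, 180],   # DDoS-like burst of short fake flows
    [8_800, 350_000,  0.6, 150],   # DDoS-like burst
])
y = np.array([0, 0, 1, 1])         # 0 = legitimate, 1 = attack

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
new_flow = np.array([[9_000, 380_000, 0.5, 160]])
print("attack" if clf.predict(new_flow)[0] else "legitimate")
```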
Procedia PDF Downloads 169
3781 Hardware Implementation for the Contact Force Reconstruction in Tactile Sensor Arrays
Authors: María-Luisa Pinto-Salamanca, Wilson-Javier Pérez-Holguín
Abstract:
Reconstruction of contact forces is a fundamental technique for analyzing the properties of a touched object and is essential for regulating the grip force in slip control loops. It is based on processing the distribution, intensity, and direction of the forces captured by the sensors. Currently, efficient hardware alternatives are being used more frequently in different fields of application, allowing the implementation of computationally complex algorithms, as is the case with tactile signal processing. The use of hardware for smart tactile sensing systems is a research area that promises to improve the processing time and portability requirements of applications such as artificial skin and robotics, among others. The literature review shows that hardware implementations are present today in almost all stages of smart tactile detection systems except in the force reconstruction process, a stage in which they have been less applied. This work presents a hardware implementation of a model-driven method reported in the literature for the contact force reconstruction of flat and rigid tactile sensor arrays from normal stress data. Starting from the analysis of a software implementation of this model, the proposed implementation parallelizes the tasks that facilitate the execution of the matrix operations and of a two-dimensional optimization function used to obtain a force vector for each taxel in the array. This work seeks to take advantage of the parallel hardware characteristics of field-programmable gate arrays (FPGAs) and the possibility of applying appropriate techniques for algorithm parallelization, using as a guide the rules of generalization, efficiency, and scalability in the tactile decoding process and considering low latency, low power consumption, and real-time execution as the main design parameters. The results show a maximum estimation error of 32% in the tangential forces and 22% in the normal forces with respect to simulation by the finite element modeling (FEM) technique of Hertzian and non-Hertzian contact events, over sensor arrays of 10×10 taxels of different sizes. The hardware implementation was carried out on a Xilinx MPSoC XCZU9EG-2FFVB1156 platform, which allows the reconstruction of force vectors following a scalable approach, from the information captured by tactile sensor arrays composed of up to 48×48 taxels that use various transduction technologies. The proposed implementation reduces the estimation time to roughly 1/180 of that of the software implementations. Despite the relatively high values of the estimation errors, the information provided by this implementation on the tangential and normal tractions and the triaxial reconstruction of forces allows the tactile properties of the touched object to be adequately reconstructed, with results similar to those obtained in the software implementation and in the two FEM simulations taken as reference. Although the errors could be reduced, the proposed implementation is useful for decoding contact forces for portable tactile sensing systems, thus helping to expand electronic skin applications in robotic and biomedical contexts.
Keywords: contact forces reconstruction, forces estimation, tactile sensor array, hardware implementation
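A software reference for the computation the FPGA parallelizes can be sketched as a regularized linear least-squares solve mapping per-taxel force components to the measured normal-stress map. The transfer matrix here is random as a stand-in for the sensor model reported in the literature, and the shapes are illustrative.

```python
# Software reference sketch of the reconstruction: solve min ||A f - s||^2
# with Tikhonov regularization, where f stacks (fx, fy, fz) per taxel and s
# is the measured normal-stress map. A is a stand-in for the sensor model.
import numpy as np

n_taxels = 10 * 10
rng = np.random.default_rng(0)
A = rng.standard_normal((n_taxels, 3 * n_taxels))   # stress <- force components
f_true = rng.standard_normal(3 * n_taxels)
stress = A @ f_true + 0.01 * rng.standard_normal(n_taxels)

lam = 1e-3                                          # regularization weight
f_est = np.linalg.solve(A.T @ A + lam * np.eye(3 * n_taxels), A.T @ stress)
fx, fy, fz = f_est.reshape(3, n_taxels)             # one force vector per taxel
print("max |fz| estimate:", np.abs(fz).max())
```

The FPGA pipeline would evaluate the same matrix products in parallel, one two-dimensional optimization per taxel, which is what yields the reported speedup.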
Procedia PDF Downloads 195
3780 Performance of High Efficiency Video Codec over Wireless Channels
Authors: Mohd Ayyub Khan, Nadeem Akhtar
Abstract:
Due to recent advances in wireless communication technologies and hand-held devices, there is a huge demand for video-based applications such as video surveillance, video conferencing, remote surgery, Digital Video Broadcast (DVB), IPTV, online learning courses, YouTube, WhatsApp, Instagram, Facebook, and interactive video games. However, raw video requires very high bandwidth, which makes compression a must before its transmission over wireless channels. The High Efficiency Video Codec (HEVC), also called H.265, is the latest state-of-the-art video coding standard, developed by the joint effort of the ITU-T and ISO/IEC teams. HEVC is targeted at high-resolution videos, such as 4K or 8K resolutions, that can fulfil the recent demands for video services. The compression ratio achieved by HEVC is twice that of its predecessor H.264/AVC for the same quality level. Compression efficiency is generally increased by removing more correlation between frames/pixels using complex techniques such as extensive intra- and inter-prediction. As more correlation is removed, the chance of interdependency among coded bits increases. Thus, bit errors may have a large effect on the reconstructed video; sometimes even a single bit error can lead to catastrophic failure of the reconstructed video. In this paper, we study the performance of the HEVC bitstream over an additive white Gaussian noise (AWGN) channel. Moreover, HEVC over Quadrature Amplitude Modulation (QAM) combined with forward error correction (FEC) schemes is also explored over the noisy channel. The video is encoded using HEVC, and the coded bitstream is channel coded to provide some redundancy. The channel-coded bitstream is then modulated using QAM and transmitted over the AWGN channel. At the receiver, the symbols are demodulated and channel decoded to obtain the video bitstream, which is then used to reconstruct the video using the HEVC decoder. It is observed that as the signal-to-noise ratio of the channel decreases, the quality of the reconstructed video degrades drastically. Using proper FEC codes, the quality of the video can be restored up to a certain extent. Thus, the performance analysis of HEVC presented in this paper may assist in designing the optimized code rate of FEC such that the quality of the reconstructed video is maximized over wireless channels.
Keywords: AWGN, forward error correction, HEVC, video coding, QAM
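A minimal link-level sketch of the transmission chain follows: modulate a bitstream with QPSK (4-QAM), add white Gaussian noise at a chosen SNR, demodulate, and count bit errors. The FEC stage and the HEVC codec itself are omitted; in the full study the surviving bitstream would be channel decoded and fed to the HEVC decoder.

```python
# QPSK-over-AWGN sketch: bits -> Gray-mapped symbols -> noise -> decisions.
import numpy as np

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 100_000)
symbols = (1 - 2.0 * bits[0::2]) + 1j * (1 - 2.0 * bits[1::2])
symbols /= np.sqrt(2)                       # unit average symbol energy

snr_db = 6.0
noise_var = 10 ** (-snr_db / 10)
rx = symbols + np.sqrt(noise_var / 2) * (rng.standard_normal(symbols.size)
                                         + 1j * rng.standard_normal(symbols.size))

bits_hat = np.empty_like(bits)
bits_hat[0::2] = (rx.real < 0).astype(int)  # sign decision per rail
bits_hat[1::2] = (rx.imag < 0).astype(int)
print("BER at %.1f dB: %.4f" % (snr_db, np.mean(bits != bits_hat)))
```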
Procedia PDF Downloads 149
3779 Random Variation of Treated Volumes in Fractionated 2D Image Based HDR Brachytherapy for Cervical Cancer
Authors: R. Tudugala, B. M. A. I. Balasooriya, W. M. Ediri Arachchi, R. W. M. W. K. Rathnayake, T. D. Premaratna
Abstract:
Brachytherapy involves placing a source of radiation near the cancer site, which gives a promising prognosis for cervical cancer treatment. The purpose of this study was to evaluate the random variation of treated volumes between fractions in 2D image based fractionated high dose rate brachytherapy for cervical cancer at the National Cancer Institute Maharagama (NCIM), Sri Lanka. Dose plans were analyzed for 150 cervical cancer patients with orthogonal radiograph (2D) based brachytherapy. ICRU treated volumes were modeled by translating the applicators with the help of Multisource HDR plus software. The difference in treated volumes with respect to the applicator geometry was analyzed using SPSS 18 software, to derive patient population based estimates of delivered treated volumes relative to ideally treated volumes. Packing was evaluated according to bladder dose, rectum dose, and the geometry of the dose distribution by three consultant radiation oncologists. The difference in treated volumes depends on the type of applicator used in fractionated brachytherapy. The mean Difference of Treated Volume (DTV) was -0.48 cm³ for the evenly activated tandem length (ET) group and 11.85 cm³ for the unevenly activated tandem length (UET) group. The range of the DTV was 35.80 cm³ for the ET group, whereas it was 104.80 cm³ for the UET group. A one-sample t-test was performed to compare the DTV with the ideal treated volume difference (0.00 cm³); the p-value was 0.732 for the ET group and 0.00 for the UET group. Moreover, an independent two-sample t-test was performed to compare the ET and UET groups, and the calculated p-value was 0.005. Packing was evaluated under three categories: 59.38% used a convenient packing technique, 33.33% used a fairly convenient packing technique, and 7.29% used a non-convenient packing technique in their fractionated brachytherapy treatments. The random variation of treated volume in the ET group is much lower than in the UET group, and there is a significant difference (p<0.05) between the ET and UET groups, which affects the dose distribution of the treatment. Furthermore, it can be concluded that nearly 92.71% of patients' packing used an acceptable packing technique at NCIM, Sri Lanka.
Keywords: brachytherapy, cervical cancer, high dose rate, tandem, treated volumes
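The two significance tests reported can be sketched directly with scipy; the arrays below are illustrative stand-ins for the per-patient DTV values, not the study's data.

```python
# One-sample t-test of each group's DTV against the ideal 0.00 cm3, and an
# independent two-sample t-test between the ET and UET groups.
import numpy as np
from scipy import stats

dtv_et = np.array([-1.2, 0.4, -0.8, 1.5, -2.3, 0.1])    # evenly activated tandem
dtv_uet = np.array([8.0, 14.2, 10.9, 15.6, 9.8, 12.5])  # unevenly activated

t1, p1 = stats.ttest_1samp(dtv_et, popmean=0.0)
t2, p2 = stats.ttest_ind(dtv_et, dtv_uet)
print(f"ET vs ideal 0.00 cm3: p = {p1:.3f}")   # non-significant in the study
print(f"ET vs UET:            p = {p2:.3f}")   # significant (p < 0.05)
```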
Procedia PDF Downloads 201
3778 The Digital Transformation of Life Insurance Sales in Iran With the Emergence of Personal Financial Planning Robots; Opportunities and Challenges
Authors: Pedram Saadati, Zahra Nazari
Abstract:
Anticipating and identifying the future opportunities and challenges facing industry players arising from the emergence of new knowledge and technologies for personal financial planning, and providing practical solutions, is one of the goals of this research. For this purpose, a futures research tool based on gathering opinions from the main players in the insurance industry was used. The research method in this study consisted of four stages: 1) a survey of the specialist life insurance sales force in order to identify the variables; 2) ranking of the variables by experts selected through a researcher-made questionnaire; 3) holding a panel of experts with the aim of understanding the mutual effects of the variables; and 4) statistical analysis of the cross-impact matrix in MICMAC software. The integrated analysis of the variables influencing the future was done with the method of structural analysis, which is one of the efficient and innovative methods of futures research. A list of opportunities and challenges was identified through a survey of best-selling life insurance representatives, who were selected by snowball sampling. In order to prioritize and identify the most important issues, all the issues raised were sent to selected experts through a researcher-made questionnaire. The respondents rated the importance of 36 variables through scoring, so that the prioritization of the opportunity and challenge variables could be determined. Eight of the variables identified in the first stage were removed by the selected experts, leaving 28 variables to be examined in the third stage. To facilitate the examination, these were divided into six categories: organization and management (11 variables), marketing and sales (7), social and cultural (6), technological (2), rebranding (1), and insurance (1). The reliability of the researcher-made questionnaire was confirmed with a Cronbach's alpha of 0.96. In the third stage, a panel of five insurance industry experts was formed, and their consensus opinions about the influence of the factors on each other and the ranking of the variables were entered into the matrix. The matrix included the interrelationships of the 28 variables, which were investigated using the structural analysis method. From the analysis of the matrix data in MICMAC software, the findings of the research indicate that the variables of "correct training in the use of the software", "the weakness of insurance companies' technology in personalizing products", "adopting a customer-equipping approach", and "honesty in declaring when a customer does not need insurance" are the most important influencing challenges, while the variables of "a salesforce-equipping approach", "product personalization based on customer needs assessment", "the customer's pleasant experience of being advised by consulting robots", "business improvement of the insurance company due to the use of these tools", "increased efficiency of the issuance process", and "optimal customer purchase" were identified as the most important influencing opportunities.
Keywords: personal financial planning, wealth management, advisor robots, life insurance, digital transformation
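The MICMAC step reduces, at its simplest, to row and column sums of the cross-impact matrix: direct influence is how strongly a variable drives the others, and dependence is how strongly it is driven. The sketch below is a schematic of that computation on a tiny stand-in matrix, not the study's 28×28 data.

```python
# Schematic structural analysis: m[i][j] = expert-rated influence of variable
# i on variable j; influence = row sum, dependence = column sum.
import numpy as np

m = np.array([[0, 3, 1],
              [2, 0, 0],
              [1, 2, 0]])          # tiny stand-in for the 28 x 28 matrix
names = ["training", "personalization", "customer_trust"]

influence = m.sum(axis=1)          # how strongly each variable drives the system
dependence = m.sum(axis=0)         # how strongly it is driven by the others
for n, inf, dep in sorted(zip(names, influence, dependence), key=lambda t: -t[1]):
    print(f"{n:16s} influence={inf} dependence={dep}")
```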
Procedia PDF Downloads 46
3777 Clinicians’ Experiences with IT Systems in a UK District General Hospital: A Qualitative Analysis
Authors: Sunny Deo, Eve Barnes, Peter Arnold-Smith
Abstract:
Introduction: Healthcare technology is a rapidly expanding field, with enthusiasts suggesting a revolution in the quality and efficiency of healthcare delivery based on the utilisation of better e-healthcare, including the move to paperless healthcare. The role and use of computers and programmes for healthcare have been increasing over the past 50 years. Despite this, there is no standardised method of assessing the quality of the hardware and software utilised by frontline healthcare workers. Methods and subjects: Based on standard patient-related outcome measures, a questionnaire was devised with the aim of providing quantitative and qualitative data on clinicians’ perspectives of their hospital’s information technology (IT). The survey was distributed via the institution’s intranet to all contracted doctors, and the survey’s qualitative results were analysed. Qualitative opinions were grouped as positive, neutral, or negative and further sub-grouped into speed/usability, software/hardware, integration, IT staffing, clinical risk, and wellbeing. Analysis was undertaken on the basis of doctor seniority and by specialty. Results: There were 196 responses, with 51% from senior doctors (consultant grades) and the rest from junior grades; the largest group of respondents (52%) came from medicine specialties. Differences in the proportions of the principal groups and sub-groups were noted by seniority and specialty. Negative themes were by far the commonest opinion type, occurring in almost two-thirds of responses (63%), while positive comments occurred in fewer than 1 in 10 (8%). Conclusions: This survey confirms strongly negative attitudes to the current state of electronic documentation and IT in a large single-centre cohort of hospital-based frontline physicians after two decades of so-called progress towards a paperless healthcare system. Wider use of such surveys would provide further insights and potentially optimise the focus of development and delivery to improve the quality and effectiveness of IT for clinicians and their patients.
Keywords: information technology, electronic patient records, digitisation, paperless healthcare
Procedia PDF Downloads 92
3776 Numerical Simulation of Production of Microspheres from Polymer Emulsion in Microfluidic Device toward Using in Drug Delivery Systems
Authors: Nizar Jawad Hadi, Sajad Abd Alabbas
Abstract:
Because of their ability to encapsulate and release drugs in a controlled manner, microspheres fabricated from polymer emulsions using microfluidic devices have shown promise for drug delivery applications. In this study, the effects of velocity, density, viscosity, and surface tension, as well as channel diameter, on microsphere generation were investigated using the Ansys Fluent software. The software was programmed with the physical properties of the polymer emulsion, such as density, viscosity, and surface tension. Simulations were then performed to predict fluid flow and microsphere production and to improve the design for drug delivery applications based on changes in these parameters. The effects of the capillary and Weber numbers were also studied. The results of the study showed that the size of the microspheres can be controlled by adjusting the velocity and the diameter of the channel. Narrower channel widths and higher flow rates produced smaller microspheres, which could improve drug delivery efficiency, and lower interfacial surface tension likewise produced smaller microspheres. The viscosity and density of the polymer emulsion significantly affected the size of the microspheres, with higher viscosities and densities producing smaller microspheres. The loading and drug release properties of the microspheres created with the microfluidic technique were also predicted. The results showed that the microspheres can efficiently encapsulate drugs and release them in a controlled manner over a period of time. This is due to the high surface-area-to-volume ratio of the microspheres, which allows for efficient drug diffusion. The ability to tune the manufacturing process using factors such as velocity, density, viscosity, channel diameter, and surface tension offers a potential opportunity to design drug delivery systems with greater efficiency and fewer side effects.
Keywords: polymer emulsion, microspheres, numerical simulation, microfluidic device
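The two dimensionless groups the study examines can be computed directly: the capillary number Ca = μv/σ and the Weber number We = ρv²d/σ. The property values below are plausible placeholders for a polymer emulsion, not the simulated case, and the regime comments are order-of-magnitude only.

```python
# Worked example of the capillary and Weber numbers for a microfluidic channel.
mu = 0.05        # continuous-phase viscosity, Pa.s   (placeholder value)
rho = 1000.0     # density, kg/m3                     (placeholder value)
sigma = 0.01     # interfacial tension, N/m           (placeholder value)
v = 0.02         # channel velocity, m/s
d = 100e-6       # channel diameter, m

Ca = mu * v / sigma
We = rho * v**2 * d / sigma
print(f"Ca = {Ca:.3f}, We = {We:.5f}")
# Ca << 1 (squeezing regime): droplet size set mainly by channel geometry;
# higher Ca (higher v or mu, lower sigma) shifts toward smaller droplets.
```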
Procedia PDF Downloads 65
3775 Dynamic Web-Based 2D Medical Image Visualization and Processing Software
Authors: Abdelhalim. N. Mohammed, Mohammed. Y. Esmail
Abstract:
In recent decades, medical imaging has been dominated by the use of costly film media for the review and archiving of medical investigations; however, due to developments in network technologies and the common acceptance of the Digital Imaging and Communications in Medicine (DICOM) standard, another approach based on the World Wide Web has been produced. Web technologies have been successfully used in telemedicine applications, and the combination of web technologies with DICOM was used to design a web-based, open-source DICOM viewer. The web server allows the query and retrieval of images, and the images are viewed and manipulated inside a web browser without the need to pre-install any software. The dynamic page for medical image visualization and processing was created using JavaScript and HTML5 technologies. XAMPP (Apache server) is used to create a local web server for testing and deployment of the dynamic site. The web-based viewer is connected to multiple devices through a local area network (LAN) to distribute the images inside healthcare facilities. The system offers several advantages over ordinary picture archiving and communication systems (PACS): it is easy to install and maintain, it is platform-independent, it allows images to be displayed and manipulated efficiently, and it is user-friendly and easy to integrate with existing systems that already make use of web technologies. A wavelet-based image compression technique is applied, in which the 2-D discrete wavelet transform is used to decompose the image; the wavelet coefficients are then thresholded and transmitted with entropy encoding to decrease transmission time, storage cost, and capacity. The performance of the compression was estimated using image quality metrics such as the mean square error (MSE), peak signal-to-noise ratio (PSNR), and compression ratio (CR), with a compression ratio of 83.86% achieved when the ‘coif3’ wavelet filter was used.
Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN
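The compression path can be sketched with the PyWavelets library: a 2-D DWT with the 'coif3' filter, hard-thresholding of the detail coefficients, reconstruction, and the MSE/PSNR metrics. PyWavelets is an assumption for illustration (the paper does not name its implementation environment), and a random array stands in for a DICOM pixel matrix.

```python
# 'coif3' 2-D DWT compression sketch with thresholded detail coefficients.
import numpy as np
import pywt

image = np.random.default_rng(0).uniform(0, 255, (256, 256))

coeffs = pywt.wavedec2(image, "coif3", level=3)
thr = 20.0
coeffs = [coeffs[0]] + [
    tuple(pywt.threshold(band, thr, mode="hard") for band in detail)
    for detail in coeffs[1:]
]
recon = pywt.waverec2(coeffs, "coif3")[: image.shape[0], : image.shape[1]]

mse = np.mean((image - recon) ** 2)
psnr = 10 * np.log10(255.0**2 / mse)
print(f"MSE = {mse:.2f}, PSNR = {psnr:.2f} dB")
```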
Procedia PDF Downloads 160
3774 A Simple User Administration View of Computing Clusters
Authors: Valeria M. Bastos, Myrian A. Costa, Matheus Ambrozio, Nelson F. F. Ebecken
Abstract:
In this paper, a very simple and effective user administration view of computing cluster systems is implemented in order to provide user-friendly configuration and monitoring of distributed application executions. The user view, the administrator view, and an internal control module create an illusionary management environment for better system usability. The architecture, properties, and performance of the system, and its comparison with other software for cluster management, are briefly discussed.
Keywords: big data, computing clusters, administration view, user view
Procedia PDF Downloads 331
3773 The Lived Experience of Caregiving as a Vulnerable Person: Preliminary Findings of an Applied Hermeneutic Phenomenology Study
Authors: Amanda Aliende da Matta
Abstract:
In different fields, there are people who have something that stands out. In the educational world, for example, it is clear when some teachers have something: they are the best teachers, but this is not directly attributable to their disciplines, methodologies, etc. It is that they have something that captivates, inspires, and motivates. But we also find this something in other contexts. In this thesis, the interest is in something that some marginalized people, such as Ab (a fictitious name), have. Ab was born in a rural community and saw the lifestyle of his family change drastically as a consequence of structural changes in his village. The community became impoverished, and together with a group of teenagers, he decided to migrate to Spain in search of opportunities. His best friend drowned during the crossing. After arriving, he lived in indecent conditions and felt unsafe. He now suffers from anxiety and frequently faints from it. Yet he is linked to Joves x la pau (a Christian project, although he is a Muslim), distributing food to people who live on the streets every Thursday afternoon. When asked about what happens on cold and rainy days, he explained simply: "if it rains, I distribute the food, and immediately I get home, take a bath, and sleep warm under my roof. That is when we most have to go." This something he has will be called caring. One of the general objectives of the thesis is to discover what the meaning structures of this caring are, what the lived experience of this caring is. In this communication, preliminary results of an applied hermeneutic phenomenology (AHP) study on the lived experience of caring as a vulnerable person are presented. The research aims to answer what the lived experience of caring as a vulnerable person is; that is, to describe and explain what it is like to caregive for a vulnerable person, what it is, essentially, to caregive for a vulnerable person, and what makes the lived experience of caregiving for a vulnerable person different from any other. In order to investigate the meaning of the phenomenon of caregiving as a vulnerable person, as already stated, the method used is applied hermeneutic phenomenology (AHP). We base ourselves, initially, on the proposal of Raquel Ayala-Carabajo and Max van Manen. As van Manen (1990) explains, AHP is a method that works essentially through fieldwork, with the collection of data on lived experience (experiential material); it is a phenomenology of practice. Here we present the provisional themes we found: caregiving as a vulnerable person is seeing yourself in the other, identifying with the care-receiver; caregiving as a vulnerable person is putting the other's need before one's own; caregiving as a vulnerable person is temporarily overcoming your weaknesses to make yourself strong for the other; caregiving as a vulnerable person is going beyond the conventional approach; and caregiving as a vulnerable person is taking responsibility even if it is not yours.
Keywords: applied hermeneutic phenomenology, care ethics, hermeneutics, phenomenology
Procedia PDF Downloads 93
3772 A Research on Determining the Viability of a Job Board Website for Refugees in Kenya
Authors: Prince Mugoya, Collins Oduor Ondiek, Patrick Kanyi Wamuyu
Abstract:
The Refugee Job Board Website is a web-based application that provides a platform for organizations to post jobs specifically for refugees. Organizations upload job opportunities, and refugees can view them on the website. The website also allows refugees to input their skills and qualifications. The methodology used to develop this system is the waterfall (traditional) methodology. The software development tools include Brackets, used to code the website, and phpMyAdmin, used to manage the database in which all the data is stored.
Keywords: information technology, refugee, skills, utilization, economy, jobs
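The deployed system uses PHP with a MySQL database managed through phpMyAdmin, but the underlying two-table design can be sketched independently of the stack. The sketch below uses sqlite3 for self-containment; the table and column names are hypothetical.

```python
# Illustrative data model only: jobs posted by organizations, matched against
# refugee skill lists. Schema and matching rule are assumptions.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE refugees (id INTEGER PRIMARY KEY, name TEXT, skills TEXT);
CREATE TABLE jobs (id INTEGER PRIMARY KEY, org TEXT, title TEXT,
                   required_skill TEXT);
""")
db.execute("INSERT INTO refugees (name, skills) VALUES (?, ?)",
           ("A. Example", "carpentry,tailoring"))
db.execute("INSERT INTO jobs (org, title, required_skill) VALUES (?, ?, ?)",
           ("NGO Kenya", "Workshop assistant", "carpentry"))

# Match jobs to refugees whose skill list contains the required skill.
rows = db.execute("""SELECT j.title, j.org FROM jobs j, refugees r
                     WHERE instr(r.skills, j.required_skill) > 0""").fetchall()
print(rows)
```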
Procedia PDF Downloads 165
3771 Precise CNC Machine for Multi-Tasking
Authors: Haroon Jan Khan, Xian-Feng Xu, Syed Nasir Shah, Anooshay Niazi
Abstract:
CNC machines are not only used on a large scale but have also become a prominent necessity among households and smaller businesses. Printed circuit boards manufactured by the chemical etching process are not only risky and unsafe but also expensive and time-consuming to produce. A 3-axis precision CNC machine has been developed that not only fabricates PCBs but has also been used for multiple tasks, simply by changing the materials and tools used, making it versatile. The machine takes its data from CAM software, and a TB-6560 controller is used to drive motion along the X, Y, and Z axes. The machine is efficient in automatic drilling, engraving, and cutting.
Keywords: CNC, G-code, CAD, CAM, Proteus, FlatCAM, Easel
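The CAM step the machine consumes can be sketched as a small G-code generator. Feed rates, depths, and the exact dialect expected by the TB-6560-driven controller are assumptions; real toolpaths would come from FlatCAM or Easel as the keywords indicate.

```python
# Sketch: emit G-code for a grid of PCB drill hits (illustrative dialect).
def drill_grid(rows, cols, pitch=2.54, depth=-1.6, safe_z=2.0, feed=60):
    g = ["G21 ; millimetres", "G90 ; absolute coordinates", f"G0 Z{safe_z}"]
    for r in range(rows):
        for c in range(cols):
            x, y = c * pitch, r * pitch
            g += [f"G0 X{x:.2f} Y{y:.2f}",
                  f"G1 Z{depth:.2f} F{feed}",   # plunge through the board
                  f"G0 Z{safe_z}"]              # retract to safe height
    g.append("M2 ; program end")
    return "\n".join(g)

print(drill_grid(2, 3))
```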
Procedia PDF Downloads 160
3770 Numerical Modelling of Prestressed Geogrid Reinforced Soil System
Authors: Soukat Kumar Das
Abstract:
Rapid industrialization and population growth have resulted in a scarcity of suitable ground conditions. This has driven the need for ground improvement by means of reinforcement with geosynthetics, with the minimum possible settlement and the maximum possible safety. Prestressing the geosynthetics offers an economical yet safe method of achieving this goal. The commercially available software PLAXIS 3D has made the analysis of prestressed geosynthetics simpler, with practical simulations of the ground. In this work, numerical analysis in PLAXIS 3D is used to study the effect of prestressing the geogrid, and of footing interference, on the load-bearing capacity and settlement characteristics of unreinforced (UR), geogrid-reinforced (GR), and prestressed geogrid-reinforced (PGR) soil. The results of the numerical analysis have been validated and compared with those given in the referenced paper and found to be in very good agreement with the actual field values, with very small variation. The GR soil improves the bearing pressure by 240%, whereas the PGR soil improves it by almost 500% for 1 mm settlement; in fact, the PGR soil enhances the bearing pressure of the GR soil by almost 200%. The settlement reduction is also very significant: for a bearing pressure of 100 kPa, the settlement reduction of the PGR soil is about 88% with respect to the UR soil, and up to 67% with respect to the GR soil. The prestressing force results in an enhanced reinforcement mechanism and, hence, an increased bearing pressure. The deformation at the geogrid layer is 13.62 mm for the GR soil, whereas it decreases to a mere 3.5 mm for the PGR soil, which clearly demonstrates the effect of prestressing on the geogrid layer. The improvement factor, conventionally known as the bearing capacity ratio (BCR), which quantifies the improvement of the PGR and GR soil with respect to the UR soil at different settlements, varies in the range of 1.66-2.40 for the GR soil and between 3.58 and 5.12 for the PGR soil with respect to the UR soil in the present analysis. The effect of prestressing was also observed in the case of two interfering square footings. The centre-to-centre distance between the two footings (SFD) was taken as B, 1.5B, 2B, 2.5B, and 3B, where B is the width of the footing. It was found that for the UR soil the interference effect on the bearing pressure extended up to a spacing of 1.5B, after which it remained almost the same. For the GR soil, the zone of influence rose to 2B, and for the PGR soil it went up to 2.5B. The zone of interference for the PGR soil is thus 67% larger than for the unreinforced (UR) soil and almost 25% larger than for the GR soil.
Keywords: bearing, geogrid, prestressed, reinforced
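The bearing capacity ratio used above is a direct quotient, which a short worked example makes explicit. The pressure values below are chosen to match the percentage improvements quoted in the abstract, purely for illustration.

```python
# Worked example: BCR = bearing pressure of the improved case / bearing
# pressure of the unreinforced case, at the same settlement (here 1 mm).
def bcr(q_improved, q_unreinforced):
    return q_improved / q_unreinforced

q_ur, q_gr, q_pgr = 100.0, 240.0, 500.0   # kPa, illustrative values from the text
print(f"BCR, GR  vs UR: {bcr(q_gr, q_ur):.2f}")    # ~2.4, within the 1.66-2.40 range
print(f"BCR, PGR vs UR: {bcr(q_pgr, q_ur):.2f}")   # ~5.0, within the 3.58-5.12 range
print(f"BCR, PGR vs GR: {bcr(q_pgr, q_gr):.2f}")   # ~2x gain from prestressing
```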
Procedia PDF Downloads 402
3769 Transforming Data Science Curriculum Through Design Thinking
Authors: Samar Swaid
Abstract:
Today, corporations are moving toward the adoption of design-thinking techniques to develop products and services, putting the consumer at the heart of the development process. One of the leading companies in design thinking, IDEO (Innovation, Design, Engineering Organization), defines design thinking as an approach to problem-solving that relies on a set of multi-layered skills, processes, and mindsets that help people generate novel solutions to problems. Design thinking may result in new ideas, narratives, objects, or systems. It is about redesigning systems, organizations, infrastructures, processes, and solutions in an innovative fashion based on users' feedback. Tim Brown, president and CEO of IDEO, sees design thinking as a human-centered approach that draws from the designer's toolkit to integrate people's needs, innovative technologies, and business requirements. The application of design thinking has been witnessed to be the road to developing innovative applications, interactive systems, scientific software, and healthcare applications, and even to rethinking business operations, as in the case of Airbnb. Recently, there has been a movement to apply design thinking to machine learning and artificial intelligence to ensure the "wow" effect on consumers. The Association for Computing Machinery task force on the data science program states that "Data scientists should be able to implement and understand algorithms for data collection and analysis. They should understand the time and space considerations of algorithms. They should follow good design principles developing software, understanding the importance of those principles for testability and maintainability." However, this definition hides the user behind the machine who works on data preparation, algorithm selection, and model interpretation. Thus, the data science program includes design thinking to ensure that user demands are met, that more usable machine learning tools are generated, and that ways of framing computational thinking are developed. Here, we describe the fundamentals of design thinking and teaching modules for data science programs.
Keywords: data science, design thinking, AI, curriculum, transformation
Procedia PDF Downloads 81
3768 Using Computer Simulations to Prepare Teachers
Authors: Roberta Gentry
Abstract:
The presentation will begin with a brief review of the literature on the use of computer simulation in teacher education programs, and this information will be summarized. Additionally, based on the literature review, the advantages and disadvantages of using computer simulation in higher education will be shared. Finally, a study in which computer simulation software was used with 50 initial-licensure teacher candidates in both an introductory course and a behavior management course will be shared. Candidates reflected on their experiences with using computer simulation, and the instructor of the course will also share lessons learned.
Keywords: simulations, teacher education, teacher preparation, educational research
Procedia PDF Downloads 650
3767 Integration of Agile Philosophy and Scrum Framework to Missile System Design Processes
Authors: Misra Ayse Adsiz, Selim Selvi
Abstract:
In today's world, technology is racing against time. In order to catch up with the world's leading companies and adapt quickly to change, it is necessary to speed up processes and keep pace with the rate of technological change. Missile system design processes handled with classical methods fall behind in this race, because customer requirements are not clear and demands change again and again during the design process. Therefore, a methodology suitable for the dynamics of missile system design has been investigated, and the processes used to keep up with the era have been examined. When commonly used design processes are analyzed, it is seen that none of them is dynamic enough for today's conditions, so a hybrid design process was established. After a detailed review of the existing processes, it was decided to focus on the Scrum framework and agile philosophy. Scrum is a process framework focused on developing software and handling change management with rapid methods. In addition, agile philosophy is intended to respond quickly to changes. This study aims to integrate the Scrum framework and agile philosophy, which are the most appropriate ways to achieve rapid production and adaptation to change, into the missile system design process. With this approach, the design team involved in the system design process stays in communication with the customer and follows an iterative approach to change management. These methods, which are currently used in the software industry, have been integrated into the product design process. A team was created for the system design process, and the Scrum team roles, consisting of the product owner, the development team, and the scrum master, were realized with the customer included. Scrum events, which are short, purposeful, and time-limited, are organized to serve coordination rather than long meetings. Instead of the classic system design methods used in product development studies, a missile design was carried out with this blended method. With the help of this design approach, it becomes easier to anticipate changing customer demands, produce quick solutions to those demands, and combat uncertainties in the product development process. With the feedback of the customer, who is included in the process, the work progresses toward marketing, design, and financial optimization.
Keywords: agile, design, missile, scrum
Procedia PDF Downloads 168
3766 Measurement and Modelling of HIV Epidemic among High Risk Groups and Migrants in Two Districts of Maharashtra, India: An Application of Forecasting Software-Spectrum
Authors: Sukhvinder Kaur, Ashok Agarwal
Abstract:
Background: In 2009, for the first time, India was able to generate estimates of HIV incidence (the number of new HIV infections per year). Analysis of epidemic projections revealed that the number of new annual HIV infections in India had declined by more than 50% during the preceding decade (GOI Ministry of Health and Family Welfare, 2010). The National AIDS Control Organisation (NACO) then planned to scale up its efforts in generating projections through epidemiological analysis and modelling, drawing on recently available sources of evidence such as HIV Sentinel Surveillance (HSS), India Census data, and other critical data sets. NACO generated the current round of HIV estimates (2012) using the globally recommended Spectrum software, producing estimates of adult HIV prevalence, annual new infections, the number of people living with HIV, AIDS-related deaths, and treatment needs. The state-level prevalence and incidence projections produced were used to project the consequences of the epidemic in Spectrum. With HIV estimates generated at the state level in India by NACO, the USAID-funded PIPPSE project, under the leadership of NACO, undertook estimations and projections at the district level using the same Spectrum software. In 2011, adult HIV prevalence in Maharashtra, one of the high-prevalence states, was 0.42%, ahead of the national average of 0.27%. Considering the heterogeneity of the HIV epidemic between districts, two districts of Maharashtra, Thane and Mumbai, were selected to estimate and project the number of people living with HIV/AIDS (PLHIV), HIV prevalence among adults, and annual new HIV infections up to 2017. Methodology: Inputs to Spectrum included demographic data from the Census of India since 1980 and the Sample Registration System; programmatic data on 'alive and on ART (adults and children)', 'mother-baby pairs under PPTCT', and 'high-risk group (HRG) size mapping estimates'; and surveillance data from various rounds of HSS, the National Family Health Survey-III, the Integrated Biological and Behavioural Assessment, and Behavioural Sentinel Surveillance. Major findings: Assuming current programmatic interventions in these districts, an estimated decrease of 12 percentage points in Thane and 31 percentage points in Mumbai in new infections among HRGs and migrants is projected from 2011 to 2017. Conclusions: The project also validated the decrease in new HIV infections among one of the high-risk groups, female sex workers (FSWs), using programme cohort data from 2012 to 2016. Though there is a decrease in HIV prevalence and new infections in Thane and Mumbai, a further decrease is possible if appropriate programme responses, strategies, and interventions are envisaged for specific target groups based on this evidence. Moreover, the evidence needs to be validated by other estimation/modelling techniques, and evidence can be generated for other districts of the state, where HIV prevalence is high and reliable data sources are available, to understand the epidemic within the local context.
Keywords: HIV sentinel surveillance, high risk groups, projections, new infections
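For intuition only, the shape of the projected trend can be sketched as a constant proportional decline in annual new infections. Spectrum's internal model is far richer (demography, HSS calibration, treatment effects), so this is a toy extrapolation with a hypothetical base-year count, not the project's method.

```python
# Toy projection: extrapolate annual new infections under a constant
# proportional decline. A ~12% total decline over 2011-2017 corresponds to
# roughly -2.1% per year: (1 + r)**6 = 0.88 => r ~ -0.021.
def project_new_infections(n0, annual_change, years):
    """n0: new infections in the base year; annual_change: e.g. -0.021."""
    series = [n0]
    for _ in range(years):
        series.append(series[-1] * (1 + annual_change))
    return series

thane_2011 = 1000                       # hypothetical base-year count
for year, n in zip(range(2011, 2018),
                   project_new_infections(thane_2011, -0.021, 6)):
    print(year, round(n))
```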
Procedia PDF Downloads 211
3765 A Four-Step Ortho-Rectification Procedure for Geo-Referencing Video Streams from a Low-Cost UAV
Authors: B. O. Olawale, C. R. Chatwin, R. C. D. Young, P. M. Birch, F. O. Faithpraise, A. O. Olukiran
Abstract:
Ortho-rectification is the process of geometrically correcting an aerial image such that the scale is uniform. The ortho-image formed by the process is corrected for lens distortion, topographic relief, and camera tilt. It can be used to measure true distances, because it is an accurate representation of the Earth's surface. Ortho-rectification and geo-referencing are essential to pinpoint the exact location of targets in video imagery acquired from the UAV platform. This can only be achieved by comparing such video imagery with an existing digital map; however, it is only when the image is ortho-rectified into the same coordinate system as the existing map that such a comparison is possible. The video image sequences from the UAV platform must be geo-registered, that is, each video frame must carry the necessary camera information, before performing the ortho-rectification process. Each rectified image frame can then be mosaicked together to form a seamless image map covering the selected area, which can be compared with an existing map for geo-referencing. In this paper, we present a four-step ortho-rectification procedure for real-time geo-referencing of video data from a low-cost UAV equipped with a multi-sensor system. The basic procedures for the real-time ortho-rectification are: (1) decompilation of the video stream into individual frames; (2) finding the interior camera orientation parameters; (3) finding the relative exterior orientation parameters of the video frames with respect to each other; and (4) finding the absolute exterior orientation parameters, using a self-calibration adjustment with the aid of a mathematical model. Each ortho-rectified video frame is then mosaicked together to produce a 2-D planimetric map, which can be compared with a well-referenced existing digital map for the purposes of geo-referencing and aerial surveillance. A test field located in Abuja, Nigeria was used to test our method: fifteen minutes of video and telemetry data were collected using the UAV, and the collected data were processed using the four-step ortho-rectification procedure. The results demonstrated that geometric measurements of the control field from the ortho-images are more reliable than those from the original perspective photographs when used to pinpoint the exact location of targets in the video imagery acquired by the UAV. The 2-D planimetric accuracy, when compared with six control points measured by a GPS receiver, is between 3 and 5 meters.
Keywords: geo-referencing, ortho-rectification, video frame, self-calibration
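Once the interior and exterior orientation parameters are known and the terrain is assumed locally flat, each frame maps to the ground plane through a 3×3 homography, which is a useful simplification of the full procedure (real ortho-rectification also corrects lens distortion and relief displacement). The matrix below is a made-up example for illustration.

```python
# Simplified geo-registration sketch: project frame pixels onto the ground
# plane via a planar homography (flat-terrain assumption; H is hypothetical).
import numpy as np

H = np.array([[1.02,  0.05, 12.0],   # frame (col, row) -> ground (east, north)
              [-0.04, 0.98,  7.5],
              [1e-5,  2e-5,  1.0]])

def to_ground(u, v):
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w               # homogeneous -> Cartesian coordinates

# Corners of a 1280x720 frame, projected onto the ground coordinate system.
for u, v in [(0, 0), (1279, 0), (1279, 719), (0, 719)]:
    print((u, v), "->", tuple(round(c, 2) for c in to_ground(u, v)))
```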
Procedia PDF Downloads 478
3764 Numerical Simulation of Precast Concrete Panels for Airfield Pavement
Authors: Josef Novák, Alena Kohoutková, Vladimír Křístek, Jan Vodička
Abstract:
Numerical analysis software packages are among the main tools for simulating the real behavior of various concrete structures and elements. In comparison with experimental tests, they offer an affordable way to study the mechanical behavior of structures under various conditions. This contribution deals with a precast element of an innovative airfield pavement system which is being developed within an ongoing scientific project. The proposed system consists of a two-layer surface course of precast concrete panels positioned on a two-layer base of fiber-reinforced concrete with recycled aggregate. As the panels are supposed to be installed directly on the hardened base course, imperfections at the interface between the base course and the surface course are expected. Considering such circumstances, three different behavior patterns could be established and considered when designing the precast element. The enormous cost of full-scale experiments makes it necessary to simulate the behavior of the element in numerical analysis software using the finite element method. The simulation was conducted on a nonlinear model in order to obtain results that could fully substitute for results from experiments. First, several loading schemes were considered, with the aim of identifying the critical one, which was then used for the simulation. The main objective of the simulation was to optimize the reinforcement of the element subject to quasi-static loading from airplanes. Several parameters were considered when running the simulation, namely geometrical imperfections, manufacturing imperfections, the stress state in the reinforcement, the stress state in the concrete, and the crack width. The numerical simulation revealed that the precast element should be heavily reinforced to fulfill all the assumed demands. The main cause of the high amount of reinforcement is the size of the imperfections which could occur in a real structure. Improving the manufacturing quality, installing the precast panels on a fresh base course, or using a bedding layer underneath the surface course are the main steps by which to reduce the size of the imperfections and consequently lower the consumption of reinforcement.
Keywords: nonlinear analysis, numerical simulation, precast concrete, pavement
Procedia PDF Downloads 256
3763 The Relationship between Personal, Psycho-Social and Occupational Risk Factors with Low Back Pain Severity in Industrial Workers
Authors: Omid Giahi, Ebrahim Darvishi, Mahdi Akbarzadeh
Abstract:
Introduction: Occupational low back pain (LBP) is one of the most prevalent work-related musculoskeletal disorders, and many risk factors are involved in it. The present study focuses on the relationship between personal, psycho-social, and occupational risk factors and LBP severity in industrial workers. Materials and Methods: This research was a case-control study conducted in Kurdistan province. 100 workers (mean age ± SD of 39.9 ± 10.45 years) with LBP were selected as the case group, and 100 workers (mean age ± SD of 37.2 ± 8.5 years) without LBP were assigned to the control group. All participants were selected from various industrial units and had similar occupational conditions. The required data, including demographic information (BMI, smoking, alcohol, and family history), occupational factors (posture, mental workload (MWL), force, vibration, and repetition), and psychosocial factors (stress, occupational satisfaction, and security), were collected via consultation with occupational medicine specialists, interviews, and the related questionnaires, as well as the NASA-TLX software and the REBA worksheet. The chi-square test, logistic regression, and structural equation modeling (SEM) were used to analyze the data, with IBM SPSS Statistics 24 and Mplus 6 software. Results: 114 (57%) of the individuals were male and 86 (43%) were female. The mean career lengths of the case group and control group were 10.90 ± 5.92 and 9.22 ± 4.24 years, respectively. The statistical analysis revealed a significant correlation of posture, smoking, stress, satisfaction, and MWL with occupational LBP. The odds ratios (95% confidence intervals) derived from a logistic regression model were 2.7 (1.27-2.24), 2.5 (2.26-5.17), and 3.22 (2.47-3.24) for stress, MWL, and posture, respectively. Also, the SEM analysis of the personal, psycho-social, and occupational factors with LBP revealed a significant correlation. Conclusion: All three broad categories of risk factors simultaneously increase the risk of occupational LBP in the workplace, but posture, stress, and MWL play a major role in LBP severity. Therefore, prevention strategies for persons in jobs with a high risk of LBP are required to decrease the risk of occupational LBP.
Keywords: industrial workers, occupational low back pain, occupational risk factors, psychosocial factors
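The odds-ratio computation above can be sketched with a binary logistic regression in which the ORs and confidence intervals are exponentials of the fitted coefficients. The study used SPSS and Mplus; statsmodels and the synthetic data below are stand-ins for illustration only.

```python
# Sketch: logistic regression of LBP status on posture, stress, and MWL,
# reporting odds ratios with 95% CIs as exp() of the coefficients.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
X = rng.standard_normal((n, 3))             # posture (REBA), stress, MWL scores
logit_p = 0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.6 * X[:, 2]
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit_p))

model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
or_ci = np.exp(model.conf_int())            # CI bounds on the OR scale
for name, or_, (lo, hi) in zip(["posture", "stress", "MWL"],
                               np.exp(model.params[1:]), or_ci[1:]):
    print(f"{name:8s} OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```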
Procedia PDF Downloads 258
3762 Enhanced CNN for Rice Leaf Disease Classification in Mobile Applications
Authors: Kayne Uriel K. Rodrigo, Jerriane Hillary Heart S. Marcial, Samuel C. Brillo
Abstract:
Rice leaf diseases significantly impact yield in rice-dependent countries, affecting their agricultural sectors. As part of precision agriculture, early and accurate detection of these diseases is crucial for effective mitigation practices and minimizing crop losses. Hence, this study proposes an enhancement to the convolutional neural network (CNN), a widely used method for rice leaf disease image classification, by incorporating MobileViTV2, a recently advanced architecture that combines CNN and vision transformer models while maintaining fewer parameters, making it suitable for broader deployment on edge devices. Our methodology utilizes a publicly available rice disease image dataset from Kaggle, which was validated by a university structural biologist following the guidelines provided by the Philippine Rice Institute (PhilRice). Modifications to the dataset include renaming certain disease categories and augmenting the rice leaf image data through rotation, scaling, and flipping. The enhanced dataset was then used to train the MobileViTV2 model using the timm library. The results of our approach are as follows: the model achieved notable performance, with 98% accuracy in both training and validation, 6% training and validation loss, and an area under the Receiver Operating Characteristic (ROC) curve ranging from 95% to 100% for each label. Additionally, the F1 score was 97%. These metrics demonstrate a significant improvement compared to a conventional CNN-based approach, which, in a previous 2022 study, achieved only 78% accuracy using 5 convolutional layers and 2 dense layers. Thus, it can be concluded that MobileViTV2, with its fewer parameters, outperforms traditional CNN models, particularly when applied to rice leaf disease image identification. For future work, we recommend extending this model to datasets validated by international rice experts and broadening the scope to accommodate biotic factors such as rice pest classification, as well as abiotic stressors such as climate, soil quality, and geographic information, which could improve the accuracy of disease prediction.
Keywords: convolutional neural network, MobileViTV2, rice leaf disease, precision agriculture, image classification, vision transformer
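A minimal fine-tuning skeleton matching the described setup follows: MobileViTV2 from the timm library, re-headed for the rice-disease classes, with the augmentations the abstract names (rotation, scaling, flipping). The variant name `mobilevitv2_050`, the class count, and the random batch are assumptions for illustration.

```python
# Fine-tuning sketch: timm MobileViTV2 with a new classification head.
import timm
import torch
from torchvision import transforms

NUM_CLASSES = 4  # e.g., blast, bacterial blight, brown spot, healthy (assumed)
model = timm.create_model("mobilevitv2_050", pretrained=True,
                          num_classes=NUM_CLASSES)

# Augmentations named in the abstract; applied inside a Dataset in practice.
train_tf = transforms.Compose([
    transforms.RandomRotation(20),
    transforms.RandomResizedCrop(256, scale=(0.8, 1.0)),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
criterion = torch.nn.CrossEntropyLoss()

x = torch.randn(8, 3, 256, 256)            # stand-in batch of leaf images
loss = criterion(model(x), torch.randint(0, NUM_CLASSES, (8,)))
loss.backward()
opt.step()
```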
Procedia PDF Downloads 24
3761 Use of Coconut Shell as a Replacement of Normal Aggregates in Rigid Pavements
Authors: Prakash Parasivamurthy, Vivek Rama Das, Ravikant Talluri, Veena Jawali
Abstract:
India ranks among the top three coconut producers, together with the Philippines and Indonesia. About 92% of the country's total production comes from four southern states: Kerala (45.22%), Tamil Nadu (26.56%), Karnataka (10.85%), and Andhra Pradesh (8.93%). Other states, such as Goa, Maharashtra, Odisha, West Bengal, and those in the northeast (Tripura and Assam), account for the remaining 8.44%. The use of coconut shell as coarse aggregate in concrete has never been a usual practice in the industry, particularly in areas where lightweight concrete is required for non-load-bearing walls, non-structural floors, and strip footings. The high cost of conventional building materials is a major factor affecting construction delivery in India. In India, where abundant agricultural and industrial wastes are discharged, these wastes can be used as potential or replacement materials in the construction industry. This has a double advantage: a reduction in the cost of construction materials and a means of disposing of wastes. Therefore, an attempt has been made in this study to utilize coconut shell (CS) as coarse aggregate in rigid pavement. The present study began with the characterization of the materials through basic material testing. The cast moulds were cured, and tests were conducted on the hardened concrete. The procedure continued with the determination of fck (characteristic strength), E (modulus of elasticity), and µ (Poisson's ratio) from the test results obtained. For the analytical studies, the rigid pavement was modeled in the KENPAVE software, a finite element program developed specially for road pavements, and, in parallel, the design of the rigid pavement was carried out to Indian standards. The results show that the physical properties of CSAC (coconut shell aggregate concrete) with 10% replacement give better results. The flexural strength of CSAC is found to increase by 4.25% as compared to the control concrete. About a 13% reduction in pavement thickness is observed using the optimum coconut shell content.
Keywords: coconut shell, rigid pavement, modulus of elasticity, Poisson ratio
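One common way to obtain the elastic inputs (E, µ) fed to a pavement model in Indian practice is the IS 456 empirical relation E = 5000·√fck (MPa). The fck values below are illustrative placeholders for control and 10% coconut-shell mixes, not the study's measured data.

```python
# Worked example: elastic constants for pavement analysis from the IS 456
# relation E = 5000 * sqrt(fck) (MPa); mu ~ 0.15 is typical for rigid pavements.
import math

def concrete_modulus(fck_mpa):
    return 5000 * math.sqrt(fck_mpa)   # IS 456 empirical relation, MPa

for label, fck in [("control concrete", 40.0), ("10% CSAC (assumed)", 38.0)]:
    e_gpa = concrete_modulus(fck) / 1000
    print(f"{label}: fck = {fck} MPa, E = {e_gpa:.1f} GPa, mu = 0.15")
```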
Procedia PDF Downloads 237
3760 Analysis of Barbell Kinematics of Snatch Technique among Women Weightlifters in India
Authors: Manish Kumar Pillai, Madhavi Pathak Pillai, Rajender Lal, Dinesh P. Sharma
Abstract:
India has produced few successful weightlifters over the years; Karnam Malleswari remains the only Indian woman to have won an Olympic medal in weightlifting. When we try to introspect, several reasons emerge, and one probable cause is the lack of biomechanical analysis for technique improvement. Motion analysis in sports has gained prime importance for technical improvement: it helps athletes develop a better understanding of their own skills and accelerates technical learning. Kinematics is concerned with describing and quantifying both the linear and angular positions of bodies and their time derivatives, and the analysis of barbell movement is very important in weightlifting. Women's weightlifting, however, has a shorter history than men's: video-based research on women's weightlifting is scarce, and scientific evidence from kinematic analysis of Indian weightlifters at the national level is particularly limited. Hence, the present investigation aimed to analyse the barbell kinematics of women weightlifters in India. The study was delimited to the medal winners of the 69 kg weight category in the All India Inter-University Competition, aged between 18 and 28 years. The variables selected for the mechanical analysis of barbell kinematics included barbell trajectory, velocity, acceleration, potential energy, kinetic energy, mechanical energy, and average power output. Performance was captured during competition by two DV PC-60 digital cameras (Panasonic Co., Ltd.), placed 6 m perpendicular to the plane of motion and 130 cm above the ground to record the frontal and lateral views of the lifters simultaneously. Video recordings were analysed using Dartfish software, and the barbell kinematics were derived from the information it provided. The findings clearly indicate differences in the selected kinematic variables among the three lifters with respect to their technique across the five phases of the snatch.
Keywords: dartfish, digital camera, kinematics, snatch, weightlifting
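A minimal sketch of how the listed barbell kinematics can be derived from frame-by-frame digitized positions (such as those exported from Dartfish): velocity and acceleration by numerical differentiation, then the energy and power quantities. The coordinates, frame rate, and barbell mass below are placeholders, not the study's data.

```python
# Illustrative derivation of barbell kinematics from digitized positions.
import numpy as np

FPS = 50             # assumed camera frame rate
G = 9.81             # gravitational acceleration, m/s^2
BARBELL_MASS = 69.0  # kg, placeholder

# Vertical barbell positions (m), digitized frame by frame (placeholders)
y = np.array([0.40, 0.47, 0.58, 0.74, 0.95, 1.12, 1.20])
t = np.arange(len(y)) / FPS

v = np.gradient(y, t)              # vertical velocity (m/s)
a = np.gradient(v, t)              # vertical acceleration (m/s^2)
pe = BARBELL_MASS * G * y          # potential energy (J)
ke = 0.5 * BARBELL_MASS * v ** 2   # kinetic energy (J)
me = pe + ke                       # mechanical energy (J)
avg_power = (me[-1] - me[0]) / (t[-1] - t[0])  # average power output (W)

print(f"peak velocity {v.max():.2f} m/s, average power {avg_power:.0f} W")
```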
Procedia PDF Downloads 136
3759 Neighbor Caring Environment System (NCE) Using Parallel Replication Mechanism
Authors: Ahmad Shukri Mohd Noor, Emma Ahmad Sirajudin, Rabiei Mamat
Abstract:
In marine research, the process of data sampling can take years before a study can be concluded, so the need for a robust backup system for the data is implicit. Recent marine applications integrate more functionalities and tools to assist researchers' work, and this trend is anticipated to continue as research scope widens and intensifies and as technologies and lifestyles evolve. The convenience of collecting and sharing information nowadays applies equally to marine research. Marine system designers should therefore be aware that high availability is a necessary attribute of marine repository applications, as is a robust backup system for the data. In this paper, high availability is addressed through both hardware and software, with the focus on software. We consider a NABTIC repository system that was originally built on a single server and has no replicated components. First, the system is decomposed into separate modules. The modules are placed on multiple servers to create a distributed system. Redundancy is added by placing copies of the modules on different servers using the Neighbor Caring Environment System (NCES) technique, which employs a parallel replication mechanism. Background monitoring checks the servers' heartbeats to confirm that they are alive, while an adaptive threshold is maintained so that failures are detected in a timely manner using Adaptive Fault Detection (AFD). A confirmed failure triggers recovery mode, in which a selection process is carried out before a fail-over server is instructed. In effect, the marine repository service continues as the fail-over masks the recent failure. The performance of the new prototype was tested and confirmed to be more highly available; furthermore, downtime is not noticeable, as service is restored immediately and automatically. The marine repository system is thus said to have achieved fault tolerance.
Keywords: availability, fault detection, replication, fault tolerance, marine application
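A conceptual sketch, in Python, of the heartbeat monitoring and adaptive failure threshold described above; the class, the moving-average update, and the tuning constants are illustrative assumptions rather than the authors' AFD implementation.

```python
# Conceptual sketch of heartbeat monitoring with an adaptive threshold.
import time

class AdaptiveFaultDetector:
    """Flags a server as failed when its heartbeat is overdue by an
    adaptive threshold derived from recent inter-arrival times."""

    def __init__(self, safety_margin: float = 2.0):
        self.safety_margin = safety_margin
        self.last_beat: dict[str, float] = {}
        self.mean_interval: dict[str, float] = {}

    def heartbeat(self, server: str) -> None:
        """Record a heartbeat and update the server's mean interval."""
        now = time.monotonic()
        if server in self.last_beat:
            interval = now - self.last_beat[server]
            prev = self.mean_interval.get(server, interval)
            # Exponential moving average of heartbeat intervals
            self.mean_interval[server] = 0.8 * prev + 0.2 * interval
        self.last_beat[server] = now

    def failed(self, server: str) -> bool:
        """True when the heartbeat is overdue by the adaptive threshold."""
        if server not in self.last_beat:
            return False
        threshold = self.safety_margin * self.mean_interval.get(server, 1.0)
        return time.monotonic() - self.last_beat[server] > threshold

# On a confirmed failure, a fail-over replica would be selected so the
# repository service continues without noticeable downtime.
```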
Procedia PDF Downloads 321
3758 Identifying Diabetic Retinopathy Complication by Predictive Techniques in Indian Type 2 Diabetes Mellitus Patients
Authors: Faiz N. K. Yusufi, Aquil Ahmed, Jamal Ahmad
Abstract:
Predicting the risk of diabetic retinopathy (DR) in Indian type 2 diabetes patients is immensely necessary. India has the second-largest diabetic population after China, yet, to the best of our knowledge, not a single risk score for complications has been investigated for Indian patients. Diabetic retinopathy is a serious complication and a leading cause of visual impairment across countries. Any type or form of DR was taken as the event of interest, be it mild, background, or grade I, II, III, or IV DR. A random sample was collected from the Rajiv Gandhi Centre for Diabetes and Endocrinology, J.N.M.C., A.M.U., Aligarh, India. The collected variables include sex, age, height, weight, body mass index (BMI), fasting blood sugar (BSF), postprandial sugar (PP), glycosylated haemoglobin (HbA1c), diastolic blood pressure (DBP), systolic blood pressure (SBP), smoking and alcohol habits, total cholesterol (TC), triglycerides (TG), high-density lipoprotein (HDL), low-density lipoprotein (LDL), very-low-density lipoprotein (VLDL), physical activity, duration of diabetes, diet control, history of antihypertensive drug treatment, family history of diabetes, waist circumference, hip circumference, medications, central obesity, and history of DR. Cox proportional hazards regression is used to design risk scores for the prediction of retinopathy. Model calibration and discrimination are assessed with the Hosmer–Lemeshow test and the area under the receiver operating characteristic (ROC) curve. Overfitting and underfitting of the model are checked by applying regularization, and the best method is selected among ridge, lasso, and elastic net regression. The optimal cut-off point is chosen by Youden's index. The five-year probability of DR is predicted by both the survival function and a two-state Markov chain model, and the better technique is identified. The risk scores developed can be applied by doctors, and by patients themselves for self-evaluation; the five-year probabilities can likewise be used to forecast and manage patients' condition. This provides immense benefit for the real-world application of DR prediction in T2DM.
Keywords: Cox proportional hazard regression, diabetic retinopathy, ROC curve, type 2 diabetes mellitus
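A hedged sketch of the modelling pipeline the abstract outlines: fit a Cox proportional hazards model, score patients by partial hazard, and choose an optimal cut-off by maximizing Youden's index (J = sensitivity + specificity − 1) on the ROC curve. The covariates and synthetic data here are assumptions; the study's actual variables are listed above.

```python
# Hedged sketch: Cox PH risk score plus a Youden-index cut-off.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "duration": rng.exponential(5.0, n),      # years of follow-up
    "event": rng.integers(0, 2, n),           # DR observed (1) or censored (0)
    "hba1c": rng.normal(8.0, 1.5, n),         # assumed covariates
    "sbp": rng.normal(135.0, 15.0, n),
    "duration_dm": rng.normal(10.0, 5.0, n),
})

# Fit the proportional hazards model and score each patient
cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
risk_score = cph.predict_partial_hazard(df)

# Youden's index J = sensitivity + specificity - 1, maximized over cut-offs
fpr, tpr, thresholds = roc_curve(df["event"], risk_score)
best = np.argmax(tpr - fpr)
print(f"optimal cut-off {thresholds[best]:.3f}, J = {(tpr - fpr)[best]:.3f}")
```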
Procedia PDF Downloads 186
3757 Expression of PGC-1 Alpha Isoforms in Response to Eccentric and Concentric Resistance Training in Healthy Subjects
Authors: Pejman Taghibeikzadehbadr
Abstract:
Background and Aim: PGC-1 alpha is a transcription factor first detected in brown adipose tissue. Since its discovery, PGC-1 alpha has been known to facilitate beneficial adaptations, such as mitochondrial biogenesis and increased angiogenesis, in skeletal muscle following aerobic exercise. The purpose of this study was therefore to investigate the expression of PGC-1 alpha isoforms in response to eccentric and concentric resistance training in healthy subjects. Materials and Methods: Ten healthy men were randomly divided into two groups (five in the eccentric group, five in the concentric group). The isokinetic contraction protocols comprised eccentric and concentric knee extensions at maximum effort and an angular velocity of 60 degrees per second. The torques assigned to each subject were matched so that the workload was equal in both protocols. Contractions consisted of a maximum of 12 sets of 10 repetitions for the right leg, with a 30-second rest between sets. At the beginning and end of the study, biopsies of the vastus lateralis muscle were taken at both distal and proximal sites. To evaluate the expression of the PGC1α-1 and PGC1α-4 genes, tissue from each group was analysed using real-time PCR. Data were analysed using the dependent t-test and analysis of covariance, with SPSS 21 and Excel 2013. Results: Intra-group changes in PGC1α-1 after a single session of activity were not significant in either the eccentric (p = 0.168) or the concentric (p = 0.959) group, and inter-group comparison showed no difference between the two groups (p = 0.681). Intra-group changes in PGC1α-4 after a single session were significant in both the eccentric (p = 0.012) and the concentric (p = 0.02) group, while the inter-group comparison again showed no difference (p = 0.362). Conclusion: The absence of significant changes in PGC1α-1 suggests that the exercise stimulus was not sufficient to drive its increase, whereas PGC1α-4 responded to a single session of either contraction type; the differing adaptation responses warrant further investigation.
Keywords: eccentric contraction, concentric contraction, PGC1α-1 and PGC1α-4, human subject
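The abstract reports real-time PCR expression analysis without stating the quantification method; assuming the standard 2^−ΔΔCt (Livak) approach, the sketch below computes per-subject fold changes against a reference gene and runs a paired t-test. All Ct values are fabricated placeholders.

```python
# Illustrative qPCR post-processing with the 2^-ddCt (Livak) method;
# the abstract does not state which quantification method was used.
import numpy as np
from scipy import stats

# Ct values for PGC1a-4 and a reference gene, pre and post exercise (n = 5)
target_pre  = np.array([24.1, 23.8, 24.5, 24.0, 24.3])
target_post = np.array([22.9, 22.7, 23.4, 23.1, 23.0])
ref_pre     = np.array([18.0, 17.9, 18.2, 18.1, 18.0])
ref_post    = np.array([18.1, 18.0, 18.2, 18.0, 18.1])

delta_pre = target_pre - ref_pre     # dCt before the session
delta_post = target_post - ref_post  # dCt after the session
ddct = delta_post - delta_pre        # ddCt per subject
fold_change = 2.0 ** (-ddct)         # relative expression

# Paired (dependent) t-test on normalized expression, as in the abstract
t_stat, p_value = stats.ttest_rel(delta_post, delta_pre)
print(f"mean fold change {fold_change.mean():.2f}, p = {p_value:.3f}")
```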
Procedia PDF Downloads 78