Search results for: Solanki Ravirajsinh
20 Estimation of Population Mean Using Characteristics of Poisson Distribution: An Application to Earthquake Data
Authors: Prayas Sharma
Abstract:
This paper proposes a generalized class of estimators, an exponential class of estimators based on the adaptation of Sharma and Singh (2015) and Solanki and Singh (2013), and a simple difference estimator for estimating the unknown population mean of a Poisson-distributed population under simple random sampling without replacement. The expressions for the mean square errors of the proposed classes of estimators are derived up to the first order of approximation. It is shown that the adapted version of Solanki and Singh (2013), the exponential class of estimators, is always more efficient than the usual estimator and the ratio, product, exponential ratio, and exponential product type estimators, and is equally efficient to the simple difference estimator. Moreover, the adapted version of Sharma and Singh's (2015) estimator is always more efficient than all the estimators available in the literature. In addition, the theoretical findings are supported by an empirical study showing the superiority of the constructed estimators over others, with an application to earthquake data from Turkey.
Keywords: auxiliary attribute, point bi-serial, mean square error, simple random sampling, Poisson distribution
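For context, the classical forms that such classes adapt can be sketched as follows; the notation (auxiliary variable x with known mean X-bar, coefficients of variation C_y and C_x, correlation rho, and lambda = (1 - n/N)/n for SRSWOR) is assumed here for illustration and is not taken verbatim from the paper:

```latex
\bar{y}_R = \bar{y}\,\frac{\bar{X}}{\bar{x}}, \qquad
\bar{y}_{Re} = \bar{y}\,\exp\!\left(\frac{\bar{X}-\bar{x}}{\bar{X}+\bar{x}}\right)
```

with first-order mean square errors of roughly

```latex
MSE(\bar{y}_R) \approx \lambda\,\bar{Y}^{2}\left(C_y^{2} + C_x^{2} - 2\rho\,C_y C_x\right), \qquad
MSE(\bar{y}_{Re}) \approx \lambda\,\bar{Y}^{2}\left(C_y^{2} + \tfrac{1}{4}C_x^{2} - \rho\,C_y C_x\right)
```

In the attribute setting of this paper, the auxiliary variable is replaced by the sample proportion of the attribute and rho by the point bi-serial correlation.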
Procedia PDF Downloads 154
19 Scalable Performance Testing: Facilitating the Assessment of Application Performance under Substantial Loads and Mitigating the Risk of System Failures
Authors: Solanki Ravirajsinh
Abstract:
In the software testing life cycle, failing to conduct thorough performance testing can result in significant losses for an organization due to application crashes and improper behavior under high user loads in production. Simulating large volumes of requests, such as 5 million within 5-10 minutes, is challenging without a scalable performance testing framework. Leveraging cloud services to implement a performance testing framework makes it feasible to handle 5-10 million requests in just 5-10 minutes, helping organizations ensure their applications perform reliably under peak conditions. Implementing a scalable performance testing framework using cloud services and tools like JMeter, EC2 instances (virtual machines), CloudWatch logs (to monitor errors and logs), EFS (a file storage system), and security groups offers several key benefits for organizations. Building a performance test framework with this approach helps optimize resource utilization, enables effective benchmarking, increases reliability, and saves costs by resolving performance issues before the application is released. In performance testing, a master-slave framework facilitates distributed testing across multiple EC2 instances to emulate many concurrent users and efficiently handle high loads. The master node orchestrates the test execution by coordinating with multiple slave nodes to distribute the workload. Slave nodes execute the test scripts provided by the master node, with each node handling a portion of the overall user load and generating requests to the target application or service. By leveraging JMeter's master-slave framework in conjunction with cloud services like EC2 instances, EFS, CloudWatch logs, security groups, and command-line tools, organizations can achieve superior scalability and flexibility in their performance testing efforts. In this master-slave framework, JMeter must be installed on both the master and each slave EC2 instance. The master EC2 instance functions as the "brain," while the slave instances operate as the "body parts." The master directs each slave to execute a specified number of requests. Upon completion of the execution, the slave instances transmit their results back to the master. The master then consolidates these results into a comprehensive report detailing metrics such as the number of requests sent, encountered errors, network latency, response times, server capacity, throughput, and bandwidth. Leveraging cloud services, the framework benefits from automatic scaling based on the volume of requests. Notably, integrating cloud services allows organizations to handle more than 5-10 million requests within 5 minutes, depending on the server capacity of the hosted website or application.
Keywords: identify crashes of application under heavy load, JMeter with cloud services, scalable performance testing, JMeter master and slave using cloud services
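As a minimal sketch of how such a master node might drive a distributed JMeter run, the snippet below shells out to the standard JMeter CLI in non-GUI mode with remote slave hosts; the host IPs, test plan path, and output locations are hypothetical placeholders, not values from the study:

```python
import subprocess

# Hypothetical private IPs of the slave EC2 instances (jmeter-server must already be running on each).
SLAVE_HOSTS = ["10.0.1.11", "10.0.1.12", "10.0.1.13"]

# Run the test plan in non-GUI mode (-n), distribute load to the remote hosts (-R),
# write raw results to a JTL file (-l), and generate an HTML dashboard report (-e -o).
cmd = [
    "jmeter",
    "-n",
    "-t", "load_test_plan.jmx",        # hypothetical test plan, e.g. stored on EFS
    "-R", ",".join(SLAVE_HOSTS),
    "-l", "results/run.jtl",
    "-e", "-o", "results/report",
]

subprocess.run(cmd, check=True)  # raises CalledProcessError if the run fails
```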
Procedia PDF Downloads 27
18 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion
Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro
Abstract:
In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual effort, and difficulties in meeting tight deadlines. Implementing CI/CD with standard serverless technologies using cloud services overcomes all the above-mentioned issues and helps organizations improve efficiency and deliver faster without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while reducing repetitive work and manual effort. Implementing scalable CI/CD for development using cloud services like ECS (Container Management Service), AWS Fargate, ECR (to store Docker images with all dependencies), serverless computing (serverless virtual machines), CloudWatch logs (for monitoring errors and logs), security groups (for inside/outside access to the application), Docker containerization (Docker-based images and container techniques), Jenkins (a CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit) can efficiently handle the demands of diverse development environments and accommodate dynamic workloads, increasing efficiency for faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application based on the branches, testing the application using a scalable automation testing framework, and deploying the builds. Developers can focus more on writing code and less on managing infrastructure as it scales based on need. Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently. Scalable CI/CD adjusts the application's scale according to usage, thereby alleviating concerns about scalability, maintenance costs, and resource needs. Creating scalable automation testing using cloud services (ECR, ECS Fargate, Docker, EFS, serverless computing) helps organizations run more than 500 test cases in parallel, aiding in the detection of race conditions and performance issues and reducing execution time, as illustrated in the sketch below. Scalable CI/CD offers flexibility, dynamically adjusting to varying workloads and demands, allowing teams to scale resources up or down as needed. It optimizes costs, since resources are paid for only as they are used, and increases reliability. Scalable CI/CD pipelines employ automated testing and validation processes to detect and prevent errors early in the development cycle.
Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment
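The sketch below illustrates one way the scalable-automation side of such a pipeline could fan test containers out on ECS Fargate with boto3; the region, cluster, task definition, subnet, and security group names are assumptions for illustration, not the team's actual configuration:

```python
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")  # region assumed for illustration

def launch_test_tasks(total_tasks: int) -> list:
    """Start `total_tasks` Fargate tasks, each running a slice of the automated test suite."""
    task_arns = []
    remaining = total_tasks
    while remaining > 0:
        batch = min(remaining, 10)  # run_task accepts at most 10 tasks per call
        response = ecs.run_task(
            cluster="qa-automation-cluster",            # hypothetical cluster name
            taskDefinition="automation-test-runner:1",  # hypothetical task definition
            launchType="FARGATE",
            count=batch,
            networkConfiguration={
                "awsvpcConfiguration": {
                    "subnets": ["subnet-0abc1234"],      # placeholder subnet
                    "securityGroups": ["sg-0abc1234"],   # placeholder security group
                    "assignPublicIp": "ENABLED",
                }
            },
        )
        task_arns.extend(task["taskArn"] for task in response["tasks"])
        remaining -= batch
    return task_arns

if __name__ == "__main__":
    # e.g. 50 parallel containers, each executing a shard of the 500+ test cases
    print(launch_test_tasks(50))
```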
Procedia PDF Downloads 42
17 Medi-Conf: Conference Management System
Authors: Dishant Kothari, Pankaj Gaur, Priyanshu Sharma, Ratnesh Litoriya, Sachin Solanki, Shimpy Goyal
Abstract:
A web-based conference management system comprises all the processes needed for a round-table conference and research paper publication, including the phases of call for papers, paper submission, paper review, acknowledgement to the author, paper acceptance, and payment for publication. It will also help colleges and universities conduct research conferences, thus spreading awareness and contributing to the overall development of students. A web-based conference management system will streamline the procedure for paper publication by reducing the time and effort needed for physical (offline mode) submission. A conference can be organized from anywhere and at any time. Authors can easily trace the status of their papers, and the program committee can review them anywhere and provide the necessary comments.
Keywords: peer review, paper publication, author, chair, reviewer, virtualization, new normal
Procedia PDF Downloads 129
16 Analysis of Various Copy Move Image Forgery Techniques for Better Detection Accuracy
Authors: Grishma D. Solanki, Karshan Kandoriya
Abstract:
In the modern information age, digitization has advanced like never before. Powerful computers, advanced photo editing software packages, and high-resolution capturing devices have made manipulation of digital images incredibly easy. As far as image forensics is concerned, one of the most actively researched areas is the detection of copy-move forgeries. High computational complexity is one of the major drawbacks of existing techniques for detecting such tampering. Moreover, copy-move forgery is usually performed in three steps: first, copying a region of an image; then pasting it elsewhere in the same image; and finally applying some post-processing such as rotation, scaling, shifting, or noise addition. Consequently, pseudo-Zernike moments are used as a feature extraction method for matching image blocks and are a primary factor on which the performance of detection algorithms depends.
Keywords: copy-move image forgery, digital forensics, image forensics, image forgery
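To illustrate the generic block-matching pipeline such techniques follow (divide the image into overlapping blocks, extract a feature vector per block, sort the vectors, and flag near-duplicate blocks), here is a simplified sketch; it uses low-frequency DCT coefficients as a stand-in feature purely for illustration, whereas the surveyed methods use pseudo-Zernike moments, and the block size, step, and tolerance are assumed values:

```python
import numpy as np
from scipy.fft import dctn  # 2-D DCT used as a simple block feature

def detect_copy_move(gray: np.ndarray, block: int = 16, step: int = 4, tol: float = 1.0):
    """Return candidate (source, destination) block positions with near-identical features."""
    feats, positions = [], []
    h, w = gray.shape
    for y in range(0, h - block + 1, step):
        for x in range(0, w - block + 1, step):
            patch = gray[y:y + block, x:x + block].astype(float)
            coeffs = dctn(patch, norm="ortho")[:4, :4].ravel()  # low-frequency DCT feature
            feats.append(coeffs)
            positions.append((y, x))
    feats = np.array(feats)
    order = np.lexsort(feats.T[::-1])           # lexicographic sort of feature vectors
    matches = []
    for a, b in zip(order[:-1], order[1:]):     # compare neighbouring rows after sorting
        if np.linalg.norm(feats[a] - feats[b]) < tol:
            (ya, xa), (yb, xb) = positions[a], positions[b]
            if abs(ya - yb) + abs(xa - xb) > block:  # ignore trivially overlapping blocks
                matches.append((positions[a], positions[b]))
    return matches
```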
Procedia PDF Downloads 287
15 Prediction of Permeability of Frozen Unsaturated Soil Using Van Genuchten Model and Fredlund-Xing Model in Soil Vision
Authors: Bhavita S. Dave, Jaimin Vaidya, Chandresh H. Solanki, Atul K. Desai
Abstract:
To measure the permeability of a soil specimen, one of the basic assumptions of Darcy's law is that the soil sample should be saturated. Unlike saturated soils, the permeability of unsaturated soils cannot be found using conventional methods, as they do not follow Darcy's law. Many empirical models, such as the Van Genuchten model and the Fredlund-Xing model, have been suggested to predict permeability values for unsaturated soils. Such models use data from the soil-freezing characteristic curve to find fitting parameters for frozen unsaturated soils. In this study, soil specimens were subjected to 0, 1, 3, and 5 freezing-thawing (F-T) cycles at different degrees of saturation to obtain a wide range of suction, and their soil-freezing characteristic curves were formulated for all F-T cycles. Changes in fitting parameters and relative permeability with subsequent F-T cycles are presented in this paper for both models.
Keywords: frozen unsaturated soil, Fredlund-Xing model, soil-freezing characteristic curve, Van Genuchten model
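For reference, the standard forms of the two models, written here for the usual water-retention notation (in the frozen-soil application, the suction is the cryogenic suction deduced from the soil-freezing characteristic curve), are assumed to be:

```latex
% Van Genuchten retention model with Mualem relative permeability
S_e(\psi) = \frac{\theta - \theta_r}{\theta_s - \theta_r}
          = \left[1 + (\alpha\psi)^{n}\right]^{-m}, \qquad m = 1 - \tfrac{1}{n},
\qquad
k_r(S_e) = S_e^{1/2}\left[1 - \left(1 - S_e^{1/m}\right)^{m}\right]^{2}

% Fredlund-Xing retention model (C(\psi) is the correction factor)
\theta(\psi) = C(\psi)\,\frac{\theta_s}{\left\{\ln\!\left[e + (\psi/a)^{n}\right]\right\}^{m}}
```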
Procedia PDF Downloads 188
14 Universal Design Building Standard for India: A Critical Inquiry
Authors: Sushil Kumar Solanki, Rachna Khare
Abstract:
Universal Design is a concept of built-environment creation in which all people are accommodated to the maximum extent possible without any type of specialized design. Accessible design, by contrast, is a design process in which the needs of people with disabilities are specifically considered. Building standards on accessibility contain scoping and technical requirements for accessibility to sites, facilities, buildings, and elements by individuals with disabilities. India also follows various prescriptive building standards for the creation of the physical environment for people with disabilities. These building standards are based on Western models rather than research-based standards serving Indian needs, and they lack contextual connection when applied in urban and rural environments. This study presents a critical and comparative analysis of various international building standards and codes against existing Indian accessibility standards to understand the problems and prospects of a Universal Design building standard for India. The result of this study is an analysis of the existing state of Indian building standards pertaining to accessibility and of the future need for a performance-based Universal Design concept.
Keywords: accessibility, building standard, built-environment, universal design
Procedia PDF Downloads 294
13 Engagement Analysis Using DAiSEE Dataset
Authors: Naman Solanki, Souraj Mondal
Abstract:
With the world moving towards online communication, the volume of stored video has exploded in the past few years. Consequently, it has become crucial to analyse participants' engagement levels in online communication videos. Engagement prediction of people in videos can be useful in many domains, like education, client meetings, dating, etc. Video-level or frame-level prediction of engagement for a user involves the development of robust models that can capture facial micro-emotions efficiently. For the development of an engagement prediction model, it is necessary to have a widely accepted standard dataset for engagement analysis. DAiSEE is one of the datasets that consists of in-the-wild data and has gold-standard annotations for engagement prediction. Earlier research using the DAiSEE dataset involved training and testing standard models such as CNN-based models, but the results were not satisfactory by industry standards. In this paper, a multi-level classification approach has been introduced to create a more robust model for engagement analysis using the DAiSEE dataset. This approach has recorded testing accuracies of 0.638, 0.7728, 0.8195, and 0.866 for predicting boredom level, engagement level, confusion level, and frustration level, respectively.
Keywords: computer vision, engagement prediction, deep learning, multi-level classification
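A minimal sketch of the kind of multi-output model such a multi-level classification approach implies is shown below; the backbone, feature size, and four-class head per affective state are assumptions for illustration and do not reproduce the paper's actual architecture:

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class MultiLevelEngagementModel(nn.Module):
    """Shared CNN backbone with one classification head per affective state (DAiSEE levels 0-3)."""

    def __init__(self, num_levels: int = 4):
        super().__init__()
        backbone = resnet18(weights=None)   # pretrained weights could be used instead
        backbone.fc = nn.Identity()         # keep the 512-d feature vector
        self.backbone = backbone
        self.heads = nn.ModuleDict({
            state: nn.Linear(512, num_levels)
            for state in ("boredom", "engagement", "confusion", "frustration")
        })

    def forward(self, frames: torch.Tensor) -> dict:
        features = self.backbone(frames)    # (batch, 512)
        return {state: head(features) for state, head in self.heads.items()}

if __name__ == "__main__":
    model = MultiLevelEngagementModel()
    logits = model(torch.randn(2, 3, 224, 224))  # two dummy face crops
    print({name: out.shape for name, out in logits.items()})
```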
Procedia PDF Downloads 112
12 A Novel Approach towards Test Case Prioritization Technique
Authors: Kamna Solanki, Yudhvir Singh, Sandeep Dalal
Abstract:
Software testing is a time- and cost-intensive process. Scrutiny of the code and rigorous testing are required to identify and rectify putative bugs. The process of bug identification and its consequent correction is continuous in nature, and often some of the bugs are removed after the software has been launched in the market. This process of code validation of the altered software during the maintenance phase is termed regression testing. Regression testing ubiquitously involves resource constraints; therefore, the selection of an appropriate subset of test cases from the entire test suite is a critical issue for regression test planning. This paper presents a novel method for designing a suitable prioritization process to optimize the fault detection rate and performance of regression testing under predefined constraints. The proposed method for test case prioritization, m-ACO, alters the food-source selection criteria of natural ants and is essentially a modified version of Ant Colony Optimization (ACO). The proposed m-ACO approach has been coded in Perl, and the results are validated using three examples by computing the Average Percentage of Faults Detected (APFD) metric.
Keywords: regression testing, software testing, test case prioritization, test suite optimization
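For reference, APFD is computed from the position in the prioritized suite at which each fault is first detected. A small helper, written here in Python rather than the Perl used in the study, might look like this:

```python
def apfd(first_detect_positions, num_tests):
    """Average Percentage of Faults Detected.

    first_detect_positions: 1-based position in the prioritized suite at which
        each fault is first revealed (one entry per fault).
    num_tests: total number of test cases in the suite.
    """
    m = len(first_detect_positions)  # number of faults
    return 1 - sum(first_detect_positions) / (num_tests * m) + 1 / (2 * num_tests)

# Example: 10 test cases, 4 faults first revealed by tests 1, 3, 3, and 7.
print(apfd([1, 3, 3, 7], 10))  # 0.7
```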
Procedia PDF Downloads 337
11 Introduction to Various Innovative Techniques Suggested for Seismic Hazard Assessment
Authors: Deepshikha Shukla, C. H. Solanki, Mayank K. Desai
Abstract:
Amongst all natural hazards, earthquakes have the potential to cause the greatest damage. Since earthquake forces are random in nature and unpredictable, their quantification becomes important in order to assess the hazards. The time and place of a future earthquake are both uncertain. Since earthquakes can neither be prevented nor predicted, engineers have to design and construct in such a way that damage to life and property is minimized. Seismic hazard analysis plays an important role in the design of earthquake-resistant structures by providing rational values of input parameters. In this paper, both mathematical and computational methods adopted by researchers globally in the past five years will be discussed. Mathematical approaches involving the concepts of Poisson's ratio, convex set theory, the empirical Green's function, Bayesian probability estimation applied to seismic hazard, and FOSM (first-order second-moment) algorithm methods will be discussed. Computational approaches, including the numerical model SSIFiBo developed in MATLAB to study dynamic soil-structure interaction problems, are also discussed. GIS-based tools, which are predominantly used in the assessment of seismic hazards, will also be discussed.
Keywords: computational methods, MATLAB, seismic hazard, seismic measurements
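As a reminder of how probabilistic hazard quantification is commonly expressed (a standard relation, not a result specific to this paper): if exceedances of a given ground-motion level are assumed to occur as a Poisson process with mean annual rate lambda, the probability of at least one exceedance in an exposure period of t years is

```latex
P(\text{at least one exceedance in } t \text{ years}) = 1 - e^{-\lambda t}
```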
Procedia PDF Downloads 339
10 Response Surface Methodology to Obtain Disopyramide Phosphate Loaded Controlled Release Ethyl Cellulose Microspheres
Authors: Krutika K. Sawant, Anil Solanki
Abstract:
The present study deals with the preparation and optimization of ethyl cellulose microspheres loaded with disopyramide phosphate using the solvent evaporation technique. A central composite design, consisting of a two-level full factorial design superimposed on a star design, was employed for optimizing the preparation of the microspheres. The drug:polymer ratio (X1) and the speed of the stirrer (X2) were chosen as the independent variables. The cumulative release of the drug at different times (2, 6, 10, 14, and 18 hr) was selected as the dependent variable. An optimum polynomial equation was generated for the prediction of the response variable at 10 hr. Based on the results of multiple linear regression analysis and F statistics, it was concluded that sustained action can be obtained when X1 and X2 are kept at high levels. The X1X2 interaction was found to be statistically significant. The drug release pattern fitted the Higuchi model well. The data of a selected batch were subjected to an optimization study using a Box-Behnken design, and an optimal formulation was fabricated. Good agreement was observed between the predicted and the observed dissolution profiles of the optimal formulation.
Keywords: disopyramide phosphate, ethyl cellulose, microspheres, controlled release, Box-Behnken design, factorial design
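For context, the second-order response surface polynomial typically fitted in such two-factor designs, and the Higuchi release model referenced above, take the following general forms (notation assumed here; the coefficients b are determined by regression, Y is the response such as cumulative release, and Q_t is the amount of drug released at time t):

```latex
Y = b_0 + b_1 X_1 + b_2 X_2 + b_{12} X_1 X_2 + b_{11} X_1^{2} + b_{22} X_2^{2},
\qquad
Q_t = k_H \sqrt{t}
```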
Procedia PDF Downloads 455
9 Forensic Necropsy: Importance in Wildlife Conservation
Authors: G. V. Sai Soumya, Kalpesh Solanki, Sumit K. Choudhary
Abstract:
Necropsy is another term for an autopsy, referring to the death examination in the case of animals. It is a complete standardized procedure involving dissection, observation, interpretation, and documentation. Government bodies like the National Tiger Conservation Authority (NTCA) have issued standard operating procedures for conducting necropsies. Necropsies are rarely performed compared to autopsies performed on human bodies. There are no databases that maintain a count of autopsies in wildlife, but research in this area has shown that very few necropsies are carried out. Wildlife forensics came into existence long ago but is coming into the light nowadays as wildlife crime cases increase, including the smuggling of trophies, poaching, and many more. Physical examination in the case of animals is not sufficient to yield fruitful information, and thus postmortem examination plays an important role. Postmortem examination helps in the determination of time since death, cause of death, manner of death, and factors affecting the case under investigation, and thus decreases the amount of time required to solve cases. Increasing the rate of necropsies will help forensic veterinary pathologists build standardized procedures and confidence, which will ultimately yield a higher success rate in solving wildlife crime cases.
Keywords: necropsy, wildlife crime, postmortem examination, forensic application
Procedia PDF Downloads 138
8 A Study on FWD Deflection Bowl Parameters for Condition Assessment of Flexible Pavement
Authors: Ujjval J. Solanki, Prof. (Dr.) P. J. Gundaliya, Prof. M. D. Barasara
Abstract:
The Falling Weight Deflectometer (FWD) is used to evaluate the structural performance of flexible pavement. The exercise of back calculation is required to determine the modulus of elasticity of existing in-service pavement. The process of back calculation requires in-depth field experience to input the range of moduli of elasticity of the bituminous, granular, and subgrade layers, and a number of trials are needed to match the calculated moduli with the FWD deflections observed in the field. The study was carried out on the Barnala-Mansa State Highway, Punjab, India, using FWD before and after overlay; deflections were obtained at the load cell (0 mm) and at 300, 600, 900, 1200, 1500, and 1800 mm from the load cell, and these seven deflection values were used to calculate the Surface Curvature Index (SCI), Base Damage Index (BDI), and Base Curvature Index (BCI). These SCI, BDI, and BCI indices are useful for predicting the structural performance of in-service pavement and for identifying homogeneous sections for condition assessment. The ranges of SCI, BDI, and BCI were determined before and after overlay: SCI 520 to 51, BDI 294 to 63, and BCI 83 to 0.27 for the old pavement, and SCI 272 to 23, BDI 228 to 28, and BCI 25.85 to 4.60 for the new pavement. These indices also show good correlation with the back-calculated moduli of elasticity of all three layers.
Keywords: back calculation, base damage index, base curvature index, FWD (Falling Weight Deflectometer), surface curvature index
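The three indices are commonly defined from the measured deflections D_r, where r is the radial offset in mm from the load cell; the forms below follow the usual deflection-bowl-parameter definitions and are stated here for reference:

```latex
SCI = D_{0} - D_{300}, \qquad
BDI = D_{300} - D_{600}, \qquad
BCI = D_{600} - D_{900}
```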
Procedia PDF Downloads 332
7 Determination of Suction of Arid Region Soil Using Filter Paper Method
Authors: Bhavita S. Dave, Chandresh H. Solanki, Atul K. Desai
Abstract:
Soils of the Greater Himalayas mostly pertain to Leh and Ladakh, Lahaul and Spiti, and the high reaches of Uttarakhand. The moisture regime is aridic. The arid zone starts from Baralacha pass in Lahaul and covers the entire Spiti valley in the district of Lahaul and Spiti, Himachal Pradesh, India. The present study is an attempt to determine the suction value of soil collected from the arid zone of the Spiti valley for different freezing-thawing cycles, considering the climate ranges of the Spiti valley. Suction is the basic and most important parameter influencing the behavior of unsaturated soil. It is essential to determine the suction value of unsaturated soil before other tests such as shear and permeability tests. Basically, it is the negative pore water pressure in partially saturated soil, measured in terms of the height of the water column. The filter paper method has been used for the study as an economical approach to evaluating suction. It is the only method from which both contact and non-contact suction can be deduced. In this study, soil specimens were subjected to 0, 1, 3, and 5 freezing-thawing (F-T) cycles at different degrees of saturation to obtain a wide range of suction, and soil-freezing characteristic curves (SFCC) were formulated for all F-T cycles. The data collected from the experiments were best fitted using the Fredlund and Xing model for each SFCC.
Keywords: suction, arid region soil, soil freezing characteristic curve, freezing-thawing cycle
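For reference, when the filter paper technique follows ASTM D5298 with Whatman No. 42 paper, suction is commonly back-calculated from the equilibrium filter paper water content w_f (%) using a bilinear calibration of roughly the following form (a commonly cited calibration, assumed here; the study's own calibration may differ):

```latex
\log_{10}\psi\ (\mathrm{kPa}) =
\begin{cases}
5.327 - 0.0779\,w_f, & w_f < 45.3\% \\
2.412 - 0.0135\,w_f, & w_f \ge 45.3\%
\end{cases}
```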
Procedia PDF Downloads 227
6 Development of Stability Indicating Method and Characterization of Degradation Impurity of Nirmatrelvir in Its Self-Emulsifying Drug Delivery System
Authors: Ravi Patel, Ravisinh Solanki, Dignesh Khunt
Abstract:
A stability-indicating reverse-phase high-performance liquid chromatography (RP-HPLC) method was developed and validated for estimating Nirmatrelvir in its self-emulsifying drug delivery system (SEDDS). The separation of Nirmatrelvir and its degradation products was accomplished on an Agilent Zorbax Eclipse Plus C18 (250 mm x 4.6 mm, 5 µm) column, through which a mobile phase of 5 mM phosphate buffer (pH 4.0) as mobile phase A and acetonitrile as mobile phase B in a ratio of 40:60 % v/v was pumped at a flow rate of 1.0 mL/min. Chromatographic separation and elution were monitored by a photodiode array detector at 210 nm. Stress studies were employed to evaluate the method's stability-indicating ability. Nirmatrelvir was exposed to several stress conditions, such as acid, alkali, oxidative, photolytic, and thermal degradation. Significant degradation was observed during acid and alkali hydrolysis, and the resulting degradation product was successfully separated from the Nirmatrelvir peak, preventing any interference. Furthermore, the primary degradant produced under alkali degradation conditions was identified using UPLC-ESI-TQ-MS/MS. The method was validated in accordance with the International Council for Harmonisation (ICH) guidelines and found to be selective, precise, accurate, linear, and robust. The apparent permeability of the Nirmatrelvir SEDDS was 4.20 ± 0.21 × 10^-6 cm/s, and the average proportion of free drug recovered was 0.5%. The method developed in this study was feasible and accurate for routine quality control evaluation of Nirmatrelvir SEDDS.
Keywords: Nirmatrelvir, SEDDS, degradation study, HPLC, LC-MS/MS
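The apparent permeability reported above is conventionally obtained from the steady-state flux across the diffusion barrier; the standard defining relation, stated here for reference with dQ/dt the permeation rate, A the membrane area, and C_0 the initial donor concentration, is

```latex
P_{app} = \frac{dQ/dt}{A \cdot C_{0}}
```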
Procedia PDF Downloads 16
5 Characterization of Kopff Crater Using Remote Sensing Data
Authors: Shreekumari Patel, Prabhjot Kaur, Paras Solanki
Abstract:
Moon Mineralogy Mapper (M3), Miniature Radio Frequency (Mini-RF), Kaguya Terrain Camera images, the Lunar Orbiter Laser Altimeter (LOLA) digital elevation model (DEM), and Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) and Wide Angle Camera (WAC) images were used to study the mineralogy, surface physical properties, and age of the 42 km diameter Kopff crater. M3 indicates that the low-albedo crater floor is dominated by high-Ca pyroxene, associated with floor fractures, suggesting igneous activity of gabbroic material. A signature of anorthositic material is sampled on the eastern edge, where target material excavated by a ~3 km diameter impact crater provides access to the crustal composition. Several occurrences of spinel were detected in the northwestern rugged terrain. Our observation can be explained by exposure of spinel by this crater, which impacted onto the inner rings of the Orientale basin. Spinel was part of the pre-impact target, an intrinsic unit of the basin ring. The crater floor was dated by crater counts performed on Kaguya TC images. The nature of the surface was studied in detail with LROC NAC and Mini-RF. Freshly exposed surfaces and boulders or debris seen in LROC NAC images show enhanced radar signal in comparison to the mature terrain of Kopff crater. This multidisciplinary analysis of remote sensing data helps to assess the lunar surface in detail.
Keywords: crater, mineralogy, moon, radar observations
Procedia PDF Downloads 160
4 Seismicity and Ground Response Analysis for MP Tourism Office in Indore, India
Authors: Deepshikha Shukla, C. H. Solanki, Mayank Desai
Abstract:
In the last few years, it has been observed that earthquakes are proving a threat to scientists across the world. With a large number of earthquakes occurring, the threat to life and property has increased manifold, which calls for the urgent attention of researchers globally to carry out research in the field of earthquake engineering. Any hazard related to earthquakes and seismicity is considered a seismic hazard. Common forms of seismic hazards are ground shaking, structural damage, structural hazards, liquefaction, landslides, and tsunamis, to name a few. Among all natural hazards, the most devastating and damaging is the earthquake, as all the other hazards are triggered only after the occurrence of an earthquake. In order to quantify and estimate seismicity and seismic hazards, many methods and approaches have been proposed in the past few years. These approaches are mathematical, conventional, and computational. Convex set theory and the empirical Green's function are some of the mathematical approaches, whereas the deterministic and probabilistic approaches are the conventional approaches for the estimation of seismic hazards. The ground response and ground shaking of a particular area or region play an important role in the damage caused by an earthquake. In this paper, a seismic study using the deterministic approach and 1-D ground response analysis has been carried out for the Madhya Pradesh Tourism Office in Indore, Madhya Pradesh, in central India. Indore lies in seismic zone III (IS: 1893, 2002) of the seismic zoning map of India. There are various faults and lineaments in this area, and the Narmada-Son fault and the Gavilgadh fault are the active sources of earthquakes in the study area. DEEPSOIL v6.1.7 has been used to perform the 1-D linear ground response analysis for the study area. The Peak Ground Acceleration (PGA) of the city ranges from 0.1g to 0.56g.
Keywords: seismicity, seismic hazards, deterministic, probabilistic methods, ground response analysis
Procedia PDF Downloads 165
3 Children of Quarantine: A Post COVID-19 Mental Health Dilemma
Authors: Salman Abdul Majeed, Vidur Solanki, Ruqiya Shama Tareen
Abstract:
BACKGROUND: The COVID-19 pandemic has affected the way of living as we have known it for all strata of society. While the disease containment measures imposed by governmental agencies have been instrumental in controlling the spread of the virus, they have had profound collateral impacts on all populations. However, the disruption caused in the lives of one segment of the population has been far more damaging than most others: the emotional wellbeing of our child and adolescent (C&A) populations. This impact was even more pronounced in children who already suffered from neurodevelopmental or psychiatric disorders. In particular, school closures have not only led to profound social isolation but also to negative impacts on normal developmental opportunities and interruptions in mental health services obtained through school systems. It is too soon to understand the full impact of quarantine, isolation, the stress of social detachment, and fear of the pandemic, but we have already started to see the devastating impact on C&A. This review intends to shed light on the current understanding of the psychiatric wellbeing of C&A during the COVID-19 pandemic. METHOD: A literature search utilizing the key words COVID-19 and children, quarantine and children, social isolation, loneliness, pandemic stress and children, mental health of children, and disease containment measures was carried out. Over 200 articles were identified, out of which 81 articles were included in this review. RESULTS: The disruption caused by COVID-19 in the lives of C&A is much more damaging, and its impact is far-reaching. C&A emergency department visits for possible suicide attempts jumped to 22.3% in 2020 and 39.1% during 2021. One study utilizing T1-weighted structural images computed the thickness of cortical and subcortical structures, including the amygdala, hippocampus, and nucleus accumbens. The peri-COVID group showed reduced cortical and subcortical thickness and more advanced brain aging compared to pre-pandemic studies. CONCLUSION: Mental health resources for C&A remain underfunded, neglected, and inaccessible to the population that needs them most. Children with ongoing mental health disorders were impacted the worst, along with those with predisposing biopsychosocial risk factors.
Keywords: COVID-19 and children, quarantine and children, social isolation, loneliness, pandemic stress and children, disease containment measures, mental health of children
Procedia PDF Downloads 74
2 Scalable UI Test Automation for Large-Scale Web Applications
Authors: Kuniaki Kudo, Raviraj Solanki, Kaushal Patel, Yash Virani
Abstract:
This research mainly concerns optimizing UI test automation for large-scale web applications. The test target application is the HHAexchange homecare management web application, which seamlessly connects providers, state Medicaid programs, managed care organizations (MCOs), and caregivers through one platform with large-scale functionalities. This study focuses on user interface automation testing for the web application. The quality assurance team must execute many manual user interface test cases in the development process to confirm that there are no regression bugs. The team automated 346 test cases; the UI automation test execution time was over 17 hours. The business requirement was to reduce the execution time in order to release high-quality products quickly, and the quality assurance automation team modernized the test automation framework to optimize the execution time. The base of the web UI automation test environment is Selenium, and the test code is written in Python. Adopting a compiled language to write test code leads to an inefficient flow when introducing scalability into a traditional test automation environment, so a scripting language was adopted to introduce scalability into the test automation efficiently. The scalability mechanism is implemented mainly with AWS serverless technology, the Elastic Container Service. The definition of scalability here is the ability to automatically provision computers for test automation and to increase or decrease the number of computers running those tests. This means the scalable mechanism lets test cases run in parallel, dramatically decreasing test execution time. Also, introducing scalable test automation offers more than reduced test execution time: challenging bugs such as race conditions may be detected, since test cases can be executed at the same time. If API and unit tests are implemented, test strategies can be adopted more efficiently alongside this scalability testing. However, in web applications, as a practical matter, API and unit testing cannot cover 100% of functional testing since they do not reach the front-end code. This study applied a scalable UI automation testing strategy to the large-scale homecare management system. It confirmed the optimization of test case execution time and the detection of a challenging bug. This study first describes the detailed architecture of the scalable test automation environment, then describes the actual reduction in execution time and an example of challenging issue detection.
Keywords: AWS, Elastic Container Service, scalability, serverless, UI automation test
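A minimal sketch of the kind of Python Selenium test that each scaled-out container might execute is shown below; the URL, element locators, and credentials are hypothetical placeholders rather than details of the HHAexchange application:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

def run_login_check():
    options = webdriver.ChromeOptions()
    options.add_argument("--headless")   # containers have no display
    options.add_argument("--no-sandbox")
    driver = webdriver.Chrome(options=options)
    try:
        driver.get("https://example.test/login")                      # placeholder URL
        driver.find_element(By.ID, "username").send_keys("qa_user")   # placeholder locator/value
        driver.find_element(By.ID, "password").send_keys("secret")    # placeholder locator/value
        driver.find_element(By.ID, "login-button").click()            # placeholder locator
        assert "Dashboard" in driver.title                            # placeholder assertion
    finally:
        driver.quit()

if __name__ == "__main__":
    run_login_check()
```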
Procedia PDF Downloads 105
1 Methodology for Risk Assessment of Nitrosamine Drug Substance Related Impurities in Glipizide Antidiabetic Formulations
Authors: Ravisinh Solanki, Ravi Patel, Chhaganbhai Patel
Abstract:
Purpose: The purpose of this study is to develop a methodology for the risk assessment and evaluation of nitrosamine impurities in glipizide antidiabetic formulations. Nitroso compounds, including nitrosamines, have emerged as significant concerns in drug products, as highlighted by the ICH M7 guidelines. This study aims to identify known and potential sources of nitrosamine impurities that may contaminate glipizide formulations and to assess their presence. By determining observed or predicted levels of these impurities and comparing them with regulatory guidance, this research will contribute to ensuring the safety and quality of combination antidiabetic drug products on the market. Factors contributing to the presence of genotoxic nitrosamine contaminants in glipizide medications, such as secondary and tertiary amines and nitroso group-complex-forming molecules, will be investigated. Additionally, the conditions necessary for nitrosamine formation, including the presence of nitrosating agents and acidic environments, will be examined to enhance understanding and mitigation strategies. Method: The methodology for the study involves the implementation of the N-Nitroso Acid Precursor (NAP) test, as recommended by the WHO in 1978 and detailed in the 1980 International Agency for Research on Cancer monograph. Individual glass vials containing quantities equivalent to 10 mM glipizide are prepared. The compounds are dissolved in an acidic environment and supplemented with 40 mM NaNO2. The resulting solutions are maintained at a temperature of 37°C for a duration of 4 hours. For the analysis of the samples, an HPLC method is employed for fit-for-purpose separation. LC resolution is achieved using a step gradient on an Agilent Eclipse Plus C18 column (4.6 x 100 mm, 3.5 µm). Mobile phases A and B consist of 0.1% v/v formic acid in water and acetonitrile, respectively, following a gradient program. The flow rate is set at 0.6 mL/min, and the column compartment temperature is maintained at 35°C. Detection is performed using a PDA detector within the wavelength range of 190-400 nm. To determine the exact mass of the formed nitrosamine drug substance-related impurities (NDSRIs), the HPLC method is transferred to LC-TQ-MS/MS with the same mobile phase composition and gradient program. The injection volume is set at 5 µL, and MS analysis is conducted in electrospray ionization (ESI) mode within the mass range of 100-1000 Daltons. Results: The NAP test samples were prepared according to the protocol and analyzed using HPLC and LC-TQ-MS/MS to identify possible NDSRIs generated in different formulations of glipizide. It was found that the NAP test generated various NDSRIs. This new finding, which has not been reported previously, revealed contamination of glipizide. These NDSRIs are categorized based on their predicted carcinogenic potency, and their acceptable intake in medicines is recommended. The analytical method was found to be specific and reproducible.
Keywords: NDSRI, nitrosamine impurities, antidiabetic, glipizide, LC-MS/MS
Procedia PDF Downloads 31