Search results for: double tax relief method
19473 Modeling of Radiofrequency Nerve Lesioning in Inhomogeneous Media
Authors: Nour Ismail, Sahar El Kardawy, Bassant Badwy
Abstract:
Radiofrequency (RF) lesioning of nerves is commonly used to alleviate chronic pain: the RF current prevents transmission of pain signals by heating the nerve that causes the pain. Several factors affect the temperature distribution and the nerve lesion size; one of these is inhomogeneity in the tissue medium. Our objective is to calculate the temperature distribution and the nerve lesion size in a nonhomogeneous medium surrounding the RF electrode. Two 3-D finite element models are used to compare the temperature distribution in homogeneous and nonhomogeneous media. The effect of temperature-dependent electric conductivity on the maximum temperature and lesion size is also observed. Results show that the presence of a nonhomogeneous medium around the RF electrode has a considerable effect on the temperature distribution and lesion size, and that the dependence of electric conductivity on tissue temperature increases the lesion size.
Keywords: finite element model, nerve lesioning, pain relief, radiofrequency lesion
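For orientation, such finite element models typically couple a quasi-static electrical problem to a heat transfer problem. A minimal sketch of the governing equations commonly used for RF lesioning (the paper's exact formulation is not reproduced here, and the perfusion term may be omitted for nerve tissue):

```latex
\nabla \cdot \left( \sigma(T)\, \nabla V \right) = 0, \qquad
\rho c \,\frac{\partial T}{\partial t}
  = \nabla \cdot \left( k \nabla T \right) + \sigma(T)\,\lvert \nabla V \rvert^{2}
  - \rho_{b} c_{b} \omega_{b} \left( T - T_{b} \right),
```

where $V$ is the electric potential, $\sigma(T)$ the temperature-dependent electric conductivity, $k$ the thermal conductivity, and the last term is the Pennes perfusion heat sink.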
Procedia PDF Downloads 414
19472 A Comparison of Bias Among Relaxed Divisor Methods Using 3 Bias Measurements
Authors: Sumachaya Harnsukworapanich, Tetsuo Ichimori
Abstract:
The apportionment method is used by many countries to calculate the distribution of seats in political bodies. For example, it is used in the United States (U.S.) to distribute House seats proportionally based on the population of each electoral district. Famous apportionment methods include the divisor methods known as the Adams, Dean, Hill, Jefferson, and Webster methods. Sometimes the results produced by these divisor methods are unfair or contain errors; it is therefore important to examine their optimization by using a bias measurement to obtain precise and fair results. In this research, we investigate the bias of divisor methods in the U.S. House of Representatives toward large and small states by applying the Stolarsky mean method. We compare the bias of the apportionment method using two well-known bias measurements, the Balinski and Young measurement and the Ernst measurement, both of which have formulas for large and small states. The third measurement, created by the researchers, does not factor large and small states into its formula. All three measurements are compared, and the results show that our measurement produces results similar to the two established measurements.
Keywords: apportionment, bias, divisor, fair, measurement
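The five divisor methods named above are all highest-averages schemes that differ only in their divisor sequence d(s). A minimal sketch of that shared machinery (the populations and house size below are made up for illustration; the paper's relaxed variants and the Stolarsky mean generalization are not reproduced):

```python
import math
from heapq import heappush, heappop

# Divisor sequences d(s) for the classical highest-averages methods:
# a state holding s seats has priority population / d(s) for its next seat.
DIVISORS = {
    "adams":     lambda s: s,                                # d(0)=0: every state seated first
    "dean":      lambda s: (2 * s * (s + 1)) / (2 * s + 1),  # harmonic mean of s, s+1
    "hill":      lambda s: math.sqrt(s * (s + 1)),           # geometric mean of s, s+1
    "jefferson": lambda s: s + 1,
    "webster":   lambda s: s + 0.5,
}

def apportion(populations, house_size, method="webster"):
    """Award house_size seats one at a time, always to the state with the
    highest population/d(current seats) priority."""
    d = DIVISORS[method]
    seats = {state: 0 for state in populations}
    heap = []
    for state, pop in populations.items():
        div = d(0)
        prio = float("inf") if div == 0 else pop / div
        heappush(heap, (-prio, state))  # max-heap via negation
    for _ in range(house_size):
        _, state = heappop(heap)
        seats[state] += 1
        heappush(heap, (-populations[state] / d(seats[state]), state))
    return seats

pops = {"A": 5_300_000, "B": 1_600_000, "C": 900_000}  # hypothetical states
print(apportion(pops, 10, method="hill"))
```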
Procedia PDF Downloads 364
19471 A Randomized Active Controlled Clinical Trial to Assess Clinical Efficacy and Safety of Tapentadol Nasal Spray in Moderate to Severe Post-Surgical Pain
Authors: Kamal Tolani, Sandeep Kumar, Rohit Luthra, Ankit Dadhania, Krishnaprasad K., Ram Gupta, Deepa Joshi
Abstract:
Background: Post-operative analgesia remains a clinical challenge, with central and peripheral sensitization playing a pivotal role in treatment-related complications and impaired quality of life. Centrally acting opioids offer a poor risk-benefit profile, with increased intensity of gastrointestinal or central side effects and slow onset of clinical analgesia. The objective of this study was to assess the clinical feasibility of induction and maintenance therapy with Tapentadol nasal spray (NS) in moderate to severe acute post-operative pain. Methods: This was a Phase III, randomized, active-controlled, non-inferiority clinical trial involving 294 cases who had undergone surgical procedures under general or regional anesthesia. Post-surgery, patients were randomized to receive either Tapentadol NS 45 mg or Tramadol IV, given as a 100 mg bolus with subsequent 50 mg or 100 mg doses over 2-3 minutes. The NS was administered every 4-6 hours. At the end of 24 hrs, patients in the Tramadol group who had a pain intensity score of ≥4 were switched to oral Tramadol immediate-release 100 mg capsules until the pain intensity score was reduced to <4. All patients who had achieved a pain intensity score of ≤4 were shifted to a lower dose of either Tapentadol NS 22.5 mg or oral Tramadol immediate-release 50 mg capsules. The statistical analysis plan was envisaged as a non-inferiority comparison with Tramadol for pain intensity difference at 60 minutes (PID60min), sum of pain intensity differences at 60 minutes (SPID60min), and physician global assessment at 24 hrs (PGA24hrs). Results: The per-protocol analyses involved 255 hospitalized cases undergoing surgical procedures. The median age of patients was 38.0 years. For the primary efficacy variables, Tapentadol NS was non-inferior to Inj/Oral Tramadol in relieving moderate to severe post-operative pain. On the basis of SPID60min, no clinically significant difference was observed between Tapentadol NS and Tramadol IV (1.73±2.24 vs. 1.64±1.92, -0.09 [95% CI: -0.43, 0.60]). For the co-primary endpoint PGA24hrs, Tapentadol NS was non-inferior to Tramadol IV (2.12±0.707 vs. 2.02±0.704, -0.11 [95% CI: -0.07, 0.28]). However, on further assessment at 48 hrs, 72 hrs, and 120 hrs, clinically superior pain relief was observed with the Tapentadol NS formulation, statistically significant (p<0.05) at each of the time intervals. Secondary efficacy measures, including the onset of clinical analgesia and TOTPAR, showed non-inferiority to Tramadol. The safety profile and need for rescue medication were also similar in both groups during the treatment period. The most common concomitant medications were anti-bacterials (98.3%). Conclusion: Tapentadol NS is a clinically feasible option for improved compliance as induction and maintenance therapy, offering a sustained and persistent patient response that is clinically meaningful in post-surgical settings.
Keywords: tapentadol nasal spray, acute pain, tramadol, post-operative pain
Procedia PDF Downloads 247
19470 Software Engineering Revolution Driven by Complexity Science
Abstract:
This paper introduces a new software engineering paradigm based on complexity science, called NSE (Nonlinear Software Engineering paradigm). The purpose of establishing NSE is to help software development organizations double their productivity, halve their costs, and improve the quality of their products by several orders of magnitude, all simultaneously. NSE complies with the essential principles of complexity science and brings revolutionary changes to almost all aspects of software engineering. NSE has been fully implemented with its support platform, Panorama++.
Keywords: complexity science, software development, software engineering, software maintenance
Procedia PDF Downloads 263
19469 Solution for Thick Plate Resting on Winkler Foundation by Symplectic Geometry Method
Authors: Mei-Jie Xu, Yang Zhong
Abstract:
Based on the symplectic geometry method, the theory of Hamiltonian systems can be applied to the analysis of elasticity problems and the solution of elliptic partial differential equations. With this technique, this paper derives the theoretical solution for a thick rectangular plate with four free edges supported on a Winkler foundation by the variable separation method. The governing equation of the thick plate is first transformed into state equations in the Hamilton space; the theoretical solution is then obtained by applying variable separation based on the Hamiltonian system. Compared with traditional theoretical solutions for rectangular plates, this method has the advantage of not having to assume the form of the deflection functions in the solution process. Numerical examples are presented to verify the validity of the proposed solution method.
Keywords: symplectic geometry method, Winkler foundation, thick rectangular plate, variable separation method, Hamilton system
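As context for the Winkler model, the foundation is represented by a distributed spring reaction proportional to the deflection. In the thin-plate (Kirchhoff) analogue, which is the simplest setting in which to see this (the paper itself treats the thick plate, whose state equations also carry shear deformation):

```latex
D \,\nabla^{4} w(x,y) + k_{w}\, w(x,y) = q(x,y),
\qquad D = \frac{E h^{3}}{12\,(1-\nu^{2})},
```

where $w$ is the deflection, $k_w$ the Winkler foundation modulus, $q$ the transverse load, and $D$ the flexural rigidity of a plate of thickness $h$.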
Procedia PDF Downloads 304
19468 A Game-Based Methodology to Discriminate Executive Function – a Pilot Study With Institutionalized Elderly People
Authors: Marlene Rosa, Susana Lopes
Abstract:
Few studies explore the potential of board games as a performance measure, although they can be an interesting strategy with frail populations; board games are immersive activities that can reduce the pressure of being evaluated. This study aimed to test the ability of game-based strategies to assess executive function in an elderly population. Sixteen older participants were included: 10 with impaired executive functions (G1, 85.30±6.00 yrs; 10 male) and 6 with executive functions showing no clinically important changes (G2, 76.30±5.19 yrs; 6 male). Executive function was assessed using the Frontal Assessment Battery (FAB), a quickly applicable cognitive screening test (a score <12 indicates impairment). The board game used in this study was the TATI Hand Game, designed for training rhythmic coordination of the upper limbs under multiple cognitive stimuli. The game features 1 table grid, 1 set of Single Game cards (played with one hand), Double Game cards (played simultaneously with two hands), 1 die to plan the Single Game mode, cards to plan the Double Game mode, 1 bell, and 2 cups. Each participant played 3 Single Game cards, and the following data were collected: (i) variability in execution time across board game challenges (SD); (ii) number of errors; (iii) execution time (sec). G1 demonstrated higher variability in execution time (G1: 13.0 s vs. G2: 0.5 s), a higher number of errors (1.40 vs. 0.67), and longer execution times (607.80 s vs. 281.83 s). These results demonstrate the potential of board games as a functional assessment strategy in geriatric care. Future studies might include larger samples and statistical methodologies to find cut-off values for impairment in executive functions during performance of the TATI game.
Keywords: board game, aging, executive function, evaluation
Procedia PDF Downloads 140
19467 Step Method for Solving Nonlinear Two Delays Differential Equation in Parkinson’s Disease
Authors: H. N. Agiza, M. A. Sohaly, M. A. Elfouly
Abstract:
Parkinson's disease (PD) is a heterogeneous disorder with varying age of onset, symptoms, and progression levels. In this paper, we solve the PD model analytically as a nonlinear delay differential equation using the method of steps. The step method transforms a system of delay differential equations (DDEs) into successive systems of ordinary differential equations (ODEs). For some numerical examples the analytical solution is difficult to obtain, so we approximate it by applying the Picard and Taylor methods to the resulting ODEs.
Keywords: Parkinson's disease, step method, delay differential equation, two delays
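The method of steps exploits the fact that on each interval of length τ, the delayed argument refers to an already-known segment of the solution, so the DDE becomes an ODE. A minimal single-delay sketch (the PD model itself has two delays and is not reproduced; the test equation and history below are illustrative):

```python
import numpy as np
from scipy.integrate import solve_ivp

def solve_dde_steps(f, history, tau, t_end):
    """Method of steps for y'(t) = f(t, y(t), y(t - tau)), t >= 0,
    with y(t) = history(t) on [-tau, 0].  On each interval
    [k*tau, (k+1)*tau] the delayed term is a known function, so the
    DDE reduces to an ODE that solve_ivp can integrate."""
    segments = [history]                  # callable giving y on the previous interval
    t0, y0 = 0.0, np.atleast_1d(history(0.0))
    while t0 < t_end:
        t1 = min(t0 + tau, t_end)
        prev = segments[-1]
        rhs = lambda t, y: f(t, y, np.atleast_1d(prev(t - tau)))
        sol = solve_ivp(rhs, (t0, t1), y0, dense_output=True, rtol=1e-8)
        segments.append(sol.sol)          # dense interpolant feeds the next step
        t0, y0 = t1, sol.y[:, -1]
    return segments

# Example: y'(t) = -y(t - 1) with constant history y(t) = 1 on [-1, 0].
segs = solve_dde_steps(lambda t, y, yd: -yd, lambda t: 1.0, tau=1.0, t_end=3.0)
print(segs[-1](3.0))
```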
Procedia PDF Downloads 202
19466 The Cleavage of DNA by the Anti-Tumor Drug Bleomycin at the Transcription Start Sites of Human Genes Using Genome-Wide Techniques
Authors: Vincent Murray
Abstract:
The glycopeptide bleomycin is used in the treatment of testicular cancer, Hodgkin's lymphoma, and squamous cell carcinoma. Bleomycin damages and cleaves DNA in human cells, and this is considered to be the main mode of action for bleomycin's anti-tumor activity. In particular, double-strand breaks are thought to be the main mechanism for the cellular toxicity of bleomycin. Using Illumina next-generation DNA sequencing techniques, the genome-wide sequence specificity of bleomycin-induced double-strand breaks was determined in human cells. The degree of bleomycin cleavage was also assessed at the transcription start sites (TSSs) of actively transcribed genes and compared with non-transcribed genes. It was observed that bleomycin preferentially cleaved at the TSSs of actively transcribed human genes. There was a correlation between the degree of this enhanced cleavage at TSSs and the level of transcriptional activity. Bleomycin cleavage is also affected by chromatin structure, and at TSSs the peaks of bleomycin cleavage were approximately 200 bp apart. This indicated that bleomycin was able to detect phased nucleosomes at the TSSs of actively transcribed human genes. The genome-wide cleavage pattern of the bleomycin analogues 6′-deoxy-BLM Z and zorbamycin was also investigated in human cells. As found for bleomycin, these bleomycin analogues also preferentially cleaved at the TSSs of actively transcribed human genes. The cytotoxicity (IC₅₀ values) of these bleomycin analogues was determined. It was found that the degree of enhanced cleavage at TSSs was inversely correlated with the IC₅₀ values of the bleomycin analogues. This suggested that the level of cleavage at the TSSs of actively transcribed human genes was important for the cytotoxicity of bleomycin and analogues. Hence, this study provided a deeper understanding of the cellular processes involved in the cancer chemotherapeutic activity of bleomycin.
Keywords: anti-tumour activity, bleomycin analogues, chromatin structure, genome-wide study, Illumina DNA sequencing
Procedia PDF Downloads 118
19465 The Willingness to Pay of People in Taiwan for Flood Protection Standard of Regions
Authors: Takahiro Katayama, Hsueh-Sheng Chang
Abstract:
Global climate change has increased extreme rainfall, leading to serious floods around the world. In recent years, urbanization and population growth have also increased the extent of impervious surfaces, resulting in significant loss of life and property during floods, especially in the urban areas of Taiwan. In the past, the primary governmental response to floods was structural flood control, and the only flood protection standards in use were design standards. However, design standards for flood control facilities are generally calculated based on current hydrological conditions; in the face of future extreme events, there is a high possibility that existing design standards will be exceeded, causing direct and indirect damage to the public. To cope with the frequent occurrence of floods in recent years, it has been pointed out that a different standard, called the FPSR (Flood Protection Standard of Regions), is needed in Taiwan. The FPSR is mainly used for disaster reduction and ensures that hydraulic facilities can drain a regional flood promptly under a specific return period. The FPSR can convey a level of flood risk that is useful for land use planning and reflects the disaster conditions that a region can bear. However, little has been reported on the FPSR and its impact on the public in Taiwan. Hence, this study proposes a quantitative procedure to evaluate the FPSR. The study examines the FPSR of a region and public perceptions of and knowledge about the FPSR, as well as the public's WTP (willingness to pay) for it. The research is conducted via literature review and a questionnaire survey. First, the domestic and international research on the FPSR is reviewed to provide its theoretical framework. Second, the CVM (Contingent Valuation Method) is employed, using a double-bounded dichotomous-choice, close-ended format to elicit households' WTP for raising the protection level and thereby understand the social costs. The sample consists of 700 citizens living in Taichung City, Taiwan. The research will continue with the surveys, identify which factors determine WTP, and provide recommendations for flood adaptation policies in the future.
Keywords: climate change, CVM (Contingent Valuation Method), FPSR (Flood Protection Standard of Regions), urban flooding
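In the double-bounded format, each respondent faces an initial bid B and a follow-up bid (higher, B^U, after a "yes"; lower, B^L, after a "no"), and the WTP parameters θ are estimated by maximum likelihood. A standard formulation from the CVM literature (notation assumed here, not taken from the paper):

```latex
P^{yy} = 1 - G(B^{U};\theta), \quad
P^{yn} = G(B^{U};\theta) - G(B;\theta), \quad
P^{ny} = G(B;\theta) - G(B^{L};\theta), \quad
P^{nn} = G(B^{L};\theta),
\qquad
\ln L(\theta) = \sum_{i=1}^{N}
\left[ d_i^{yy}\ln P_i^{yy} + d_i^{yn}\ln P_i^{yn}
     + d_i^{ny}\ln P_i^{ny} + d_i^{nn}\ln P_i^{nn} \right],
```

where $G(\cdot;\theta)$ is the assumed WTP distribution and the $d_i$ indicators encode each respondent's pair of answers.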
Procedia PDF Downloads 249
19464 Shape Management Method of Large Structure Based on Octree Space Partitioning
Authors: Gichun Cha, Changgil Lee, Seunghee Park
Abstract:
The objective of this study is to construct a shape management method contributing to the safety of large structures. In Korea, research on shape management is scarce because the technology has only recently been attempted. Terrestrial laser scanning (TLS) is used for measurements of large structures; it provides an efficient way to actively acquire accurate point clouds of object surfaces or environments. The point clouds provide a basis for rapid modeling in industrial automation, architecture, construction, and maintenance of civil infrastructure. However, TLS produces a huge amount of point cloud data, and registration, extraction, and visualization require the processing of a massive amount of scan data. The octree can be applied to shape management of large structures because it reduces the size of the scan data while maintaining its attributes. Octree space partitioning generates voxels of 3D space, and each voxel is recursively subdivided into eight sub-voxels. The point cloud of the scan data was converted to voxels and sampled. The experimental site is located at Sungkyunkwan University; the scanned structure is a steel-frame bridge, and the TLS used was a Leica ScanStation C10/C5. The scan data were condensed by 92%, and the octree model was constructed at a resolution of 2 millimeters. This study presents octree space partitioning for handling point clouds, creating a basis for shape management of large structures such as double-deck tunnels, buildings, and bridges. The research is expected to improve the efficiency of structural health monitoring and maintenance. This work is financially supported by the 'U-City Master and Doctor Course Grant Program' and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (NRF-2015R1D1A1A01059291).
Keywords: 3D scan data, octree space partitioning, shape management, structural health monitoring, terrestrial laser scanning
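A minimal sketch of the recursive octree subdivision described above, applied to a synthetic point cloud (the stopping resolution min_half_size plays the role of the 2 mm voxel size, in scaled units; all values are illustrative, not the paper's pipeline):

```python
import numpy as np

class OctreeNode:
    """Axis-aligned cubic voxel. Subdivides into 8 children until it holds
    at most max_points points or reaches the target resolution."""
    def __init__(self, center, half_size, points, min_half_size, max_points=1):
        self.center, self.half_size, self.children = center, half_size, []
        if len(points) > max_points and half_size > min_half_size:
            for dx in (-0.5, 0.5):
                for dy in (-0.5, 0.5):
                    for dz in (-0.5, 0.5):
                        c = center + half_size * np.array([dx, dy, dz])
                        lo, hi = c - half_size / 2, c + half_size / 2
                        mask = np.all((points >= lo) & (points < hi), axis=1)
                        if mask.any():
                            self.children.append(
                                OctreeNode(c, half_size / 2, points[mask],
                                           min_half_size, max_points))
        self.points = None if self.children else points  # leaf keeps its points

    def leaf_centers(self):
        """Voxel-sampled cloud: one representative center per occupied leaf."""
        if not self.children:
            return [self.center]
        return [c for ch in self.children for c in ch.leaf_centers()]

pts = np.random.rand(10_000, 3)  # stand-in for a TLS point cloud in a unit cube
root = OctreeNode(np.array([0.5, 0.5, 0.5]), 0.5, pts, min_half_size=0.02)
print(f"{len(root.leaf_centers())} occupied voxels from {len(pts)} points")
```

Sampling the cloud down to one representative per occupied voxel is what condenses the data while preserving the structure's shape.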
Procedia PDF Downloads 296
19463 Isolation and Chemical Characterization of Residual Lignin from Areca Nut Shells
Authors: Dipti Yadav, Latha Rangan, Pinakeswar Mahanta
Abstract:
Recent fuel-development strategies to reduce oil dependency, mitigate greenhouse gas emissions, and utilize domestic resources have generated interest in the search for alternative fuel supplies, and bioenergy production from lignocellulosic biomass has great potential. Cellulose, hemicellulose, and lignin are the main constituents of wood and agro-waste. Industrial processing always leaves waste products, mainly lignin; due to the heterogeneous nature of wood and pulp fibers and the heterogeneity between individual fibers, no method is currently available for the quantitative isolation of native or residual lignin without the risk of structural changes during isolation. The potential benefits of finding alternative uses for lignin are extensive, and with a double effect: lignin can replace fossil-based raw materials in a wide range of products, from plastics to individual chemical products, activated carbon, motor fuels, and carbon fibers, and if there is a market for lignin in such value-added products, mills will also have an additional economic incentive to take measures toward higher energy efficiency. In this study, residual lignin was isolated from areca nut shells by acid hydrolysis, analyzed and characterized by Fourier transform infrared (FTIR) spectroscopy and LC-MS, and the complexity of its structure was investigated by NMR.
Keywords: areca nut, lignin, wood, bioenergy
Procedia PDF Downloads 473
19462 Solution of Hybrid Fuzzy Differential Equations
Authors: Mahmood Otadi, Maryam Mosleh
Abstract:
Hybrid differential equations have a wide range of applications in science and engineering. In this paper, the homotopy analysis method (HAM) is applied to obtain series solutions of hybrid differential equations. Using HAM, it is possible to find the exact solution or an approximate solution of the problem. Comparisons are made between the improved predictor-corrector method, the homotopy analysis method, and the exact solution. Finally, we illustrate our approach with some numerical examples.
Keywords: fuzzy number, fuzzy ODE, HAM, approximate method
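For reference, HAM constructs a homotopy between an easy initial guess and the true solution. The standard zeroth-order deformation equation (the general HAM machinery, not specific to this paper's fuzzy setting) reads:

```latex
(1-q)\,\mathcal{L}\!\left[\phi(t;q) - u_0(t)\right]
  = q\,\hbar\,\mathcal{N}\!\left[\phi(t;q)\right], \qquad q \in [0,1],
```

where $\mathcal{L}$ is an auxiliary linear operator, $u_0$ the initial guess, $\hbar \neq 0$ the convergence-control parameter, and $\mathcal{N}$ the nonlinear operator of the governing equation; as $q$ runs from 0 to 1, $\phi$ deforms from $u_0$ to the solution $u(t) = u_0(t) + \sum_{m \ge 1} u_m(t)$, whose terms come from the higher-order deformation equations.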
Procedia PDF Downloads 510
19461 Optimal Control of Volterra Integro-Differential Systems Based on Legendre Wavelets and Collocation Method
Authors: Khosrow Maleknejad, Asyieh Ebrahimzadeh
Abstract:
In this paper, the numerical solution of the optimal control problem (OCP) for systems governed by a Volterra integro-differential (VID) equation is considered. The method is developed by means of Legendre wavelet approximation and the collocation method. The properties of Legendre wavelets, together with Gaussian integration, are utilized to reduce the problem to a nonlinear programming problem. Some numerical examples are given to confirm the accuracy and ease of implementation of the method.
Keywords: collocation method, Legendre wavelet, optimal control, Volterra integro-differential equation
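The Legendre wavelet basis referred to here is commonly defined on [0, 1) as follows (the usual construction; conventions may differ slightly from the paper's):

```latex
\psi_{nm}(t) =
\begin{cases}
\sqrt{m+\tfrac12}\;2^{k/2}\,P_m\!\left(2^{k}t-\hat{n}\right),
 & \dfrac{\hat{n}-1}{2^{k}} \le t < \dfrac{\hat{n}+1}{2^{k}},\\[4pt]
0, & \text{otherwise},
\end{cases}
\qquad \hat{n}=2n-1,
```

with $n = 1, \dots, 2^{k-1}$, $m = 0, \dots, M-1$, and $P_m$ the Legendre polynomial of degree $m$. The unknowns of the OCP are expanded in this basis, and collocation turns the VID dynamics into algebraic constraints of the nonlinear program.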
Procedia PDF Downloads 387
19460 A Class of Third Derivative Four-Step Exponential Fitting Numerical Integrator for Stiff Differential Equations
Authors: Cletus Abhulimen, L. A. Ukpebor
Abstract:
In this paper, we construct a class of four-step, third-derivative, exponentially fitted integrators of order six for the numerical integration of stiff initial-value problems of the type y' = f(x, y); y(x₀) = y₀. The implicit method has free parameters that allow it to be fitted automatically to exponential functions. For effective implementation of the proposed method, we adopted the technique of splitting the method into predictor and corrector schemes. The numerical stability of the new method is analyzed; the results show that the method is A-stable. Finally, numerical examples are presented to show the efficiency and accuracy of the new method.
Keywords: third derivative four-step, exponentially fitted, A-stable, stiff differential equations
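The fitting idea, shown in its simplest one-step form only to illustrate the principle (the paper's four-step, third-derivative scheme is substantially more elaborate): choose the coefficients so the formula is exact for exponentials, e.g. the exponentially fitted Euler rule

```latex
y_{n+1} = y_n + h\,\varphi(h\lambda)\, f(x_n, y_n),
\qquad \varphi(z) = \frac{e^{z}-1}{z},
```

which integrates $y' = \lambda y$ exactly for any step size $h$. In the multistep setting, the free parameters of the method play the role of $\varphi$ and are tuned to the stiff exponential modes of the problem.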
Procedia PDF Downloads 262
19459 Global Optimization: The Alienor Method Mixed with Piyavskii-Shubert Technique
Authors: Guettal Djaouida, Ziadi Abdelkader
Abstract:
In this paper, we study a coupling of the Alienor method with the Piyavskii-Shubert algorithm. Classical multidimensional global optimization methods involve great implementation difficulties in high dimensions. The Alienor method transforms a multivariable function into a function of a single variable, for which efficient and rapid methods for computing the global optimum can be used. This simplification is based on the use of a reducing transformation called Alienor.
Keywords: global optimization, reducing transformation, α-dense curves, Alienor method, Piyavskii-Shubert algorithm
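Once the Alienor α-dense curve has reduced the problem to a single variable, the univariate search can be done with the Piyavskii-Shubert sawtooth-cover algorithm. A minimal sketch for minimizing a Lipschitz function on [a, b] (the test function and Lipschitz constant below are illustrative):

```python
import heapq
import math

def piyavskii_shubert(f, a, b, L, tol=1e-6, max_iter=10_000):
    """Minimize a Lipschitz-continuous f on [a, b] with constant L.
    Each interval carries a lower bound from its two sawtooth cones;
    always refine the interval with the smallest lower bound."""
    def bound(lo, flo, hi, fhi):
        x = 0.5 * (lo + hi) + (flo - fhi) / (2 * L)   # cone intersection
        lb = 0.5 * (flo + fhi) - 0.5 * L * (hi - lo)  # bound value there
        return lb, x

    fa, fb = f(a), f(b)
    best_x, best_f = (a, fa) if fa <= fb else (b, fb)
    lb, _ = bound(a, fa, b, fb)
    heap = [(lb, a, fa, b, fb)]
    for _ in range(max_iter):
        lb, lo, flo, hi, fhi = heapq.heappop(heap)
        if best_f - lb < tol:                 # certified global tolerance met
            break
        _, x = bound(lo, flo, hi, fhi)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
        for seg in ((lo, flo, x, fx), (x, fx, hi, fhi)):
            nlb, _ = bound(*seg)
            heapq.heappush(heap, (nlb, *seg))
    return best_x, best_f

# Example: global minimum of a multimodal function on [0, 10] (|f'| <= 2).
print(piyavskii_shubert(lambda x: math.sin(x) + math.sin(3 * x) / 3, 0, 10, L=2.0))
```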
Procedia PDF Downloads 499
19458 Improving Security by Using Secure Servers Communicating via Internet with Standalone Secure Software
Authors: Carlos Gonzalez
Abstract:
This paper describes the use of the Internet as a feature to enhance the security of software that is to be distributed or sold to users potentially all over the world. By placing some of the features of the secure software on a secure server, we increase the security of that software. Communication between the protected software and the secure server is done by a double lock algorithm. The paper also includes an analysis of intruders and describes possible responses to detected threats.
Keywords: internet, secure software, threats, cryptography process
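The double lock algorithm itself is not detailed in the abstract. As a rough illustration of the kind of two-way handshake such a scheme implies, here is a generic mutual challenge-response sketch over a pre-shared key; all names are hypothetical and this is not the paper's protocol:

```python
import hashlib
import hmac
import secrets

SHARED_KEY = secrets.token_bytes(32)   # provisioned out-of-band (hypothetical)

def make_challenge():
    return secrets.token_bytes(16)

def respond(key, challenge):
    return hmac.new(key, challenge, hashlib.sha256).digest()

# Lock 1: the client proves knowledge of the key to the server.
server_challenge = make_challenge()
client_proof = respond(SHARED_KEY, server_challenge)
assert hmac.compare_digest(client_proof, respond(SHARED_KEY, server_challenge))

# Lock 2: the server proves itself back to the client.
client_challenge = make_challenge()
server_proof = respond(SHARED_KEY, client_challenge)
assert hmac.compare_digest(server_proof, respond(SHARED_KEY, client_challenge))
# Only after both "locks" open would the server release the protected feature.
```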
Procedia PDF Downloads 331
19457 Geographic Information System and Ecotourism Sites Identification of Jamui District, Bihar, India
Authors: Anshu Anshu
Abstract:
In the red corridor famed for Left Wing Extremism lies the small district of Jamui in Bihar, India. The district lies at 24º20´ N latitude and 86º13´ E longitude, covering an area of 3,122.8 km². The undulating topography, with widespread forests, provides a pristine environment for an invigorating tourist experience. The natural landscape of forests, wildlife, and rivers, and a cultural landscape dotted with historical and religious places, are highly suitable for tourism. The study is primarily concerned with the identification of potential ecotourism sites using a Geographic Information System: data preparation and analysis were carried out, and ecotourism sites were then identified. The secondary data used are Survey of India topographical sheets at R.F. 1:50,000 covering the Jamui district and the District Census Handbook, Census of India, 2011. ERDAS Imagine and ArcView were used for digitization and for creating DEMs (Digital Elevation Models) of the district, depicting relief and topography, and for generating thematic maps. The thematic maps were refined using geo-processing tools, and the buffer technique was used for accessibility analysis. Finally, all the maps, including the buffer maps, were overlaid to find the areas with potential for the development of ecotourism sites in the Jamui district. Spatial data (relief, slopes, settlements, transport network, and forests of Jamui district) were marked and identified, followed by buffer analysis to determine the accessibility of features such as roads and railway stations from the sites available for ecotourism development. Buffer analysis was also carried out to obtain the spatial proximity of major river banks, lakes, and dam sites selected for promoting sustainable ecotourism. Overlay analysis was conducted using geo-processing tools: a Digital Terrain Model (DEM) was generated, and relevant themes such as roads, forest areas, and settlements were draped on the DEM to assess the topography and other land uses of the district and delineate potential zones of ecotourism development. The development of ecotourism in Jamui faces several challenges. The district lies in the portion of Bihar that is part of the 'red corridor' of India; the hills and dense forests are prominent hideouts and training grounds for extremists. It is well known that political instability, war, and acts of violence directly influence the propensity to travel and hinder all non-essential travel to such areas. The development of ecotourism in the district can bring change and overall growth, with communities becoming more involved in economically sustainable activities. It is a known fact that poverty and social exclusion are the main forces that push people toward violence. All over the world, tourism has been used as a tool to eradicate poverty and generate goodwill among people. Tourism in a sustainable form should be promoted in the district to integrate local communities in the development process and to distribute the fruits of development with equity.
Keywords: buffer analysis, digital elevation model, ecotourism, red corridor
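The buffer-and-overlay workflow described above can be sketched in a few lines with a geometry library. A minimal illustration with Shapely, using made-up layers in projected coordinates (metres); the buffer distances and site locations are hypothetical, not the study's data:

```python
from shapely.geometry import Point, LineString
from shapely.ops import unary_union

# Hypothetical digitized layers: roads, a river, and candidate sites.
roads = [LineString([(0, 0), (5_000, 1_000)]),
         LineString([(2_000, 2_000), (6_000, 2_600)])]
rivers = [LineString([(0, 3_000), (7_000, 2_500)])]
candidates = {"Site A": Point(2_500, 1_200),
              "Site B": Point(6_500, 4_800),
              "Site C": Point(3_500, 2_700)}

road_buffer = unary_union([r.buffer(1_000) for r in roads])   # 1 km access zone
river_buffer = unary_union([r.buffer(500) for r in rivers])   # 500 m riparian zone

# Overlay: a site is "potential" if it is both accessible and near water.
suitable = road_buffer.intersection(river_buffer)
for name, pt in candidates.items():
    print(name, "->", "potential" if suitable.contains(pt) else "excluded")
```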
Procedia PDF Downloads 257
19456 Formulation of Corrector Methods from 3-Step Hybid Adams Type Methods for the Solution of First Order Ordinary Differential Equation
Authors: Y. A. Yahaya, Ahmad Tijjani Asabe
Abstract:
This paper focuses on the formulation of a 3-step hybrid Adams-type method for the solution of first-order ordinary differential equations (ODEs). The methods were derived on both grid and off-grid points using multistep collocation schemes and evaluated at selected points to produce a block Adams-type method and an Adams-Moulton method, respectively. The method with the highest order was selected to serve as the corrector. The method is convergent and efficient. Numerical experiments were carried out and reveal that the hybrid Adams-type methods perform better than the conventional Adams-Moulton method.
Keywords: Adams-Moulton type (AMT), corrector method, off-grid, block method, convergence analysis
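For reference, the conventional (non-hybrid) 3-step Adams-Moulton corrector against which such hybrid methods are usually compared is

```latex
y_{n+1} = y_n + \frac{h}{24}\left( 9 f_{n+1} + 19 f_n - 5 f_{n-1} + f_{n-2} \right),
```

an order-4 implicit formula; the hybrid variant adds collocation at off-grid points to raise the attainable order.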
Procedia PDF Downloads 623
19455 Estimation of Train Operation Using an Exponential Smoothing Method
Authors: Taiyo Matsumura, Kuninori Takahashi, Takashi Ono
Abstract:
The purpose of this research is to improve the convenience of waiting for trains at level crossings and stations, and to prevent accidents resulting from forcible entry into level crossings, by providing level crossing users and passengers with information that tells them when the next train will pass through or arrive. In this paper, we propose methods for estimating train operation by means of an average value method, a variable response smoothing method, and an exponential smoothing method, on the basis of open data that has low accuracy but whose performance schedules are distributed in real time. We then examined the accuracy of the estimations; the results showed that applying an exponential smoothing method is valid.
Keywords: exponential smoothing method, open data, operation estimation, train schedule
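Simple exponential smoothing applied to this setting, as a minimal sketch (the smoothing constant and delay values are illustrative, not the paper's data):

```python
def exponential_smoothing(delays, alpha=0.3):
    """Simple exponential smoothing of observed arrival delays (seconds):
    s_t = alpha * x_t + (1 - alpha) * s_{t-1}.  The last smoothed value
    serves as the one-step-ahead estimate of the next train's delay."""
    s = delays[0]
    for x in delays[1:]:
        s = alpha * x + (1 - alpha) * s
    return s

# Delays (s) of recent trains past a crossing, taken from real-time open data:
observed = [40, 55, 30, 65, 70, 60]
print(f"estimated next delay: {exponential_smoothing(observed):.1f} s")
```

Larger alpha reacts faster to sudden disruptions; smaller alpha damps the noise in low-accuracy open data.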
Procedia PDF Downloads 387
19454 The Use of Bleomycin and Analogues to Probe the Chromatin Structure of Human Genes
Authors: Vincent Murray
Abstract:
The chromatin structure at the transcription start sites (TSSs) of genes is very important in the control of gene expression. In order for gene expression to occur, the chromatin structure at the TSS has to be altered so that the transcriptional machinery can be assembled and RNA transcripts can be produced. In particular, the nucleosome structure and positioning around the TSS has to be changed. Bleomycin is utilized as an anti-tumor agent to treat Hodgkin's lymphoma, squamous cell carcinoma, and testicular cancer. Bleomycin produces DNA damage in human cells, and DNA strand breaks, especially double-strand breaks, are thought to be responsible for the cancer chemotherapeutic activity of bleomycin. Bleomycin is a large glycopeptide with a molecular weight of approximately 1500 Daltons, and hence its DNA strand cleavage activity can be utilized as a probe of chromatin structure. In this project, Illumina next-generation DNA sequencing technology was used to determine the position of DNA double-strand breaks at the TSSs of genes in intact cells. In this genome-wide study, it was found that bleomycin cleavage preferentially occurred at the TSSs of actively transcribed human genes in comparison with non-transcribed genes. There was a correlation between the level of enhanced bleomycin cleavage at TSSs and the degree of transcriptional activity. In addition, bleomycin was able to determine the position of nucleosomes at the TSSs of human genes. Bleomycin analogues were also utilized as probes of chromatin structure at the TSSs of human genes. In a similar manner to bleomycin, the bleomycin analogues 6′-deoxy-BLM Z and zorbamycin preferentially cleaved at the TSSs of human genes. Interestingly, this degree of enhanced TSS cleavage inversely correlated with the cytotoxicity (IC50 values) of BLM analogues. This indicated that the degree of cleavage by bleomycin analogues at the TSSs of human genes was very important in the cytotoxicity of bleomycin and analogues. It also provided a deeper insight into the mechanism of action of this cancer chemotherapeutic agent, since actively transcribed genes were preferentially targeted.
Keywords: anti-cancer activity, chromatin structure, cytotoxicity, gene expression, next-generation DNA sequencing
Procedia PDF Downloads 114
19453 A Review on Higher-Order Spline Techniques for Solving Burgers Equation Using B-Spline Methods and Variation of B-Spline Techniques
Authors: Maryam Khazaei Pool, Lori Lewis
Abstract:
This is a summary of articles based on higher-order B-spline methods and variations of B-spline methods, such as the quadratic B-spline finite element method, the exponential cubic B-spline method, the septic B-spline technique, the quintic B-spline Galerkin method, and B-spline Galerkin methods based on the quadratic B-spline Galerkin method (QBGM) and the cubic B-spline Galerkin method (CBGM). In this paper, we study B-spline methods and variations of B-spline techniques for finding a numerical solution to the Burgers' equation. A set of fundamental definitions, including the Burgers equation, spline functions, and B-spline functions, is provided. For each method, the main technique is discussed, as well as the discretization and stability analysis. A summary of the numerical results is provided and the efficiency of each presented method is discussed; a general conclusion compares the computational results of all the presented schemes and describes their effectiveness and advantages.
Keywords: Burgers' equation, septic B-spline, modified cubic B-spline differential quadrature method, exponential cubic B-spline technique, B-spline Galerkin method, quintic B-spline Galerkin method
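The target problem shared by all of the reviewed schemes is the one-dimensional Burgers' equation

```latex
\frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x}
 = \nu\,\frac{\partial^{2} u}{\partial x^{2}},
\qquad u(x,0) = u_0(x),
```

with kinematic viscosity $\nu > 0$. Each B-spline method expands $u(x,t) \approx \sum_j \delta_j(t)\,B_j(x)$ over its chosen spline basis $B_j$ and reduces the PDE to a system of ODEs for the time-dependent coefficients $\delta_j(t)$.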
Procedia PDF Downloads 123
19452 Mechanical Characterization of Banana by Inverse Analysis Method Combined with Indentation Test
Authors: Juan F. P. Ramírez, Jésica A. L. Isaza, Benjamín A. Rojano
Abstract:
This study proposes a novel method to determine the mechanical properties of fruits by the use of indentation tests. The method combines experimental results with a numerical finite element model. The results presented correspond to a simplified numerical model of banana: the banana is assumed to be a single-layer material with isotropic, linear elastic mechanical behavior, and the Young's modulus found is 0.3 MPa. The method will be extended to multilayer models in further studies.
Keywords: finite element method, fruits, inverse analysis, mechanical properties
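The inverse-analysis loop amounts to adjusting the model's material parameters until its predicted force-displacement curve matches the measured indentation data. A minimal sketch substituting the analytic Hertz spherical-indentation model for the paper's finite element model (indenter radius, Poisson's ratio, and measurement values are all hypothetical):

```python
import numpy as np
from scipy.optimize import least_squares

def hertz_force(depth, E, R=0.004, nu=0.49):
    """Hertzian spherical-indenter model, a stand-in for the FE model:
    F = (4/3) * E/(1 - nu^2) * sqrt(R) * depth^1.5."""
    return (4.0 / 3.0) * (E / (1 - nu**2)) * np.sqrt(R) * depth**1.5

# Hypothetical indentation measurements: depth (m) vs. force (N).
depth = np.linspace(0.0, 0.002, 20)
measured = hertz_force(depth, E=0.3e6) + np.random.normal(0, 1e-4, depth.size)

# Inverse analysis: find the Young's modulus whose predicted curve
# best matches the measured force-displacement data.
fit = least_squares(lambda E: hertz_force(depth, E[0]) - measured,
                    x0=[1e6], bounds=(1e3, 1e9))
print(f"identified E = {fit.x[0] / 1e6:.3f} MPa")  # ~0.3 MPa in this synthetic run
```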
Procedia PDF Downloads 356
19451 Linear Array Geometry Synthesis with Minimum Sidelobe Level and Null Control Using Taguchi Method
Authors: Amara Prakasa Rao, N. V. S. N. Sarma
Abstract:
This paper describes the synthesis of linear array geometry with minimum sidelobe level and null control using the Taguchi method. Based on the concept of the orthogonal array, the Taguchi method effectively reduces the number of tests required in an optimization process. It has been successfully applied in many fields, such as mechanical and chemical engineering, power electronics, etc. Compared to other evolutionary methods, such as genetic algorithms, simulated annealing, and particle swarm optimization, the Taguchi method is much easier to understand and implement, and it requires less computation and fewer iterations to optimize the problem. Different cases are considered to illustrate the performance of this technique. Simulation results show that this method outperforms other evolutionary algorithms (such as GA and PSO) for smart antenna system design.
Keywords: array factor, beamforming, null placement, optimization method, orthogonal array, Taguchi method, smart antenna system
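In geometry synthesis, each Taguchi orthogonal-array trial proposes a set of element positions, which is then scored on the array factor's sidelobe and null behavior. A minimal sketch of that scoring step for a symmetric, uniformly excited linear array (the candidate positions and null direction are illustrative):

```python
import numpy as np

def array_factor_db(positions, theta, wavelength=1.0):
    """Normalized array factor (dB) of a symmetric linear array with
    isotropic, uniformly excited elements at +/- the given positions."""
    k = 2 * np.pi / wavelength
    af = np.sum(np.cos(np.outer(k * np.cos(theta), positions)), axis=1)
    af = np.abs(af) / np.abs(af).max()
    return 20 * np.log10(np.maximum(af, 1e-6))

theta = np.linspace(0, np.pi, 2001)
# Half-array element positions (in wavelengths): one candidate geometry
# that a Taguchi orthogonal-array search would evaluate.
positions = np.array([0.25, 0.75, 1.30, 1.95])
af_db = array_factor_db(positions, theta)

# Objective pieces: pattern level at a desired null direction, peak level.
null_idx = np.argmin(np.abs(theta - np.radians(50)))
print(f"level at 50 deg: {af_db[null_idx]:.1f} dB (deeper is better)")
```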
Procedia PDF Downloads 392
19450 Acid Fuchsin Dye Based PMMA Film for Holographic Investigations
Authors: G. Vinitha, A. Ramalingam
Abstract:
In view of possible application in optical data storage devices, the diffraction grating efficiency of an organic dye, Acid Fuchsin, doped in a PMMA matrix was studied under excitation with a CW diode-pumped Nd:YAG laser at 532 nm. The open-aperture Z-scan of the dye-doped polymer displayed saturable absorption, and the closed-aperture Z-scan exhibited negative nonlinearity. The diffraction efficiency of the grating is the ratio of the first-order diffracted power to the incident read beam power. The dye-doped polymer films were found to be good recording media. It was observed that grating formation strongly depends on the concentration of dye in the polymer film, the intensity ratio of the writing beams, and the angle between the writing beams. Efficient writing was achieved at an angle of 20° and when the intensity ratio of the writing beams was unity.
Keywords: diffraction efficiency, nonlinear optical material, saturable absorption, surface-relief-gratings
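In symbols, with $P_{+1}$ the power in the first diffracted order and $P_{\text{read}}$ the incident read beam power, the definition stated above is simply:

```latex
\eta = \frac{P_{+1}}{P_{\text{read}}} \times 100\%.
```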
Procedia PDF Downloads 298
19449 Residual Power Series Method for System of Volterra Integro-Differential Equations
Authors: Zuhier Altawallbeh
Abstract:
This paper investigates approximate analytical solutions of a general form of Volterra integro-differential equation systems using the residual power series method (RPSM for short). The proposed method produces the solutions as convergent series, requires no linearization or small perturbation, and reproduces the exact solution when the solution is polynomial. Some examples are given to demonstrate the simplicity and efficiency of the proposed method. Comparisons with the Laplace decomposition algorithm verify that the new method is very effective and convenient for solving systems of pantograph equations.
Keywords: integro-differential equation, pantograph equations, system of initial value problems, residual power series method
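The RPSM machinery in outline, stated here for a single scalar equation (the standard construction; the paper treats systems): write the solution as a power series about t = 0, truncate, and force derivatives of the residual to vanish at the origin:

```latex
u_k(t) = \sum_{n=0}^{k} c_n t^{\,n},
\qquad
\mathrm{Res}_k(t) = u_k'(t) - F\!\left(t, u_k(t)\right)
   - \int_{0}^{t} K(t,s)\, u_k(s)\, ds,
\qquad
\left.\frac{d^{\,k-1}}{dt^{\,k-1}} \mathrm{Res}_k(t)\right|_{t=0} = 0,
```

which determines the coefficients $c_k$ one at a time, starting from the initial condition $c_0 = u(0)$.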
Procedia PDF Downloads 417
19448 Multimodal Biometric Cryptography Based Authentication in Cloud Environment to Enhance Information Security
Authors: D. Pugazhenthi, B. Sree Vidya
Abstract:
Cloud computing is one of the emerging technologies that enables end users to use cloud services on a 'pay per usage' basis. The technology is growing at a fast pace, and so is its security threat. Among the various services provided by the cloud is storage, in which security plays a vital role both for authenticating legitimate users and for protecting information. This paper presents efficient ways of authenticating users as well as securing information on the cloud. The initial phase proposed in this paper deals with an authentication technique using a multi-factor, multi-dimensional authentication system with multi-level security. Unique identification and slow intrusiveness give user-behaviour-based biometrics greater reliability than conventional password authentication. With biometric systems, accounts are accessed only by a legitimate user and not by an impostor. The biometric templates employed here include not a single trait but multiple ones, viz., iris and fingerprints. The coordinating stage of the authentication system is based on an ensemble support vector machine (SVM), optimized by assembling the weights of the base SVMs after each individual SVM of the ensemble is trained by the artificial fish swarm algorithm (AFSA). This helps in generating a user-specific secure cryptographic key from the multimodal biometric template by a fusion process. The data security problem is averted, and an enhanced security architecture is proposed using an encryption and decryption system with double-key cryptography based on a fuzzy neural network (FNN) for data storage and retrieval in cloud computing. The proposed scheme aims to protect records from hackers by preventing the ciphertext from being broken back to the original text. This improves authentication performance: the proposed double cryptographic key scheme is capable of providing better user authentication and better security, distinguishing between genuine and fake users. There are thus three important modules in this proposed work: 1) feature extraction, 2) multimodal biometric template generation, and 3) cryptographic key generation. The extraction of feature and texture properties from the respective fingerprint and iris images is done initially. Finally, with the help of the fuzzy neural network and a symmetric cryptography algorithm, the double-key encryption technique is developed. As the proposed approach is based on neural networks, it has the advantage of not being decryptable by a hacker even if the data have already been hacked. The results prove that the authentication process is optimal and the stored information is secured.
Keywords: artificial fish swarm algorithm (AFSA), biometric authentication, decryption, encryption, fingerprint, fusion, fuzzy neural network (FNN), iris, multi-modal, support vector machine classification
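As a toy illustration of the fusion-then-key step only (this is not the paper's FNN/AFSA pipeline, and a production biometric key binding would need an error-tolerant scheme such as a fuzzy extractor; all values are hypothetical):

```python
import hashlib
import numpy as np

def derive_key(iris_features, finger_features, n_bits=256):
    """Concatenate the two normalized feature vectors, binarize them
    against their median (crude quantization), and hash the resulting
    bit-string into a fixed-length symmetric key."""
    fused = np.concatenate([iris_features, finger_features])
    bits = (fused > np.median(fused)).astype(np.uint8)
    return hashlib.sha256(bits.tobytes()).digest()[: n_bits // 8]

iris = np.random.rand(128)     # hypothetical extracted iris texture features
finger = np.random.rand(128)   # hypothetical fingerprint minutiae features
key = derive_key(iris, finger)
print(key.hex())
```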
Procedia PDF Downloads 259
19447 A Method for Improving the Embedded Runge Kutta Fehlberg 4(5)
Authors: Sunyoung Bu, Wonkyu Chung, Philsu Kim
Abstract:
In this paper, we introduce a method for improving the embedded Runge-Kutta-Fehlberg 4(5) method. At each integration step, the proposed method comprises two equations, one for the solution and one for the error. The solution and error are obtained by solving an initial value problem whose solution carries the information of the error at each integration step. The constructed algorithm controls both the error and the time step size simultaneously and performs well in computational cost compared to the original method. To assess its effectiveness, the EULR problem is solved numerically.
Keywords: embedded Runge-Kutta-Fehlberg method, initial value problem, EULR problem, integration step
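For context, the classical step-size control that an embedded 4(5) pair performs, and which the paper improves upon, can be sketched as follows (the standard controller, not the authors' modified scheme; safety factor and clamps are conventional choices):

```python
def adapt_step(h, err, tol, order=5, safety=0.9, h_min=1e-10, h_max=1.0):
    """Standard controller for an embedded pair such as RKF4(5):
    accept the step if the local error estimate is within tolerance,
    and rescale h by (tol/err)^(1/order) either way."""
    accept = err <= tol
    factor = safety * (tol / max(err, 1e-16)) ** (1.0 / order)
    factor = min(max(factor, 0.2), 5.0)        # avoid wild step changes
    h_new = min(max(h * factor, h_min), h_max)
    return accept, h_new

# e.g. err = ||y5 - y4|| from the 4th- and 5th-order embedded solutions:
print(adapt_step(h=0.1, err=3e-7, tol=1e-6))
```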
Procedia PDF Downloads 461
19446 Seat Assignment Model for Student Admissions Process at Saudi Higher Education Institutions
Authors: Mohammed Salem Alzahrani
Abstract:
In this paper, the student admission process is studied in order to optimize the assignment of vacant seats with three main objectives: utilizing all vacant seats, satisfying all admission requirements of the programs of study, and maintaining fairness among all candidates. The Seat Assignment Method (SAM) is used to build the model and solve the optimization problem with the help of the Northwest Corner Method and the Least Cost Method. A closed formula is derived for applying the priority of assigning a seat to a candidate based on SAM.
Keywords: admission process model, assignment problem, Hungarian Method, Least Cost Method, Northwest Corner Method, SAM
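The Northwest Corner Method named above produces an initial basic feasible solution for a transportation-type problem. A minimal sketch (the framing of candidate pools as rows and program vacancies as columns is an illustrative assumption, not the paper's exact model):

```python
def northwest_corner(supply, demand):
    """Start at the top-left cell, allocate as much as possible, move right
    when a column's demand is filled and down when a row's supply is spent."""
    supply, demand = supply[:], demand[:]       # work on copies
    alloc = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        q = min(supply[i], demand[j])
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            i += 1
        else:
            j += 1
    return alloc

# Hypothetical example: candidate pools (rows) vs. program vacancies (cols).
print(northwest_corner([30, 40, 30], [20, 50, 30]))
```

The Least Cost Method follows the same allocation loop but always picks the cell with the smallest cost instead of the top-left one.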
Procedia PDF Downloads 494
19445 A Succinct Method for Allocation of Reactive Power Loss in Deregulated Scenario
Authors: J. S. Savier
Abstract:
Real power is the component of power that is converted into useful energy, whereas reactive power is the component that cannot be converted to useful energy but is required for the magnetization of various electrical machines. If reactive power is compensated at the consumer end, the need for reactive power to flow from generators to the load can be avoided, and hence the overall power loss can be reduced. In this context, this paper presents a succinct method, called the JSS method, for allocating reactive power losses to consumers connected to radial distribution networks in a deregulated environment. The proposed method has the advantage that no assumptions are made in deriving the reactive power loss allocation.
Keywords: deregulation, reactive power loss allocation, radial distribution systems, succinct method
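As background for the quantity being allocated: on a distribution branch with reactance $X$, serving a node at voltage magnitude $\lvert V \rvert$ and carrying real and reactive flows $P$ and $Q$, the reactive power consumed by the branch is

```latex
Q_{\text{loss}} = X \,\frac{P^{2} + Q^{2}}{\lvert V \rvert^{2}},
```

and a loss-allocation scheme such as the JSS method (whose specific rule is defined in the paper) apportions the sum of these branch losses among the downstream consumers.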
Procedia PDF Downloads 375
19444 Modification of Underwood's Equation to Calculate Minimum Reflux Ratio for Column with One Side Stream Upper Than Feed
Authors: S. Mousavian, A. Abedianpour, A. Khanmohammadi, S. Hematian, Gh. Eidi Veisi
Abstract:
Distillation is one of the most important and most utilized separation methods in industrial practice. There are different ways to design a distillation column; one of these is the shortcut method, in which material balances and equilibrium relations are employed to calculate the number of trays in the column. Several methods are classified as shortcut methods; one of them is the Fenske-Underwood-Gilliland method, in which the minimum reflux ratio is calculated by the Underwood equation. Underwood proposed an equation that is useful for a simple distillation column with one feed and one top and one bottom product. In this study, the Underwood method is extended to predict the minimum reflux ratio for a column with one side stream above the feed. The results of this model are compared with the McCabe-Thiele method; the comparison shows that the proposed method is able to calculate the minimum reflux ratio with very small error.
Keywords: minimum reflux ratio, side stream, distillation, Underwood's method
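For reference, the classical Underwood relations that the modification starts from (feed composition $z_{F,i}$, distillate composition $x_{D,i}$, relative volatilities $\alpha_i$, feed quality $q$):

```latex
\sum_{i} \frac{\alpha_i\, z_{F,i}}{\alpha_i - \theta} = 1 - q,
\qquad
R_{\min} + 1 = \sum_{i} \frac{\alpha_i\, x_{D,i}}{\alpha_i - \theta},
```

where $\theta$ is the Underwood root lying between the relative volatilities of the key components; the paper's modification accounts additionally for the side-stream product drawn above the feed.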
Procedia PDF Downloads 404