Search results for: smooth and nonsmooth functions
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2902


2512 The Juxtaposition of Home in Toni Morrison's Home: Ironic Functions as Trauma and Healing

Authors: Imas Istiani

Abstract:

The concept of home is usually closely related to a place of safety and security. People who have travelled far and long yearn to be reunited with home, to feel safe, secure, and comfortable. For some people, however, especially veterans, home cannot offer those feelings; on the contrary, it can produce a sense of insecurity as well as guilt. This juxtaposed concept can also cast home as an uncanny place that represses and haunts its occupants. For veterans, 'survivor guilt' overpowers them to the point that it becomes hard to embrace the comfort that home offers. In Home, Toni Morrison poignantly depicts Frank's life upon returning from the war. Burdened with his traumatic experiences, Frank finds home full of terror, guilt, fear, grief, and loss. Using Dominick LaCapra's 'Trauma Theory,' the study finds that Frank works through his trauma by learning to distinguish between past and present, so that he can overcome those repressed feelings. Aside from his inner healing power, Frank carries out the process of working through with the help of home and community, as proposed by Evelyn Jaffe Schreiber, who claims that community can help survivors heal from traumatic experiences. Thus, home has two juxtaposed functions: it is both a traumatizing and a healing place.

Keywords: trauma, healing, home, trauma theory

Procedia PDF Downloads 222
2511 Exploring Methods for Urbanization of 'Village in City' in China: A Case Study of Hangzhou

Authors: Yue Wang, Fan Chen

Abstract:

After the economic reform of 1978, urbanization in China grew fast, driving cities to expand at an unprecedented speed. Surrounding villages were annexed unprepared, producing a new type of community called the 'village in city.' Two things happened there. First, having lost their land, the locals gave up farming and turned to secondary and tertiary industry. Secondly, attracted by the high incomes in cities and the low rents in these communities, large numbers of migrants moved in. Such areas matter to a rapidly growing city because they provide a transitional zone, but owing to their passivity and low level of development, 'villages in city' have also caused the city considerable trouble. Population and construction densities are both high, while facilities are severely inadequate. Unplanned and illegal structures are built, creating a complex mixed-function area and a poor residential environment. In addition, the locals hold a strong consciousness of their property rights over the land, which holds back the transformation and development of the community. Although land capitalization can bring significant benefits, large financial compensation to the locals is inappropriate, and given the large population of city migrants, it is important to explore the relationship among the 'village in city,' city immigrants, and the city itself. Taking Hangzhou as an example, this paper analyzes the development process, functional spatial distribution, industrial structure, and current traffic system of the 'village in city.' Building on this research, the paper puts forward a general planning method: adding city functions, building civil facilities, re-planning the functional spatial distribution, changing the composition of local industry, and planning a new traffic system. Under this plan, the 'village in city' can finally be absorbed into the city and make its own contribution to urbanization.

Keywords: China, city immigrant, urbanization, village in city

Procedia PDF Downloads 196
2510 An Appraisal of Maintenance Management Practices in Federal University Dutse and Jigawa State Polytechnic Dutse, Nigeria

Authors: Aminu Mubarak Sadis

Abstract:

This study appraised maintenance management practice at Federal University Dutse and Jigawa State Polytechnic Dutse, in Nigeria. The Physical Planning, Works and Maintenance Departments of the two institutions are responsible for the production and maintenance management of their physical assets. Over-enrollment has been a common problem in Nigerian higher institutions. Data were collected through administered questionnaires and subsequent oral interviews to authenticate the completed questionnaires. A random sampling technique was used to select 150 respondents across the two institutions. The data collected were analyzed using the Statistical Package for the Social Sciences (SPSS) and t-test statistical techniques. The conclusion was that maintenance management activities are yet to be given appropriate attention within the functions of the university and polytechnic, which are crucial to improving teaching, learning, and research. The units responsible for maintaining and managing facilities should focus on their stated functions and effect changes where possible.

Keywords: appraisal, maintenance management, university, polytechnic, practices

Procedia PDF Downloads 221
2509 Speech and Swallowing Function after Tonsillo-Lingual Sulcus Resection with PMMC Flap Reconstruction: A Case Study

Authors: K. Rhea Devaiah, B. S. Premalatha

Abstract:

Background: The tonsillo-lingual sulcus is the area between the tonsils and the base of the tongue. Surgical resection of lesions in the head and neck results in changes in speech and swallowing functions. The severity of the speech and swallowing problems depends upon the site and extent of the lesion, the type and extent of surgery, and the flexibility of the remaining structures. Need of the study: This paper focuses on the importance of speech and swallowing rehabilitation, and on post-operative function, in an individual with a lesion in the tonsillo-lingual sulcus. Aim: To evaluate speech and swallowing functions after intensive speech and swallowing rehabilitation. The objectives are to evaluate speech intelligibility and swallowing functions after intensive therapy and to assess quality of life. Method: The present study reports on a 47-year-old male diagnosed with basaloid squamous cell carcinoma of the left tonsillo-lingual sulcus (pT2N2M0), who underwent wide local excision with left radical neck dissection and PMMC flap reconstruction. Post-surgery, the patient presented with complaints of reduced speech intelligibility and difficulty opening the mouth and swallowing. Detailed evaluations of speech and swallowing functions were carried out, including OPME, an articulation test, speech intelligibility, the different phases of swallowing, and a trismus evaluation. Self-reported questionnaires such as the SHI-E (Speech Handicap Index - Indian English), DHI (Dysphagia Handicap Index), and SESEQ-K (Self Evaluation of Swallowing Efficiency in Kannada) were also administered to capture the patient's own perception of his problem. Based on the evaluation, the patient was diagnosed with pharyngeal-phase dysphagia associated with trismus and reduced speech intelligibility. Intensive speech and swallowing therapy was advised twice weekly in one-hour sessions.
Results: In total, the patient attended 10 intensive speech and swallowing therapy sessions. Results indicated misarticulation of speech sounds, particularly lingua-palatal sounds. Mouth opening was restricted to one finger width, with difficulty chewing, masticating, and swallowing the bolus. Intervention strategies included oromotor exercises, indirect swallowing therapy, use of a trismus device to facilitate mouth opening, and changes in food consistency to aid swallowing. Practice sessions with articulation drills were held to improve the production of speech sounds and speech intelligibility. Significant changes in articulatory production, speech intelligibility, and swallowing abilities were observed. The self-rated quality-of-life measures (DHI, SHI-E, and SESEQ-K) revealed no speech handicap and near-normal swallowing ability, indicating improved QOL after the intensive speech and swallowing therapy. Conclusion: Speech and swallowing therapy after carcinoma of the tonsillo-lingual sulcus is crucial, as the tongue plays an important role in both speech and swallowing. The role of speech-language and swallowing therapists in oral cancer should be highlighted in treating these patients and improving their overall quality of life. With intensive speech-language and swallowing therapy after surgery for oral cancer, there can be significant improvement in speech outcomes and swallowing functions, depending on the site and extent of the lesion, thereby improving the individual's QOL.

Keywords: oral cancer, speech and swallowing therapy, speech intelligibility, trismus, quality of life

Procedia PDF Downloads 83
2508 Consistent Testing for an Implication of Supermodular Dominance with an Application to Verifying the Effect of Geographic Knowledge Spillover

Authors: Chung Danbi, Linton Oliver, Whang Yoon-Jae

Abstract:

Supermodularity, or complementarity, is a popular concept in economics that characterizes many objective functions, such as utility, social welfare, and production functions. Supermodular dominance, in turn, captures a preference for greater interdependence among the inputs of those functions, and it can be applied to examine which input set would produce higher expected utility, social welfare, or production. We therefore propose and justify a consistent test for a useful implication of supermodular dominance. We also conduct Monte Carlo simulations to explore the finite-sample performance of our test, with critical values obtained from the recentered bootstrap method, with and without selective recentering, and from the subsampling method. Under various parameter settings, we confirm that our test has reasonably good size and power performance. Finally, we apply our test to compare geographic and distant knowledge spillovers in terms of their effects on social welfare, using the National Bureau of Economic Research (NBER) patent data. We expect localized citing to supermodularly dominate distant citing if geographic knowledge spillover engenders greater social welfare than distant knowledge spillover. Taking subgroups based on firm and patent characteristics, we find industry-wise and patent-subclass-wise differences in the pattern of supermodular dominance between localized and distant citing. We also compare results across different time periods to see whether the development of Internet and communication technology has changed the pattern of dominance. In addition, to deal appropriately with the sparse nature of the data, we apply high-dimensional methods to efficiently select relevant data.
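As a minimal illustration of the lattice condition behind the abstract's notion of complementarity, the sketch below checks f(x ∨ y) + f(x ∧ y) ≥ f(x) + f(y) on a finite grid. The example function and grid are the editor's own assumptions, not the paper's test statistic:

```python
import numpy as np

def is_supermodular(f, grid):
    """Check the supermodularity condition
    f(max(x, y)) + f(min(x, y)) >= f(x) + f(y)
    for every pair of points on a finite grid."""
    pts = [np.array(p) for p in grid]
    for x in pts:
        for y in pts:
            join = np.maximum(x, y)   # componentwise max (x ∨ y)
            meet = np.minimum(x, y)   # componentwise min (x ∧ y)
            if f(join) + f(meet) < f(x) + f(y) - 1e-12:
                return False
    return True

# A Cobb-Douglas-type production function: inputs are complements
cobb = lambda v: v[0] * v[1]
grid = [(a, b) for a in range(4) for b in range(4)]
print(is_supermodular(cobb, grid))   # True: x*y is supermodular
```

Reversing the sign (f = -x*y) makes the inputs substitutes, and the check fails, which is the distinction the dominance test exploits.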

Keywords: supermodularity, supermodular dominance, stochastic dominance, Monte Carlo simulation, bootstrap, subsampling

Procedia PDF Downloads 103
2507 A Game-Based Methodology to Discriminate Executive Function: A Pilot Study with Institutionalized Elderly People

Authors: Marlene Rosa, Susana Lopes

Abstract:

Few studies explore the potential of board games as a performance measure, despite their promise in the context of frail populations. Board games are immersive and can reduce the pressure of being evaluated. This study aimed to test the ability of game-based strategies to assess executive function in an elderly population. Sixteen elderly participants were included: 10 with impaired executive functions (G1: 85.30±6.00 yrs old; 10 male) and 6 whose executive functions showed no clinically important changes (G2: 76.30±5.19 yrs old; 6 male). Executive function was assessed using the Frontal Assessment Battery (FAB), a quickly applicable cognitive screening test (score < 12 indicates impairment). The board game used in this study was the TATI Hand Game, designed specifically for training rhythmic coordination of the upper limbs with multiple cognitive stimuli. The game comprises 1 table grid, 1 set of Single Game cards (played with one hand), Double Game cards (played simultaneously with two hands), 1 die to plan Single Game mode, cards to plan Double Game mode, 1 bell, and 2 cups. Each participant played 3 Single Game cards, and the following data were collected: (i) variability in time during board game challenges (SD); (ii) number of errors; (iii) execution time (sec). G1 demonstrated higher variability in execution time during board game challenges (G1: 13.0 s vs. G2: 0.5 s), a higher number of errors (1.40 vs. 0.67), and longer execution times (607.80 s vs. 281.83 s). These results demonstrate the potential of board games as a functional assessment strategy in geriatric care. Future studies might include larger samples and statistical methodologies to find cut-off values for impairment in executive function during performance in the TATI game.

Keywords: board game, aging, executive function, evaluation

Procedia PDF Downloads 120
2506 Anisotropic Approach for Discontinuity Preserving in Optical Flow Estimation

Authors: Pushpendra Kumar, Sanjeev Kumar, R. Balasubramanian

Abstract:

Estimating optical flow from a sequence of images using variational methods is one of the most successful approaches. Handling discontinuities between different motions is one of the challenging problems in flow estimation. In this paper, we design a new anisotropic diffusion operator that provides smooth flow within a region while efficiently preserving discontinuities in the optical flow. The operator is built from the intensity differences of the pixels and an isotropic operator, combined through an exponential function, and this combination is used to control the propagation of flow. Experimental results on different datasets verify the robustness and accuracy of the algorithm and validate the effect of the anisotropic operator in preserving discontinuities.
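The abstract does not give the operator's exact form; a standard exponential diffusivity in the same spirit (weights near 1 in smooth regions, near 0 at large intensity differences) can be sketched as follows. The parameter λ and the explicit update step are the editor's assumptions:

```python
import numpy as np

def exponential_diffusivity(I, lam=10.0):
    """Edge-stopping weight g = exp(-(|∇I|/λ)²): close to 1 in smooth
    regions (full smoothing), close to 0 across strong intensity edges
    (discontinuities preserved). λ is an assumed contrast parameter."""
    gy, gx = np.gradient(I.astype(float))
    grad_mag2 = gx**2 + gy**2
    return np.exp(-grad_mag2 / lam**2)

def smooth_flow_once(u, I, lam=10.0, dt=0.2):
    """One explicit diffusion step on a flow component u, weighted by the
    image-driven diffusivity so smoothing stops at intensity edges.
    Periodic boundaries via np.roll, for brevity."""
    g = exponential_diffusivity(I, lam)
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
    return u + dt * g * lap
```

Iterating `smooth_flow_once` on each flow component regularizes the field inside uniformly moving regions while leaving motion boundaries sharp, which is the behavior the abstract describes.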

Keywords: optical flow, variational methods, computer vision, anisotropic operator

Procedia PDF Downloads 845
2505 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions

Authors: Valerii Dashuk

Abstract:

The use of confidence intervals is widespread in economics and econometrics. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of a random variable and its deviation with this technique allows checking simultaneously the shift and the probability of that shift (i.e., portfolio risks). Another application rests on the normal distribution, which is fully defined by its mean and variance and can therefore be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters to be compared (whether they may be considered identical at a given significance level). The absolute difference in probabilities at each 'point' of the domain of these distributions is then calculated. This measure is transformed into a function of cumulative distribution functions and compared to critical values. A table of critical values was constructed from simulations. The approach was compared with other techniques for the univariate case; it differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are also strong sides of the method. The main advantage of this approach is the possibility of extending it to the infinite-dimensional case, which was not possible in most previous works. At present, the extension to the two-dimensional case has been completed, allowing up to 5 parameters to be tested jointly.
Therefore, the derived technique is equivalent to classic tests in standard situations but offers more efficient alternatives in nonstandard problems and on large amounts of data.
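As a minimal numerical sketch of the density-difference measure described above, the snippet below integrates |φ₁ − φ₂| for two univariate normal densities; the grid size, integration range, and example parameters are the editor's own choices, not the paper's construction:

```python
import numpy as np

def npdf(x, m, s):
    """Normal probability density function."""
    return np.exp(-(x - m)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

def abs_density_difference(m1, s1, m2, s2, n=200001):
    """Integrate |φ₁ - φ₂| over a range covering both distributions.
    The measure is 0 for identical parameter sets and approaches 2 for
    essentially disjoint distributions, so a large value is evidence
    against the hypothesis that the two parameter sets coincide."""
    lo = min(m1 - 8 * s1, m2 - 8 * s2)
    hi = max(m1 + 8 * s1, m2 + 8 * s2)
    x = np.linspace(lo, hi, n)
    d = np.abs(npdf(x, m1, s1) - npdf(x, m2, s2))
    return float(np.sum(d) * (x[1] - x[0]))   # simple Riemann sum

print(abs_density_difference(0, 1, 0, 1))   # 0.0: identical distributions
print(abs_density_difference(0, 1, 1, 1))   # ≈ 0.766: means one σ apart
```

In the paper's test this raw measure is transformed into a function of cumulative distribution functions and compared against simulated critical values; the sketch stops at the measure itself.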

Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function

Procedia PDF Downloads 148
2504 Design, Numerical Simulation, Fabrication and Physical Experimentation of the Tesla’s Cohesion Type Bladeless Turbine

Authors: M. Sivaramakrishnaiah, D. S. Nasan, P. V. Subhanjeneyulu, J. A. Sandeep Kumar, N. Sreenivasulu, B. V. Amarnath Reddy, B. Veeralingam

Abstract:

The design, numerical simulation, fabrication, and physical experimentation of a Tesla bladeless centripetal turbine for generating electrical power are presented in this paper. Pressurized air combined with water via a nozzle system is made to pass tangentially through a set of parallel smooth discs, imparting rotational motion to the discs fastened to a common shaft for power generation. The power generated depends upon the speed of the fluid leaving the nozzle inlet. Owing to laminar boundary-layer phenomena at the smooth disc surface, the high-speed fluid layers away from the surface, moving against the low-speed layers nearer to it, develop a tangential drag through viscous shear forces. This shear drags the near-surface layers along with the faster layers, causing the disc to spin. SolidWorks design software, together with fluid mechanics and machine element design theory, was used to compute the mechanical design specifications of the turbine parts, such as the 48 mm diameter discs, common shaft, central exhaust, plenum chamber, and swappable nozzle inlets. ANSYS CFX 2018 was used for the numerical simulation of the physical phenomena encountered in the turbine's operation. When the numerical simulation and physical experimental results were compared, good agreement was found between them, both quantitatively and qualitatively. The source of the input and the size of the discs may affect the power generated and the turbine efficiency, respectively, and the results may change if the fluid flowing between the discs changes. Studies of inlet fluid pressure versus turbine efficiency and of the number of discs versus turbine power, based on both sets of results, were carried out to develop relationships between the inlet and outlet parameters of the turbine. The present work obtained turbine efficiencies in the range of 7-10%; for this range, the electrical power output was 50-60 W.
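The viscous-drag mechanism the abstract describes can be put in rough numbers with Newton's law of viscosity, τ = μ du/dy. Every value below is an illustrative assumption by the editor (only the 48 mm disc diameter comes from the abstract), so this is an order-of-magnitude sketch, not the paper's data:

```python
import math

# Order-of-magnitude sketch of viscous drag on one Tesla-turbine disc face.
mu = 1.8e-5            # dynamic viscosity of air, Pa·s (standard value)
gap = 0.5e-3           # assumed half-gap between adjacent discs, m
v_rel = 30.0           # assumed fluid speed relative to the disc, m/s
r_out, r_in = 0.024, 0.006   # outer/inner disc radii, m (48 mm discs)

tau = mu * v_rel / gap                  # Newtonian shear stress τ = μ du/dy
area = math.pi * (r_out**2 - r_in**2)   # one wetted annular face
force = tau * area                      # tangential drag force
torque = force * (r_out + r_in) / 2     # taken to act at the mean radius
print(f"shear ≈ {tau:.2f} Pa, torque per face ≈ {torque:.2e} N·m")
```

Multiplying by the number of wetted faces and the shaft speed gives a crude power estimate; the small single-face torque is consistent with the modest 50-60 W output the study reports.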

Keywords: tesla turbine, cohesion type bladeless turbine, boundary layer theory, tangential fluid flow, viscous and adhesive forces, plenum chamber, pico hydro systems

Procedia PDF Downloads 61
2503 Frequency Interpretation of a Wave Function, and a Vertical Waveform Treated as a 'Quantum Leap'

Authors: Anthony Coogan

Abstract:

Born's probability interpretation of wave functions would have led to nearly identical results had he chosen a frequency interpretation instead. Logically, Born may have assumed that only one electron was under consideration, making it nonsensical to propose a frequency wave. The author's suggestion is that the actual experimental results were not of a single electron; rather, they were groups of reflected x-ray photons. The vertical waveform used by Schrödinger in his particle-in-a-box theory makes sense if it was intended to represent a quantum leap. The author extended the single vertical panel to form a bar chart: separate panels would represent different energy levels, and the proposed bar chart would be populated by reflected photons. Expansion of the basic ideas: part of Schrödinger's particle-in-a-box theory may be valid despite negative criticism. The waveform used in the diagram is vertical, which may seem absurd because real waves decay at a measurable rate rather than instantaneously. However, there may be one notable exception. Supposedly, the Uncertainty Principle was derived from the theory; may a quantum leap not be represented as an instantaneous waveform? Schrödinger must have had some reason to suggest a vertical waveform if the prevalent belief was that such waveforms did not exist. Complex waveforms representing a particle are usually assumed to be continuous. The actual observations were x-ray photons, some of which had struck an electron, been reflected, and then moved toward a detector. From Born's perspective, doing similar work in the years in question, 1926-7, he would also have considered a single electron, leading him to choose a probability distribution. Probability distributions appear very similar to frequency distributions, but the former are considered to represent the likelihood of future events.
Born's interpretation of the results of quantum experiments led (or perhaps misled) many researchers into claiming that humans can influence events just by looking at them, e.g., collapsing complex wave functions by 'looking at the electron to see which slit it emerged from,' when in reality light reflected from the electron moved in the observer's direction after the electron had moved away. Astronomers may say that they 'look out into the universe,' but this is logic opposed to the views of Newton, Hooke, and observers such as Rømer: light carries information from a source or reflector to an observer, rather than the reverse. Conclusion: Given the controversial nature of these ideas, especially their implications for the complex numbers used in applications in science and engineering, some time may pass before any consensus is reached.

Keywords: complex wave functions not necessary, frequency distributions instead of wave functions, information carried by light, sketch graph of uncertainty principle

Procedia PDF Downloads 174
2502 Research on Evaluation of Renewable Energy Technology Innovation Strategy Based on PMC Index Model

Authors: Xue Wang, Liwei Fan

Abstract:

Renewable energy technology innovation is an important way to realize the energy transformation. The Chinese government has issued a series of policies to guide and support the development of renewable energy, and their implementation affects the further development, utilization, and technological innovation of renewable energy. In this context, it is of great significance to systematically review and evaluate renewable energy technology innovation policy in order to improve the existing policy system. Taking as a sample the 190 renewable energy technology innovation policies issued during 2005-2021, this study uses text mining and content analysis, from the perspectives of issuing departments and policy keywords, to analyze the current state of the policies, and applies semantic network analysis to identify the core issuing departments and core policy topic words. A PMC (Policy Modeling Consistency) index model is built to quantitatively evaluate the selected policies: each policy's PMC index captures its overall strengths and weaknesses, while the model's secondary indices reflect the performance of each policy dimension for the policies issued by the core departments and related to the core topic words. The results show that renewable energy technology innovation policies emphasize synergy between multiple departments, while the distribution of issuers is uneven over time; policies on different topics have their own emphases in terms of policy types, fields, functions, and support measures, but room for improvement remains, such as the lack of policy forecasting and supervision functions, insufficient attention to product promotion, and relatively uniform support measures.
Finally, this research puts forward policy optimization suggestions: promoting joint policy release, strengthening policy coherence and timeliness, enhancing the comprehensiveness of policy functions, and enriching incentive measures for renewable energy technology innovation.
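The PMC index construction can be sketched in a few lines: each secondary variable is scored 0/1, each primary dimension is the mean of its secondary variables, and the index is the sum of the dimension scores. The dimension names and scores below are illustrative assumptions, not values from this paper:

```python
# Minimal sketch of a PMC (Policy Modeling Consistency) index calculation.
# All dimension names and 0/1 scores are hypothetical examples.
policy = {
    "policy nature":      [1, 1, 0, 1],   # secondary variables, scored 0/1
    "policy timeliness":  [1, 0, 1],
    "policy function":    [1, 1, 1, 0, 0],
    "incentive measures": [0, 1, 0],
}

# Each primary dimension scores as the mean of its secondary variables.
dimension_scores = {k: sum(v) / len(v) for k, v in policy.items()}
# The PMC index is the sum of the dimension scores (max = #dimensions).
pmc_index = sum(dimension_scores.values())
print(dimension_scores)
print(f"PMC index = {pmc_index:.2f} (max = {len(policy)})")
```

Ranking policies by their PMC indices, and inspecting the per-dimension scores, is what lets the study point to specific weaknesses such as missing forecasting and supervision functions.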

Keywords: renewable energy technology innovation, content analysis, policy evaluation, PMC index model

Procedia PDF Downloads 41
2501 Development of Earthquake and Typhoon Loss Models for Japan, Specifically Designed for Underwriting and Enterprise Risk Management Cycles

Authors: Nozar Kishi, Babak Kamrani, Filmon Habte

Abstract:

Natural hazards such as earthquakes and tropical storms are frequent and highly destructive in Japan. On average, Japan experiences every year more than 10 tropical cyclones that come within damaging reach, as well as earthquakes of moment magnitude 6 or greater. We have developed stochastic catastrophe models to address the risk associated with the entire suite of damaging events in Japan, for use by insurance and reinsurance companies, NGOs, and governmental institutions. KCC's (Karen Clark and Company) catastrophe models consist of four modular segments: 1) stochastic event sets that represent the statistics of past events; 2) hazard attenuation functions that model the local intensity; 3) vulnerability functions that address the repair need for local buildings exposed to the hazard; and 4) a financial module addressing policy conditions that estimates the resulting losses. The events module comprises events (faults or tracks) of different intensities with corresponding probabilities, based on the same statistics as observed in the historical catalog. The hazard module delivers the hazard intensity (ground motion or wind speed) at the location of each building. The vulnerability module provides a library of damage functions that relate the hazard intensity to repair need as a percentage of the replacement value. The financial module reports the expected loss, given the payoff policies and regulations. We have divided Japan into regions with similar typhoon climatology, and into earthquake micro-zones, within each of which the characteristics of events are similar enough for stochastic modeling. For each region, a set of stochastic events is then developed that yields events with intensities corresponding to the annual occurrence probabilities of interest to financial communities, such as 0.01, 0.004, etc.
The intensities corresponding to these probabilities (called CEs, Characteristic Events) are selected through a super-stratified sampling approach based on the primary uncertainty. Region-specific hazard intensity attenuation functions, followed by vulnerability models, lead to the estimation of repair costs. An extensive economic exposure model addresses all local construction and occupancy types, such as post-and-lintel Shinkabe and Okabe wood construction, as well as concrete confined in steel (SRC, Steel-Reinforced Concrete) and high-rise buildings.
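The four-module chain (events → hazard → vulnerability → financial) can be sketched as a toy pipeline. Every function and number below is a placeholder assumption by the editor, not KCC's actual model:

```python
import random

# Toy sketch of the four-module catastrophe-model pipeline described above.
random.seed(0)

# 1) Stochastic event set: magnitude and distance to the insured site.
events = [{"magnitude": random.uniform(6.0, 8.0),
           "dist_km": random.uniform(5, 100)} for _ in range(1000)]

def hazard(ev):
    """2) Attenuation: hazard intensity at the site (assumed form)."""
    return ev["magnitude"] / (1.0 + 0.05 * ev["dist_km"])

def vulnerability(intensity):
    """3) Damage function: repair need as a fraction of replacement value."""
    return min(1.0, max(0.0, (intensity - 1.0) / 6.0))

def financial(damage_ratio, value=1e6, deductible=5e4):
    """4) Policy terms: insured loss after the deductible."""
    return max(0.0, damage_ratio * value - deductible)

losses = [financial(vulnerability(hazard(ev))) for ev in events]
print(f"mean loss over the event set: {sum(losses) / len(losses):,.0f}")
```

Sorting `losses` and reading off quantiles is how annual occurrence probabilities such as 0.01 and 0.004 translate into the Characteristic Event intensities mentioned above.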

Keywords: typhoon, earthquake, Japan, catastrophe modelling, stochastic modeling, stratified sampling, loss model, ERM

Procedia PDF Downloads 239
2500 On the Grid Technique by Approximating the Derivatives of the Solution of the Dirichlet Problems for (1+1) Dimensional Linear Schrödinger Equation

Authors: Lawrence A. Farinola

Abstract:

Four-point implicit schemes are constructed for approximating the first and pure second order derivatives, with respect to the time variable t, of the solution of the Dirichlet problem for the one-dimensional Schrödinger equation. Special four-point implicit difference boundary value problems are also proposed for the first and pure second derivatives of the solution with respect to the spatial variable x, and the grid method is further applied to the mixed second derivative of the solution of the linear time-dependent Schrödinger equation. It is assumed that the initial function belongs to the Hölder space C⁸⁺ᵃ, 0 < α < 1, that the Schrödinger wave function given in the equation is from the Hölder space Cₓ,ₜ⁶⁺ᵃ, ³⁺ᵃ/², that the boundary functions are from C⁴⁺ᵃ, and that between the initial and boundary functions the conjugation conditions of orders q = 0, 1, 2, 3, 4 are satisfied. It is proven that the solutions of the proposed difference schemes converge uniformly on the grids at the rate O(h² + k), where h is the step size in x and k is the step size in time. Numerical experiments are presented to support the analysis.
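For readers unfamiliar with implicit difference schemes for the Schrödinger equation, a standard Crank-Nicolson step for the free 1D equation (in units ħ = 2m = 1, with Dirichlet boundaries) is sketched below. This is a generic illustration of the scheme family, not the paper's four-point construction; grid sizes are arbitrary choices:

```python
import numpy as np

# Crank-Nicolson step for i ψ_t = -ψ_xx on a Dirichlet grid.
N, h, k = 200, 0.05, 0.002
x = np.arange(N) * h
psi = np.exp(-(x - 5)**2 + 1j * 2 * x)     # Gaussian wave packet
psi[0] = psi[-1] = 0                        # Dirichlet boundary values

# Discrete Laplacian (second-difference matrix, Dirichlet boundaries).
lap = (np.diag(-2 * np.ones(N)) + np.diag(np.ones(N - 1), 1)
       + np.diag(np.ones(N - 1), -1)) / h**2
A = np.eye(N) - 0.5j * k * lap              # implicit half of the step
B = np.eye(N) + 0.5j * k * lap              # explicit half of the step

for _ in range(50):
    psi = np.linalg.solve(A, B @ psi)       # unconditionally stable step

print(np.sum(np.abs(psi)**2) * h)           # discrete norm ≈ 1.2533, conserved
```

Because A and B form a Cayley pair around a skew-Hermitian matrix, the update is unitary, so the discrete L² norm is conserved to machine precision; accuracy then comes down to the truncation order, which is the quantity the paper's O(h² + k) estimate controls for its schemes.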

Keywords: approximation of derivatives, finite difference method, Schrödinger equation, uniform error

Procedia PDF Downloads 102
2499 Sociolinguistic and Classroom Functions of Using Code-Switching in CLIL Context

Authors: Khatuna Buskivadze

Abstract:

The aim of the present study is to investigate the sociolinguistic and classroom functions, and the frequency, of teacher code-switching (CS) in the Content and Language Integrated Learning (CLIL) lesson. As Georgian society strives to become part of the European world, the English language itself plays a role in forming new generations with European values. Based on our research conducted in 2019, out of all 114 private schools in Tbilisi, full CLIL programs are taught in 7 schools, while only some subjects are taught using CLIL in 3 schools. The goal of that earlier research was to define the features of CLIL methodology in the teaching of English, using the example of Georgian private high schools. Taking Georgian realities and cultural features into account, a modified version of the questionnaire based on the classification of CS use in the ESL classroom proposed by Ferguson (2009) was used. The qualitative research revealed students' and the teacher's attitudes towards teacher code-switching in the CLIL lesson. Both qualitative and quantitative methods were employed: observation of the teacher's lessons (recordings of the teacher's online lessons), an interview, and a questionnaire administered to the Math teacher's 20 high school students. We came to several conclusions, some of which are given here. The Math teacher's CS behavior mostly serves (1) the conversational function of interjection, and (2) the classroom functions of introducing unfamiliar materials and topics, explaining difficult concepts, and maintaining classroom discipline and the structure of the lesson. The teacher and 13 students hold negative attitudes towards using only Georgian in teaching Math. The higher the students' level of English, the more negative their attitude towards using Georgian in the classroom. Although all the students are Georgian, their competence in English is higher than in Georgian; they therefore consider English an inseparable part of their identities.
The overall results of this case study of teaching Math (educational discourse) in one of the private schools in Tbilisi will be presented at the conference.

Keywords: attitudes, bilingualism, code-switching, CLIL, conversation analysis, interactional sociolinguistics

Procedia PDF Downloads 134
2498 Design and Development of Data Visualization in 2D and 3D Space Using Front-End Technologies

Authors: Sourabh Yaduvanshi, Varsha Namdeo, Namrata Yaduvanshi

Abstract:

This study examines the design and development of detailed 2D bar charts with d3.js, recognizing d3's limitation in generating 3D visuals within the DOM. It then combines three.js with d3.js, enabling a smooth evolution from 2D charts to immersive 3D representations and illustrating the synergy between front-end technologies in data visualization. Beyond the technical details, the work documents the methodology for integrating the two libraries and offers guidance for practitioners who want to move beyond 2D constraints and carry data visualization into interactive three-dimensional scenes.

Keywords: design, development, front-end technologies, visualization

Procedia PDF Downloads 47
2497 Simultaneous Determination of Methotrexate and Aspirin Using Fourier Transform Convolution Emission Data under Non-Parametric Linear Regression Method

Authors: Marwa A. A. Ragab, Hadir M. Maher, Eman I. El-Kimary

Abstract:

Co-administration of methotrexate (MTX) and aspirin (ASP) can cause a pharmacokinetic interaction and a subsequent increase in blood MTX concentrations, which may increase the risk of MTX toxicity. Therefore, it is important to develop a sensitive, selective, accurate and precise method for their simultaneous determination in urine. A new hybrid chemometric method has been applied to the emission response data of the two drugs. A spectrofluorimetric method for the determination of MTX through measurement of its acid-degradation product, 4-amino-4-deoxy-10-methylpteroic acid (4-AMP), was developed. Moreover, the acid-catalyzed degradation reaction enables the spectrofluorimetric determination of ASP through the formation of its active metabolite, salicylic acid (SA). The proposed chemometric method deals with convolution of the emission data using 8-point sin(xi) polynomials (discrete Fourier functions) after derivative treatment of these emission data. The first and second derivative curves (D1 and D2) were obtained first; then convolution of these curves was performed to obtain the first and second derivative under Fourier functions curves (D1/FF) and (D2/FF). This new application was used for the resolution of the overlapped emission bands of the degradation products of both drugs to allow their simultaneous indirect determination in human urine. Not only was this chemometric approach applied to the emission data, but the obtained data were also subjected to non-parametric linear regression analysis (Theil's method). The proposed method was fully validated according to the ICH guidelines, and it yielded linearity ranges of 0.05-0.75 and 0.5-2.5 µg mL-1 for MTX and ASP, respectively. It was found that the non-parametric method was superior to the parametric one in the simultaneous determination of MTX and ASP after the chemometric treatment of the emission spectra of their degradation products.
The work combines the advantages of derivative and convolution using discrete Fourier function together with the reliability and efficacy of the non-parametric analysis of data. The achieved sensitivity along with the low values of LOD (0.01 and 0.06 µg mL-1) and LOQ (0.04 and 0.2 µg mL-1) for MTX and ASP respectively, by the second derivative under Fourier functions (D2/FF) were promising and guarantee its application for monitoring the two drugs in patients’ urine samples.
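As a rough illustration of the data treatment described above, the sketch below builds a first-derivative curve, convolves it with 8-point sine (discrete Fourier) coefficients, and fits a calibration line with Theil's non-parametric estimator. The spectrum and calibration values are hypothetical stand-ins, not the study's data; the exact Fourier-function coefficients used by the authors may differ.

```python
import numpy as np
from scipy.stats import theilslopes

# Hypothetical emission spectrum: intensity sampled at evenly spaced wavelengths.
wavelengths = np.linspace(300.0, 500.0, 201)
intensity = np.exp(-((wavelengths - 400.0) / 25.0) ** 2)  # stand-in emission band

# First-derivative curve (D1) of the emission data.
d1 = np.gradient(intensity, wavelengths)

# Convolution with 8-point sine (discrete Fourier) coefficients, analogous in
# spirit to the D1/FF treatment; the coefficient choice here is an assumption.
n = 8
sin_coeffs = np.sin(np.pi * np.arange(1, n + 1) / (n + 1))
sin_coeffs /= sin_coeffs.sum()          # normalize so the filter preserves scale
d1_ff = np.convolve(d1, sin_coeffs, mode="same")

# Theil's non-parametric regression of peak response vs. concentration
# (hypothetical calibration points within the stated MTX range).
conc = np.array([0.05, 0.25, 0.45, 0.60, 0.75])      # ug/mL
response = np.array([0.11, 0.52, 0.93, 1.22, 1.49])  # hypothetical D1/FF amplitudes
slope, intercept, lo, hi = theilslopes(response, conc)
print(f"Theil slope = {slope:.3f}, intercept = {intercept:.3f}")
```

Theil's estimator takes the median of all pairwise slopes, which makes the calibration line robust to a single outlying point, matching the reliability argument made above.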

Keywords: chemometrics, emission curves, derivative, convolution, Fourier transform, human urine, non-parametric regression, Theil’s method

Procedia PDF Downloads 406
2496 Optimal Design of Propellant Grain Shape Based on Structural Strength Analysis

Authors: Chen Xiong, Tong Xin, Li Hao, Xu Jin-Sheng

Abstract:

Experimental and simulation research on the structural integrity of propellant grain in a solid rocket motor (SRM) with high volumetric fraction was conducted. First, using SRM parametric modeling functions with Python, the secondary development tool of ABAQUS, three-dimensional parameterized modeling programs for star-shaped, wheel-shaped, and wing-cylindrical grains were developed. Then, the mechanical properties of the star-shaped grain under different loads were obtained with the automatically established finite element model in ABAQUS. Next, several optimization algorithms were introduced to optimize the star-shaped, wheel-shaped, and wing-cylindrical grains. After meeting the demands on burning surface evolution and volumetric fraction, the optimum three-dimensional grain shapes were obtained. Finally, by means of the parametric modeling functions, pressure data from the SRM's cold pressurization test was directly applied to the simulation of the grain's mechanical performance. The results verify the reliability and practicality of the parameterized modeling program for the SRM.
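The parametric-modeling idea can be sketched outside ABAQUS as a small function that generates a star-grain port boundary from a handful of design parameters. The geometry, dimensions, and parameter names below are illustrative assumptions, not the authors' models or the ABAQUS scripting API.

```python
import numpy as np

def star_port_points(n_points, r_inner, r_outer):
    """Trace an n-pointed star port boundary from three design parameters.

    Vertices alternate between the outer (star tip) and inner (valley) radii,
    evenly spaced in angle around the grain axis.
    """
    theta = np.linspace(0.0, 2.0 * np.pi, 2 * n_points, endpoint=False)
    radii = np.where(np.arange(2 * n_points) % 2 == 0, r_outer, r_inner)
    return radii * np.cos(theta), radii * np.sin(theta)

# Hypothetical 6-point star with inner radius 20 mm and outer radius 35 mm.
x, y = star_port_points(n_points=6, r_inner=20.0, r_outer=35.0)

# Closed-polygon port perimeter, a proxy for the initial burning surface length.
perimeter = np.sum(np.hypot(np.diff(np.append(x, x[0])),
                            np.diff(np.append(y, y[0]))))
print(f"vertices: {len(x)}, approximate port perimeter: {perimeter:.1f} mm")
```

Because the whole cross-section is derived from three numbers, an optimizer can vary them directly, which is the essence of the parameterized workflow described above.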

Keywords: cold pressurization test, parametric modeling, structural integrity, propellant grain, SRM

Procedia PDF Downloads 331
2495 Effectiveness of Computer-Based Cognitive Training in Improving Attention-Deficit/Hyperactivity Disorder Rehabilitation

Authors: Marjan Ghazisaeedi, Azadeh Bashiri

Abstract:

Background: Attention-Deficit/Hyperactivity Disorder (ADHD) is one of the most common psychiatric disorders in early childhood and, in addition to its main symptoms, produces significant deficits in educational, social, and interpersonal functioning. Considering the importance of rehabilitation in ADHD patients to control these problems, this study investigated the advantages of computer-based cognitive training in these patients. Methods: This review article was conducted by searching articles published since 2005 in scientific databases and e-journals, using keywords including computerized cognitive rehabilitation, computer-based training, and ADHD. Results: Since drugs have short-term effects and many side effects in the rehabilitation of ADHD patients, supplementary methods such as computer-based cognitive training are among the best solutions. This approach provides quick feedback and has no side effects, and it yields promising results in the cognitive rehabilitation of ADHD, especially for working memory and attention. Conclusion: Considering the different cognitive dysfunctions in ADHD patients, application of computerized cognitive training has the potential to improve cognitive functions and, consequently, social, academic, and behavioral performance in patients with this disorder.

Keywords: ADHD, computer-based cognitive training, cognitive functions, rehabilitation

Procedia PDF Downloads 253
2494 The Kidney-Spine Traffic System: Future Cities, Ensuring World Class Civic Amenities in Urban India

Authors: Abhishek Srivastava, Jeevesh Nandan, Manish Kumar

Abstract:

The study was undertaken to analyse an alternative traffic system for more effective and convenient traffic flow, reducing both the number of conflict points and the angle of conflict, with a view to minimizing unnecessarily long waiting times, delays, congestion, traffic jams, and geometric delays due to intersections between circular and straight lanes. It is a twin kidney-spine type structure with a special allowance for highway users to pass more quickly, yielding a reduction in the number and intensity of accidents, a significant reduction in traffic jams, and the conservation of valuable time.

Keywords: traffic system, collision reduction of vehicles, smooth flow of vehicles, traffic jam

Procedia PDF Downloads 397
2493 Avian Esophagus: A Comparative Microscopic Study In Birds With Different Feeding Habits

Authors: M. P. S. Tomar, Himanshu R. Joshi, P. Jagapathi Ramayya, Rakhi Vaish, A. B. Shrivastav

Abstract:

The morphology of an organ system varies according to feeding habit, habitat, and lifestyle. This phenomenon is called adaptation. During evolution, these morphological changes make the system species-specific, so studying their differential characteristics makes it easier to understand morpho-physiological adaptation. Hence, the present study was conducted on the esophagus of the pariah kite, median egret, goshawk, dove, and duck. The esophagus in all birds comprised four layers, viz. tunica mucosa, tunica submucosa, tunica muscularis, and tunica adventitia. The mucosa of the esophagus showed longitudinal folds; thus, the lumen was irregular. The epithelium was stratified squamous in all birds, but in the median egret the cells were large and vacuolated. Among these species, a very thick epithelium was observed in the goshawk and duck, while keratinization was highest in the dove. The stratum spongiosum was 7-8 layers thick in both the pariah kite and the goshawk. In all birds, the glands were of the alveolar, mucus-secreting type. In the median egret and pariah kite, they were round or oval in shape, with or without a lumen depending upon the functional status, whereas in the goshawk the shape of the glands varied from spherical or oval to triangular, with openings towards the lumen according to functional activity, and in the dove these glands were oval in shape. The glands were numerous in the egret, one or two per fold in the goshawk, and less numerous in the other three species. The core of the mucosal folds was occupied by the lamina propria and showed a large number of collagen fibers and cellular infiltration in the pariah kite, egret, and dove, whereas in the goshawk and duck, collagen and reticular fibers were fewer and cellular infiltration was lesser. The lamina muscularis was very thick in all species and comprised longitudinally arranged smooth muscle fibers; in the median egret, it was in a wavy pattern. The tunica submucosa was very thin in all species.
The tunica muscularis mostly comprised circular smooth muscle bundles in all species, while the longitudinal bundles were very few in number and not continuous. The tunica adventitia comprised loose connective tissue containing collagen and elastic fibers, with numerous small blood vessels, in all species. Further, it was observed that the structure of the esophagus in birds varies according to their feeding habits.

Keywords: dove, duck, egret, esophagus, goshawk, kite

Procedia PDF Downloads 411
2492 Radiographic Evaluation of Odontogenic Keratocyst: A 14 Years Retrospective Study

Authors: Nor Hidayah Reduwan, Jira Chindasombatjaroen, Suchaya Pornprasersuk-Damrongsri, Sopee Pomsawat

Abstract:

INTRODUCTION: The odontogenic keratocyst (OKC) remains a controversial pathologic entity under the scrutiny of many researchers and maxillofacial surgeons alike. The high recurrence rate and relatively aggressive nature of this lesion demand a meticulous analysis of the radiographic characteristics of OKC, leading to the formulation of an accurate diagnosis. OBJECTIVE: This study aims to determine the radiographic characteristics of the odontogenic keratocyst (OKC) using conventional radiographs and cone beam computed tomography (CBCT) images. MATERIALS AND METHODS: Patients histopathologically diagnosed with OKC from 2003 to 2016 by the Oral and Maxillofacial Pathology Department were retrospectively reviewed. Radiographs of these cases were retrieved from the archives of the Department of Oral and Maxillofacial Radiology, Faculty of Dentistry, Mahidol University. The location, shape, border, cortication, locularity, relationship of the lesion to an embedded tooth, displacement of adjacent teeth, root resorption, and bony expansion of the lesion were assessed. RESULTS: Radiographs of 91 patients (44 males, 47 females) with a mean age of 31 years (range 10 to 84 years) were analyzed. Among all patients, 5 were syndromic. Hence, a total of 103 OKCs were studied. The most common location was the ramus of the mandible (32%), followed by the posterior maxilla (29%). Most cases presented as a well-defined unilocular radiolucency with a smooth and corticated border. The lesion was associated with an embedded tooth in 48 lesions (47%). Eighty-five percent of the embedded teeth were impacted third molars. Thirty-seven percent of the embedded teeth were entirely encapsulated in the lesion. The lesion attached to the embedded tooth at the cementoenamel junction (CEJ) in 40% of cases and extended to part of the root in 23% of cases. Tooth displacement and root resorption were found in 29% and 6% of cases, respectively. Bony expansion in the bucco-lingual dimension was seen in 63% of cases.
CONCLUSION: OKCs were predominant in the posterior region of the mandible, with radiographic features of a well-defined, unilocular radiolucency with a smooth and corticated margin. The lesions might relate to an embedded tooth by surrounding the entire tooth, attaching at the CEJ level, or extending to part of the root. Bony expansion could be found, but tooth displacement and root resorption were not common. These features might help in formulating the differential diagnosis.

Keywords: cone beam computed tomography, imaging dentistry, odontogenic keratocyst, radiographic features

Procedia PDF Downloads 110
2491 Optimizing Mechanical Behavior of Middle Ear Prosthesis Using Finite Element Method with Material Degradation Functionally Graded Materials in Three Functions

Authors: Khatir Omar, Fekih Sidi Mohamed, Sahli Abderahmene, Benkhettou Abdelkader, Boudjemaa Ismail

Abstract:

Advancements in technology have revolutionized healthcare, with notable impacts on auditory health. This study introduces an approach aimed at optimizing materials for middle ear prostheses to enhance auditory performance. We have developed a finite element (FE) model of the ear incorporating a pure titanium TORP prosthesis, validated against experimental data. Subsequently, we applied the Functionally Graded Materials (FGM) methodology, utilizing linear, exponential, and logarithmic degradation functions to modify prosthesis materials. Biocompatible materials suitable for auditory prostheses, including Stainless Steel, titanium, and Hydroxyapatite, were investigated. The findings indicate that combinations such as Stainless Steel with titanium and Hydroxyapatite offer improved outcomes compared to pure titanium and Hydroxyapatite ceramic in terms of both displacement and stress. Additionally, personalized prostheses tailored to individual patient needs are feasible, underscoring the potential for further advancements in auditory healthcare.
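A minimal sketch of the three degradation (grading) functions is given below, assuming the graded property is Young's modulus varying across a normalized prosthesis thickness; the moduli and the exact functional forms are illustrative assumptions, not the study's data.

```python
import numpy as np

# Approximate Young's moduli (GPa) of two constituents from the abstract;
# placeholder values for illustration, not the study's material data.
E_TI = 110.0   # titanium
E_HA = 10.0    # hydroxyapatite

def grade_linear(z):
    """Linear gradation across normalized thickness z in [0, 1]."""
    return E_TI + (E_HA - E_TI) * z

def grade_exponential(z):
    """Exponential gradation: E(z) = E_Ti * (E_HA / E_Ti)**z."""
    return E_TI * (E_HA / E_TI) ** z

def grade_logarithmic(z):
    """Logarithmic gradation, rescaled so E(0) = E_Ti and E(1) = E_HA."""
    return E_TI + (E_HA - E_TI) * np.log1p(z) / np.log(2.0)

z = np.linspace(0.0, 1.0, 5)
for grade in (grade_linear, grade_exponential, grade_logarithmic):
    print(grade.__name__, np.round(grade(z), 1))
```

All three curves share the same endpoints (pure titanium at one face, pure hydroxyapatite at the other) but distribute stiffness differently through the thickness, which is what lets the FE comparison isolate the effect of the grading law.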

Keywords: middle ear, prosthesis, ossicles, FGM, vibration analysis, finite-element method

Procedia PDF Downloads 37
2490 An Improved Method on Static Binary Analysis to Enhance the Context-Sensitive CFI

Authors: Qintao Shen, Lei Luo, Jun Ma, Jie Yu, Qingbo Wu, Yongqi Ma, Zhengji Liu

Abstract:

Control Flow Integrity (CFI) is one of the most promising techniques to defend against Code-Reuse Attacks (CRAs). Traditional CFI systems and recent context-sensitive CFI use coarse control flow graphs (CFGs) to analyze whether a control-flow hijack occurs, leaving vast space for attackers at indirect call-sites. Coarse CFGs make it difficult to decide which target to execute at indirect control-flow transfers, and actually weaken existing CFI systems. Extracting CFGs precisely and completely from binaries remains an unsolved problem. In this paper, we present an algorithm to obtain a more precise CFG from binaries. First, parameters are analyzed at indirect call-sites and functions. By comparing the number of parameters prepared before call-sites with the number consumed by functions, the targets of indirect calls are reduced. The control flow is then more constrained at indirect call-sites at runtime. Combined with context-sensitive CFI (CCFI), we implement our policy. Experimental results on some popular programs show that our approach is efficient. Further analysis shows that it can mitigate COOP and other advanced attacks.
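The parameter-count filtering step can be illustrated with a toy sketch: given hypothetical per-function parameter counts recovered by static analysis, an indirect call-site that prepares k arguments keeps only the functions consuming k parameters. The function names and counts are invented for illustration.

```python
# Hypothetical function signatures recovered from a binary:
# name -> number of parameters the function consumes.
functions = {
    "handle_read": 2,
    "handle_write": 3,
    "log_event": 1,
    "cleanup": 0,
}

def allowed_targets(args_prepared, functions):
    """Keep only functions whose consumed-parameter count matches the number
    of arguments prepared before the indirect call-site."""
    return sorted(name for name, n in functions.items() if n == args_prepared)

# An indirect call-site that sets up 3 arguments can only legally reach
# functions consuming 3 parameters, shrinking the coarse-CFG target set.
print(allowed_targets(3, functions))  # -> ['handle_write']
```

In the coarse CFG, all four functions would be valid targets of the call-site; the parameter comparison cuts the set to one, which is the precision gain the runtime policy then enforces.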

Keywords: context-sensitive, CFI, binary analysis, code reuse attack

Procedia PDF Downloads 290
2489 Prediction of Terrorist Activities in Nigeria using Bayesian Neural Network with Heterogeneous Transfer Functions

Authors: Tayo P. Ogundunmade, Adedayo A. Adepoju

Abstract:

Terrorist attacks in liberal democracies bring about several pessimistic results, for example, sabotaged public support for the governments they target, disturbance of the peace of a protected environment underwritten by the state, and a limitation on individuals' contributions to the advancement of the country, among others. Hence, seeking techniques to understand the different factors involved in terrorism, and how to deal with those factors in order to completely stop or reduce terrorist activities, is the topmost priority of the government in every country. This research aims to develop an efficient deep learning-based predictive model for the prediction of future terrorist activities in Nigeria, addressing the low prediction accuracy of existing solution methods. The proposed predictive AI-based model, as a counterterrorism tool, will be useful to governments and law enforcement agencies to protect the lives of individuals in society and to improve the quality of life in general. A Heterogeneous Bayesian Neural Network (HETBNN) model was derived with a Gaussian normal error distribution. Three primary transfer functions (HOTTFs) were used, as well as two derived transfer functions (HETTFs) arising from the convolution of the HOTTFs, namely: the Symmetric Saturated Linear transfer function (SATLINS), the Hyperbolic Tangent transfer function (TANH), the Hyperbolic Tangent Sigmoid transfer function (TANSIG), the Symmetric Saturated Linear and Hyperbolic Tangent transfer function (SATLINS-TANH), and the Symmetric Saturated Linear and Hyperbolic Tangent Sigmoid transfer function (SATLINS-TANSIG). Data on terrorist activities in Nigeria, gathered through questionnaires for the purpose of this study, were used. Mean Square Error (MSE), Mean Absolute Error (MAE), and Test Error were the forecast evaluation criteria. The results showed that the HETTFs performed better in terms of prediction, and the factors associated with terrorist activities in Nigeria were determined.
The proposed predictive deep learning-based model will be useful to governments and law enforcement agencies as an effective counterterrorism mechanism to understand the parameters of terrorism and to design strategies to deal with terrorism before an incident actually happens and potentially causes the loss of precious lives. The proposed predictive AI-based model will reduce the chances of terrorist activities and is particularly helpful for security agencies to predict future terrorist activities.
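The named transfer functions can be sketched as follows; the two derived HETTFs are modeled here as compositions of the primary functions, which is an assumption about how the combined functions are formed, not the paper's exact derivation.

```python
import numpy as np

def satlins(x):
    """Symmetric saturated linear transfer function: clips output to [-1, 1]."""
    return np.clip(x, -1.0, 1.0)

def tanh_tf(x):
    """Hyperbolic tangent transfer function."""
    return np.tanh(x)

def tansig(x):
    """Hyperbolic tangent sigmoid (MATLAB-style tansig), numerically
    equivalent to tanh but written in its sigmoid form."""
    return 2.0 / (1.0 + np.exp(-2.0 * x)) - 1.0

# Derived (heterogeneous) transfer functions, modeled here as compositions
# of the primary functions -- an assumption for illustration.
def satlins_tanh(x):
    return satlins(tanh_tf(x))

def satlins_tansig(x):
    return satlins(tansig(x))

x = np.linspace(-3.0, 3.0, 7)
for f in (satlins, tanh_tf, tansig, satlins_tanh, satlins_tansig):
    print(f.__name__, np.round(f(x), 3))
```

All five map activations into [-1, 1], so swapping them in and out of a Bayesian network's hidden layer changes only the shape of the saturation, which is what the heterogeneous comparison above exploits.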

Keywords: activation functions, Bayesian neural network, mean square error, test error, terrorism

Procedia PDF Downloads 139
2488 A Review on Higher-Order Spline Techniques for Solving Burgers Equation Using B-Spline Methods and Variation of B-Spline Techniques

Authors: Maryam Khazaei Pool, Lori Lewis

Abstract:

This is a summary of articles based on higher-order B-spline methods and variations of B-spline methods, such as the Quadratic B-spline Finite Element Method, the Exponential Cubic B-Spline Method, the Septic B-spline Technique, the Quintic B-spline Galerkin Method, and the B-spline Galerkin Method based on the Quadratic B-spline Galerkin Method (QBGM) and the Cubic B-spline Galerkin Method (CBGM). In this paper, we study B-spline methods and variations of B-spline techniques for finding a numerical solution to the Burgers' equation. A set of fundamental definitions, including the Burgers equation, spline functions, and B-spline functions, is provided. For each method, the main technique is discussed, as well as the discretization and stability analysis. A summary of the numerical results is provided, and the efficiency of each presented method is discussed. A general conclusion is provided, in which we compare the computational results of all the presented schemes and describe the effectiveness and advantages of these methods.
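As a small concrete example of the building block these schemes share, the sketch below interpolates a typical Burgers initial profile u(x, 0) = sin(pi x) with a cubic B-spline via SciPy and evaluates the spline and its spatial derivative (the ingredients of the nonlinear term u·u_x). It illustrates the basis only, not any one of the surveyed Galerkin or collocation schemes.

```python
import numpy as np
from scipy.interpolate import make_interp_spline

# Nodal values of a common Burgers initial condition on [0, 1].
x = np.linspace(0.0, 1.0, 21)
u0 = np.sin(np.pi * x)

# Cubic (k=3) B-spline interpolant of the initial profile.
spline = make_interp_spline(x, u0, k=3)

u_mid = float(spline(0.5))             # spline value at x = 0.5
u_x_mid = float(spline.derivative()(0.5))  # spatial derivative, used in u*u_x
print(f"u(0.5) ~ {u_mid:.4f}, u_x(0.5) ~ {u_x_mid:.4f}")
```

Because the spline and its derivatives are smooth closed-form piecewise polynomials, the Galerkin and collocation methods above can form the stiffness and nonlinear terms of the Burgers equation exactly on each element.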

Keywords: Burgers’ equation, Septic B-spline, modified cubic B-spline differential quadrature method, exponential cubic B-spline technique, B-spline Galerkin method, quintic B-spline Galerkin method

Procedia PDF Downloads 95
2487 Thermodynamic Approach of Lanthanide-Iron Double Oxides Formation

Authors: Vera Varazashvili, Murman Tsarakhov, Tamar Mirianashvili, Teimuraz Pavlenishvili, Tengiz Machaladze, Mzia Khundadze

Abstract:

The standard Gibbs energy of formation ΔGfor(298.15) of lanthanide-iron double oxides of the garnet-type crystal structure R3Fe5O12 (RIG, where R is a rare earth ion) from the initial oxides is evaluated. The calculation is based on data for the standard entropies S298.15 and standard enthalpies of formation ΔH298.15 of the compounds involved in the process of garnet synthesis. The Gibbs energy of formation is presented as a function of temperature, ΔGfor(T), over the range 300-1600 K. The necessary starting thermodynamic data were obtained from a calorimetric study of heat capacity as a function of temperature and by using a semi-empirical method for the calculation of ΔH298.15 of formation. The thermodynamic functions at standard temperature (enthalpy, entropy, and Gibbs energy) are recommended as reference data for technological evaluations. Through the isostructural series of rare earth-iron garnets, the correlation between thermodynamic properties and the characteristics of the lanthanide ions is elucidated.
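The evaluation of ΔGfor(T) = ΔH − T·ΔS can be sketched in a few lines, assuming (as a first approximation) temperature-independent ΔH and ΔS over the range; the numerical values below are placeholders, not the study's measured data.

```python
import numpy as np

# Placeholder standard enthalpy and entropy of formation of a garnet from
# its oxides -- hypothetical values for illustration only.
dH_298 = -85.0   # kJ/mol
dS_298 = -30.0   # J/(mol K)

def gibbs_energy(T):
    """DeltaG_for(T) = DeltaH - T * DeltaS, with DeltaH and DeltaS taken as
    temperature-independent (a first approximation over 300-1600 K)."""
    return dH_298 - T * dS_298 / 1000.0   # kJ/mol (entropy converted from J)

T = np.arange(300.0, 1601.0, 100.0)
for t, g in zip(T, gibbs_energy(T)):
    print(f"T = {t:6.0f} K   DeltaG = {g:8.2f} kJ/mol")
```

Tabulating ΔGfor over the temperature grid in this way is what makes the function usable as reference data: a negative value at a given T indicates the garnet is thermodynamically favored over its constituent oxides there.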

Keywords: calorimetry, entropy, enthalpy, heat capacity, Gibbs energy of formation, rare earth iron garnets

Procedia PDF Downloads 357
2486 Interpreting Privacy Harms from a Non-Economic Perspective

Authors: Christopher Muhawe, Masooda Bashir

Abstract:

With the increase in Internet and Communication Technology (ICT), the virtual world has become the new normal. At the same time, there is an unprecedented collection of massive amounts of data by both private and public entities. Unfortunately, this increase in data collection has gone in tandem with an increase in data misuse and data breaches. Regrettably, the majority of data breach and data misuse claims have been unsuccessful in United States courts for failure to prove direct injury to physical or economic interests. The requirement to express data privacy harms from an economic or physical stance negates the fact that not all data harms are physical or economic in nature. The challenge is compounded by the fact that data breach harms and risks do not attach immediately. This research uses a descriptive and normative approach to show that not all data harms can be expressed in economic or physical terms. Expressing privacy harms purely from an economic or physical harm perspective negates the fact that data insecurity may result in harms that run counter to the functions of privacy in our lives: the promotion of liberty, selfhood, autonomy, and human social relations, and the furtherance of the existence of a free society. No economic value can be placed on these functions of privacy. The proposed approach addresses data harms from a psychological and social perspective.

Keywords: data breach and misuse, economic harms, privacy harms, psychological harms

Procedia PDF Downloads 168
2485 A Future Urban Street Design in Baltimore, Maryland Based on a Hierarchy of Functional Needs and the Context of Autonomous Vehicles, Green Infrastructure, and Evolving Street Typologies

Authors: Samuel Quick

Abstract:

The purpose of this paper is to examine future urban street design in the context of developing technologies, evolving street typologies, and projected transportation trends. The goal was to envision a future urban street in the year 2060 that addresses the advent and implementation of autonomous vehicles, the promotion of new street typologies, and the projection of current transportation trends. Using a hierarchy of functional needs for urban streets, the future street was designed and evaluated based on the functions the street provides to the surrounding community. The site chosen for the future street design is an eight-block section of West North Avenue in the city of Baltimore, Maryland. Three different conceptual designs were initially completed and evaluated leading to a master plan for West North Avenue as well as street designs for connecting streets that represent different existing street types. Final designs were compared with the existing street design and evaluated with the adapted ‘Hierarchy of Needs’ theory. The review of the literature and the results from this paper indicate that urban streets will have to become increasingly multi-functional to meet the competing needs of the environment and community. Future streets will have to accommodate multimodal transit which will include mass transit, walking, and biking. Furthermore, a comprehensive implementation of green infrastructure within the urban street will provide access to nature for urban communities and essential stormwater management. With these developments, the future of an urban street will move closer to a greenway typology. Findings from this study indicate that urban street design will have to be policy-driven to promote and implement autonomous bus-rapid-transit in order to conserve street space for other functions. With this conservation of space, urban streets can then provide more functions to the surrounding community, taking a holistic approach to urban street design.

Keywords: autonomous vehicle, greenway, green infrastructure, multi-modality, street typology

Procedia PDF Downloads 154
2484 Green, Smooth and Easy Electrochemical Synthesis of N-Protected Indole Derivatives

Authors: Sarah Fahad Alajmi, Tamer Ezzat Youssef

Abstract:

Here, we report a simple method for the direct conversion of 6-Nitro-1H-indole into N-substituted indoles via electrochemical dehydrogenative reaction with halogenated reagents under strongly basic conditions through N–R bond formation. The N-protected indoles have been prepared under moderate and scalable electrolytic conditions. The conduct of the reactions was performed in a simple divided cell under constant current without oxidizing reagents or transition-metal catalysts. The synthesized products have been characterized via UV/Vis spectrophotometry, 1H-NMR, and FTIR spectroscopy. A possible reaction mechanism is discussed based on the N-protective products. This methodology could be applied to the synthesis of various biologically active N-substituted indole derivatives.

Keywords: green chemistry, 1H-indole, heteroaromatic, organic electrosynthesis

Procedia PDF Downloads 131
2483 Executive Functions Directly Associated with Severity of Perceived Pain above and beyond Depression in the Context of Medical Rehabilitation

Authors: O. Elkana, O Heyman, S. Hamdan, M. Franko, J. Vatine

Abstract:

Objective: To investigate whether a direct link exists between perceived pain (PP) and executive functions (EF), above and beyond the influence of depression symptoms, in the context of medical rehabilitation. Design: Cross-sectional study. Setting: Rehabilitation hospital. Participants: 125 medical records of hospitalized patients were screened against our inclusion criteria. Only 60 patients were found eligible and asked to participate; 19 declined for personal reasons. The 41 neurologically intact patients (mean age 46, SD 14.96) who participated in this study were in the sub-acute stage of recovery, fluent in Hebrew, with an intact upper limb (to neutralize influence on psychomotor performance), and without organic brain damage. Main Outcome Measures: EF were assessed using the Wisconsin Card Sorting Test (WCST) and the Stop-Signal Test (SST). PP was measured using three well-known pain questionnaires: the Pain Disability Index (PDI), the Short-Form McGill Pain Questionnaire (SF-MPQ), and the Pain Catastrophizing Scale (PCS). A perceived pain index (PPI) was calculated as the mean composite score of the three pain questionnaires. Depression symptoms were assessed using the Patient Health Questionnaire (PHQ-9). Results: The results indicate that, irrespective of the presence of depression symptoms, PP is directly correlated with response inhibition (SST partial correlation: r=0.5; p=0.001) and mental flexibility (WCST partial correlation: r=-0.37; p=0.021), suggesting decreased EF performance as PP severity increases. High correlations were found between the three pain measures: SF-MPQ with PDI (r=0.62, p<0.001), SF-MPQ with PCS (r=0.58, p<0.001), and PDI with PCS (r=0.38, p=0.016), and each questionnaire alone was also significantly associated with EF; thus, no specific questionnaire 'pulled' the results obtained by the general index (PPI).
Conclusion: Examining the direct association between PP and EF, beyond the contribution of depression symptoms, provides further clinical evidence suggesting that EF and PP share underlying mediating neuronal mechanisms. Clinically, the importance of assessing patients' EF abilities as well as PP severity during rehabilitation is underscored.

Keywords: depression, executive functions, mental-flexibility, neuropsychology, pain perception, perceived pain, response inhibition

Procedia PDF Downloads 222