Search results for: adaptive and non-adaptive spectral estimation
1047 Jamun Juice Extraction Using Commercial Enzymes and Optimization of the Treatment with the Help of Physicochemical, Nutritional and Sensory Properties
Authors: Payel Ghosh, Rama Chandra Pradhan, Sabyasachi Mishra
Abstract:
Jamun (Syzygium cumini L.) is one of the important indigenous minor fruits with high medicinal value. Jamun cultivation is unorganized, and a huge quantity of this fruit is lost every year. The perishable nature of the fruit makes its postharvest management more difficult. Due to the strong cell wall structure of pectin-protein bonds and hard seeds, juice extraction is difficult. Enzymatic treatment has been used commercially to improve juice quality and yield. The objective of the study was to optimize the best treatment method for juice extraction. Enzymes (pectinase and tannase) from different strains were used, and for each enzyme the best result was obtained using response surface methodology. Optimization was done on the basis of physicochemical properties, nutritional properties, sensory quality and cost estimation. According to the quality aspects, cost analysis and sensory evaluation, the optimal enzymatic treatment was obtained with pectinase from an Aspergillus aculeatus strain. The optimum condition for the treatment was 44 °C for 80 minutes at a concentration of 0.05% (w/w). At these conditions, a yield of 75% with turbidity of 32.21 NTU, clarity of 74.39 %T, polyphenol content of 115.31 mg GAE/g and protein content of 102.43 mg/g was obtained, with a significant difference in overall acceptability.
Keywords: enzymatic treatment, Jamun, optimization, physicochemical property, sensory analysis
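Response surface methodology, as used above, fits a second-order polynomial to the measured responses and locates the optimum on the fitted surface. As a rough illustration only (the factor coding and data below are invented for the sketch, not taken from the study), a two-factor quadratic surface can be fitted by ordinary least squares:

```python
import numpy as np

def fit_quadratic_rsm(X, y):
    """Least-squares fit of a two-factor second-order response surface:
    y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2.
    X is an (n, 2) array of coded factor levels (e.g., temperature, time)."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones(len(y)), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, np.asarray(y, float), rcond=None)
    return coef

def predict_response(coef, x1, x2):
    """Evaluate the fitted surface at a coded factor setting."""
    return coef @ np.array([1.0, x1, x2, x1**2, x2**2, x1 * x2])
```

In a real RSM study the optimum is then read off the fitted surface (e.g., by a grid search or by setting the gradient of the quadratic to zero) and validated experimentally.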
Procedia PDF Downloads 296
1046 O-LEACH: The Problem of Orphan Nodes in the LEACH Routing Protocol for Wireless Sensor Networks
Authors: Wassim Jerbi, Abderrahmen Guermazi, Hafedh Trabelsi
Abstract:
The optimum use of coverage in wireless sensor networks (WSNs) is very important. The LEACH protocol, Low Energy Adaptive Clustering Hierarchy, presents a hierarchical clustering algorithm for wireless sensor networks. LEACH is a protocol that allows the formation of distributed clusters. In each cluster, LEACH randomly selects some sensor nodes called cluster heads (CHs). The selection of CHs is made with a probabilistic calculation. It is supposed that each non-CH node joins a cluster and becomes a cluster member. Nevertheless, some CHs can be concentrated in a specific part of the network, so that several sensor nodes cannot reach any CH. To solve this problem, we created the O-LEACH (Orphan LEACH) protocol, whose role is to reduce the number of sensor nodes that do not belong to any cluster. A cluster member called a gateway receives messages from neighboring orphan nodes. The gateway informs the CH of the neighboring nodes that do not belong to any group; the gateway (CH') then attaches the orphan nodes to the cluster and collects their data. O-LEACH enables a new method of cluster formation and leads to a long network lifetime and minimal energy consumption. Orphan nodes possess enough energy and seek to be covered by the network. The principal novel contribution of the proposed work is the O-LEACH protocol, which provides coverage of the whole network with a minimum number of orphan nodes and a very high connectivity rate. As a result, the WSN application receives data from the entire network, including orphan nodes. The proper functioning of the application therefore requires intelligent management of the resources present within each network sensor. The simulation results show that O-LEACH performs better than LEACH in terms of coverage, connectivity rate, energy and scalability.
Keywords: WSNs, routing, LEACH, O-LEACH, orphan nodes, sub-cluster, gateway, CH'
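The probabilistic CH selection that LEACH (and O-LEACH on top of it) relies on is conventionally written as the threshold T(n) = p / (1 - p * (r mod 1/p)), where p is the desired fraction of cluster heads and r is the current round; each node that has not served as CH in the current cycle becomes one when a uniform random draw falls below T(n). A minimal sketch of this standard election rule (parameter values are illustrative, not from the paper):

```python
import random

def leach_threshold(p, r):
    """LEACH cluster-head threshold T(n) = p / (1 - p*(r mod 1/p))
    for round r and desired CH fraction p."""
    return p / (1.0 - p * (r % round(1.0 / p)))

def elect_cluster_heads(eligible_nodes, p, r, rng=random):
    """Each node still eligible in this cycle independently draws a uniform
    number and becomes a cluster head if the draw falls below T(n)."""
    t = leach_threshold(p, r)
    return [n for n in eligible_nodes if rng.random() < t]
```

Note that at round r = 1/p - 1 the threshold reaches 1, which guarantees every remaining eligible node serves as CH exactly once per cycle.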
Procedia PDF Downloads 371
1045 Modelling Home Appliances for Energy Management System: Comparison of Simulation Results with Measurements
Authors: Aulon Shabani, Denis Panxhi, Orion Zavalani
Abstract:
This paper presents the modelling and development of a simulator for residential electrical appliances. The simulator is developed in MATLAB, providing the possibility to analyze and simulate the energy consumption of frequently used home appliances in Albania. The modelling of devices considers the impact of different factors, notably occupant behavior and climatic conditions. Most devices are modeled as an electric circuit, and the electric energy consumption is estimated from the solutions of the governing differential equations. The provided models cover devices such as a dishwasher, oven, water heater, air conditioner, light bulbs, television, refrigerator and water pump. The proposed model allows us to simulate beforehand the energetic behavior of the largest-consumption home devices, to estimate peak consumption and help reduce it. Simulated home prototype results are compared to real measurements of a typical home. The results obtained from the simulator framework, compared to a typical household monitored using EmonTxV3, show the effectiveness of the proposed simulation. This conclusion will help future simulation of a large group of typical households for a better understanding of peak consumption.
Keywords: electrical appliances, energy management, modelling, peak estimation, simulation, smart home
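To make the circuit/differential-equation modelling approach concrete, a water heater reduces to a first-order thermal ODE, dT/dt = (P - UA*(T - T_amb))/C, driven by a thermostat. The sketch below is a generic illustration with assumed parameter values, not the paper's calibrated Albanian models:

```python
def simulate_water_heater(hours=24.0, dt=60.0, T_amb=20.0, T_set=60.0,
                          P_rated=2000.0, C=8.0e5, UA=2.0):
    """Forward-Euler integration of dT/dt = (P - UA*(T - T_amb)) / C with a
    simple 2 K hysteresis thermostat. C [J/K] is the tank's thermal capacity,
    UA [W/K] the heat-loss coefficient. Returns electrical energy used in kWh."""
    T = T_set
    heater_on = False
    energy_J = 0.0
    for _ in range(int(hours * 3600.0 / dt)):
        # thermostat: switch on 2 K below setpoint, off at setpoint
        if T < T_set - 2.0:
            heater_on = True
        elif T >= T_set:
            heater_on = False
        P = P_rated if heater_on else 0.0
        T += dt * (P - UA * (T - T_amb)) / C
        energy_J += P * dt
    return energy_J / 3.6e6  # J -> kWh
```

With these assumed parameters the standby losses are about UA*(T_set - T_amb) = 80 W, so a day of idle operation should cost on the order of 2 kWh, which the simulation reproduces.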
Procedia PDF Downloads 164
1044 Whole Coding Genome Inter-Clade Comparisons to Predict Global Cancer-Protecting Variants
Authors: Lamis Naddaf, Yuval Tabach
Abstract:
We identified missense genetic variants with the potential to enhance resistance against cancer. This field has not been widely explored, as researchers tend to investigate the mutations that cause diseases, in response to the suffering of patients, rather than the mutations that protect from them. In conjunction with the genomic revolution and the advances in genetic engineering and synthetic biology, identifying protective variants will increase the power of genotype-phenotype predictions and have significant implications for improved risk estimation, diagnostics, prognosis, and even personalized therapy and drug discovery. To approach our goal, we systematically investigated the sites of the coding genomes and selected the alleles that showed a correlation with the species' cancer resistance. Interestingly, we found several amino acids that are generally preferred (like proline) or avoided (like cysteine) by the resistant species. Furthermore, cancer resistance in mammals and reptiles is significantly predicted by the number of predicted protecting variants (PVs) a species has. Moreover, PV-enriched genes are enriched in pathways relevant to tumor suppression. For example, they are enriched in the Hedgehog signaling and silencing pathways, whose improper activation is associated with the most common form of cancer malignancy. We also showed that the PVs are mostly more abundant in healthy people than in cancer patients across different human populations.
Keywords: cancer resistance, protecting variant, naked mole rat, comparative genomics
Procedia PDF Downloads 111
1043 Estimation of Lungs Physiological Motion for Patient Undergoing External Lung Irradiation
Authors: Yousif Mohamed Y. Abdallah
Abstract:
This experimental study deals with the detection, measurement and analysis of periodic physiological organ motion during external beam radiotherapy, in order to improve the accuracy of radiation field placement and to reduce the exposure of healthy tissue during radiation treatments. The importance of this study is to detect the maximum path of mobile structures during radiotherapy delivery, to define the planning target volume (PTV) and irradiated volume during both the inspiration and expiration periods, and to verify the target volume. In addition, it highlights the importance of applying image-guided radiotherapy (IGRT) methods in the field of radiotherapy. The results showed that the body contour displacement was 3.17 ± 0.23 mm, the left lung displacement 2.56 ± 0.99 mm and the right lung displacement 2.42 ± 0.77 mm, which allows the radiation oncologist to take suitable countermeasures in case of significant errors. In addition, the use of the image registration technique for automatic position control can predict potential motion. The motion ranged between 2.13 mm and 12.2 mm (low and high). In conclusion, individualized assessment of tumor mobility can improve the accuracy of target area definition in patients undergoing stereotactic radiotherapy for stage I, II and III non-small cell lung cancer (NSCLC). Definition of the target volume based on a single CT scan with a margin of 10 mm is clearly inappropriate.
Keywords: respiratory motion, external beam radiotherapy, image processing, lung
Procedia PDF Downloads 536
1042 Factors Affecting the Profitability of Commercial Banks: An Empirical Study of Indian Banking Sector
Authors: Neeraj Gupta, Jitendra Mahakud
Abstract:
The banking system plays a major role in the Indian economy and is the payment gateway for most financial transactions. Banking has undergone a major transition that is still in progress. Banking reforms after the liberalization of 1991 led to the establishment of foreign banks in the country. The foreign banks are not listed on the Indian stock markets and have increased competition, capturing a significant share of the revenue from the public sector banks, which are still the major players in the Indian banking sector. The performance of the banking sector depends on internal (bank-specific) as well as external (market-specific and macroeconomic) factors. Profitability in the banking sector is affected by numerous factors, which can be internal or external. The present study examines the internal and external factors that are likely to affect the profitability of Indian banks. The sample consists of a panel dataset of 64 commercial banks in India, comprising 1088 observations over the years 1998 to 2016. The dynamic panel GMM estimator of Arellano and Bond has been used. The study revealed that the variables capital adequacy ratio, deposits, age, labour productivity, non-performing assets, inflation and concentration have a significant effect on the performance measures.
Keywords: banks in India, bank performance, bank productivity, banking management
Procedia PDF Downloads 272
1041 Production and Leftovers Usage Policies to Minimize Food Waste under Uncertain and Correlated Demand
Authors: Esma Birisci, Ronald McGarvey
Abstract:
One of the common problems in the food service industry is demand uncertainty. This research presents a multi-criteria optimization approach to identify the efficient frontier of points lying between the minimum-waste and minimum-shortfall solutions in an uncertain demand environment. It also addresses correlation across demands for items (e.g., hamburgers are often demanded with french fries). Reducing overproduction food waste (and its corresponding environmental impacts) and an aversion to shortfalls (leaving some customers hungry) need to be considered as two contradictory objectives in an all-you-care-to-eat food service operation. We identify optimal production adjustments relative to demand forecasts, demand thresholds for the utilization of leftovers, and percentages of demand to be satisfied by leftovers, considering two alternative metrics for overproduction waste: mass and greenhouse gas emissions. Demand uncertainty and demand correlations are addressed using a kernel density estimation approach. A statistical analysis of the changes in decision variable values across each of the efficient frontiers can then be performed to identify the key variables that could be modified to reduce the amount of wasted food with a minimal increase in shortfalls. We illustrate our approach with an application to empirical data from Campus Dining Services operations at the University of Missouri.
Keywords: environmental studies, food waste, production planning, uncertain and correlated demand
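The kernel density estimation step can be sketched as follows: sampling from a Gaussian KDE amounts to picking a random historical observation and perturbing it with kernel noise, which automatically preserves cross-item correlations such as the burgers-with-fries effect. This is an illustrative sketch (the bandwidth rule and data are assumptions, not the authors' implementation):

```python
import numpy as np

def kde_resample(demand_history, n_samples, rng=None):
    """Draw joint demand scenarios from a Gaussian kernel density estimate.
    demand_history: (n_obs, n_items) array of past demands. Each sample is a
    random historical row plus Gaussian kernel noise scaled by a Silverman-type
    bandwidth factor, so correlations across items are preserved."""
    rng = np.random.default_rng(rng)
    X = np.asarray(demand_history, float)
    n, d = X.shape
    h = n ** (-1.0 / (d + 4))                  # scalar bandwidth factor
    kernel_cov = np.cov(X, rowvar=False) * h**2
    idx = rng.integers(0, n, size=n_samples)
    noise = rng.multivariate_normal(np.zeros(d), kernel_cov, size=n_samples)
    return X[idx] + noise
```

Scenarios drawn this way can then be fed to the production-planning optimization to evaluate expected waste and shortfall for a candidate policy.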
Procedia PDF Downloads 372
1040 2106 kA/cm² Peak Tunneling Current Density in GaN-Based Resonant Tunneling Diode with an Intrinsic Oscillation Frequency of ~260GHz at Room Temperature
Authors: Fang Liu, JunShuai Xue, JiaJia Yao, GuanLin Wu, ZuMaoLi, XueYan Yang, HePeng Zhang, ZhiPeng Sun
Abstract:
Terahertz spectra have been in great demand for the last two decades for many photonic and electronic applications. The III-nitride resonant tunneling diode is one of the promising candidates for portable and compact THz sources. A room-temperature microwave oscillator based on a GaN/AlN resonant tunneling diode is reported in this work. The devices, grown by plasma-assisted molecular-beam epitaxy on free-standing c-plane GaN substrates, exhibit highly repeatable and robust negative differential resistance (NDR) characteristics at room temperature. To improve the interface quality in the active region of the RTD, indium surfactant-assisted growth is adopted to enhance the surface mobility of metal atoms on the growing film front. Thanks to the lowered valley current associated with the suppression of threading dislocation scattering on the low-dislocation GaN substrate, a record-high peak current density of 2.1 MA/cm² in conjunction with a peak-to-valley current ratio (PVCR) of 1.2 is obtained, which is the best result reported in nitride-based RTDs to date when the peak current density and PVCR values are considered simultaneously. When biased within the NDR region, microwave oscillations are measured with a fundamental frequency of 0.31 GHz, yielding an output power of 5.37 µW. Impedance mismatch limits the output power and oscillation frequency described above. The actual measured intrinsic capacitance is only 30 fF. Using a small-signal equivalent circuit model, the maximum intrinsic frequency of oscillation for these diodes is estimated to be ~260 GHz. This work demonstrates a microwave oscillator based on the resonant tunneling effect that can meet the demands of terahertz spectral devices and, more importantly, provides guidance for the fabrication of complex nitride terahertz and quantum effect devices.
Keywords: GaN resonant tunneling diode, peak current density, microwave oscillation, intrinsic capacitance
Procedia PDF Downloads 139
1039 The Underestimate of the Annual Maximum Rainfall Depths Due to Coarse Time Resolution Data
Authors: Renato Morbidelli, Carla Saltalippi, Alessia Flammini, Tommaso Picciafuoco, Corrado Corradini
Abstract:
A considerable part of the rainfall data used in hydrological practice is available in aggregated form over constant time intervals. This can produce undesirable effects, like the underestimate of the annual maximum rainfall depth, Hd, associated with a given duration, d, which is the basic quantity in the development of rainfall depth-duration-frequency relationships and in determining whether climate change is affecting extreme event intensities and frequencies. The errors in the evaluation of Hd from data characterized by a coarse temporal aggregation, ta, and a procedure to reduce the non-homogeneity of the Hd series are investigated here. Our results indicate that: 1) in the worst conditions, for d=ta, the estimation of a single Hd value can be affected by an underestimation error of up to 50%, while the average underestimation error for a series with at least 15-20 Hd values is less than or equal to 16.7%; 2) the underestimation error values follow an exponential probability density function; 3) every very long time series of Hd contains many underestimated values; 4) relationships between the non-dimensional ratio ta/d and the average underestimate of Hd, derived from continuous rainfall data observed at many stations in Central Italy, may overcome this issue; 5) these equations should allow improved Hd estimates and associated depth-duration-frequency curves, at least in areas with similar climatic conditions.
Keywords: central Italy, extreme events, rainfall data, underestimation errors
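The aggregation effect described above is easy to reproduce numerically: the annual maximum depth computed from clock-aligned, non-overlapping windows of width ta = d can never exceed the maximum over all sliding windows of duration d, because a true maximum burst generally straddles two fixed windows. A synthetic sketch (the rainfall series below is simulated, not the Central Italy data):

```python
import numpy as np

def annual_max_depth(rain, d):
    """True annual maximum depth over ANY window of d consecutive steps
    (sliding window), via cumulative sums."""
    csum = np.concatenate(([0.0], np.cumsum(rain)))
    return (csum[d:] - csum[:-d]).max()

def annual_max_depth_aggregated(rain, d):
    """Maximum depth using only non-overlapping, clock-aligned windows,
    mimicking data pre-aggregated at interval ta = d."""
    n = len(rain) // d * d
    return rain[:n].reshape(-1, d).sum(axis=1).max()

# one synthetic year of hourly rainfall: intermittent exponential bursts
rng = np.random.default_rng(1)
rain = rng.exponential(0.2, size=24 * 365) * (rng.random(24 * 365) < 0.1)
Hd_true = annual_max_depth(rain, 6)          # d = 6 hours
Hd_agg = annual_max_depth_aggregated(rain, 6)
```

Repeating this over many synthetic years gives the empirical distribution of the underestimation error, which is how a relationship between ta/d and the average underestimate can be calibrated.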
Procedia PDF Downloads 191
1038 Screening Diversity: Artificial Intelligence and Virtual Reality Strategies for Elevating Endangered African Languages in the Film and Television Industry
Authors: Samuel Ntsanwisi
Abstract:
This study investigates the transformative role of Artificial Intelligence (AI) and Virtual Reality (VR) in the preservation of endangered African languages. The study is contextualized within the film and television industry, highlighting disparities in screen representation for certain languages in South Africa and underscoring the need for increased visibility and preservation efforts. With globalization and cultural shifts posing significant threats to linguistic diversity, this research explores approaches to language preservation. By leveraging AI technologies, such as speech recognition, translation, and adaptive learning applications, and integrating VR for immersive and interactive experiences, the study aims to create a framework for teaching and passing on endangered African languages. Through digital documentation, interactive language learning applications, storytelling, and community engagement, the research demonstrates how these technologies can empower communities to revitalize their linguistic heritage. The study employs a dual-method approach, combining a rigorous literature review, which analyses existing research on the convergence of AI, VR, and language preservation, with primary data collection through interviews and surveys with ten filmmakers. The literature review establishes a solid foundation for understanding the current landscape, while the interviews with filmmakers provide crucial real-world insights, enriching the study's depth. This balanced methodology ensures a comprehensive exploration of the intersection of AI, VR, and language preservation, offering both theoretical insights and practical perspectives from industry professionals.
Keywords: language preservation, endangered languages, artificial intelligence, virtual reality, interactive learning
Procedia PDF Downloads 61
1037 Qualitative and Quantitative Traits of Processed Farmed Fish in N. W. Greece
Authors: Cosmas Nathanailides, Fotini Kakali, Kostas Karipoglou
Abstract:
The filleting yield and the chemical composition of farmed sea bass (Dicentrarchus labrax), rainbow trout (Oncorhynchus mykiss) and meagre (Argyrosomus regius) were investigated in farmed fish in NW Greece. The results provide an estimate of the quantity of fish required to produce one kilogram of fillet, an estimate required for the operational management of fish processing companies. Furthermore, in this work the ratio of feed input required to produce one kilogram of fish fillet (FYFCR) is presented for the first time as a useful indicator of the ecological footprint of consuming farmed fish. The lowest lipid content appeared in meagre (1.7%) and the highest in trout (4.91%). The lowest fillet yield and fillet-yield feed conversion ratio were found in meagre (FY = 42.17%, FYFCR = 2.48), while the best fillet yield (FY = 53.8%) and FYFCR (2.10) were exhibited by farmed rainbow trout. This research has been co-financed by the European Union (European Social Fund, ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARCHIMEDES III: Investing in knowledge society through the European Social Fund.
Keywords: farmed fish, flesh quality, filleting yield, lipid
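The two quantities reported above are simple ratios. A minimal sketch, assuming FYFCR = FCR / FY (a relation implied by, but not explicitly stated in, the abstract):

```python
def fyfcr(fcr, fillet_yield):
    """Feed (kg) required per kg of fillet: whole-fish feed conversion
    ratio divided by the filleting yield (as a fraction)."""
    return fcr / fillet_yield

def whole_fish_per_kg_fillet(fillet_yield):
    """kg of whole fish needed to produce 1 kg of fillet."""
    return 1.0 / fillet_yield
```

For the trout figures reported (FY = 53.8%, FYFCR = 2.10), this would imply a whole-fish FCR of about 1.13, and roughly 1.86 kg of whole trout per kilogram of fillet.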
Procedia PDF Downloads 309
1036 Quantitative and Fourier Transform Infrared Analysis of Saponins from Three Kenyan Ruellia Species: Ruellia prostrata, Ruellia lineari-bracteolata and Ruellia bignoniiflora
Authors: Christine O. Wangia, Jennifer A. Orwa, Francis W. Muregi, Patrick G. Kareru, Kipyegon Cheruiyot, Eric Guantai
Abstract:
Ruellia (syn. Dipteracanthus) species are wild perennial creepers belonging to the Acanthaceae family. These species are reported to possess anti-inflammatory, analgesic, antioxidant, gastroprotective, anticancer, and immuno-stimulant properties. Phytochemical screening of both aqueous and methanolic extracts of Ruellia species revealed the presence of saponins. Saponins have been reported to possess anti-inflammatory, antioxidant, immuno-stimulant, antihepatotoxic, antibacterial, anticarcinogenic, and antiulcerogenic activities. The objective of this study was to quantify and analyze the Fourier transform infrared (FTIR) spectra of saponins in crude extracts of three Kenyan Ruellia species, namely Ruellia prostrata (RPM), Ruellia lineari-bracteolata (RLB) and Ruellia bignoniiflora (RBK). Sequential organic extraction of the ground whole-plant material was done using petroleum ether (PE), chloroform, ethyl acetate (EtOAc), and absolute methanol by cold maceration, while aqueous extraction was done by hot maceration. The plant powders and extracts were mixed with spectroscopic-grade KBr and compressed into pellets. The infrared spectra were recorded using a Shimadzu 8000-series FTIR spectrophotometer in the range of 3500-500 cm-1. Quantitative determination of the saponins was done using standard procedures. Quantitative analysis showed that RPM had the highest quantity of crude saponins (2.05% ± 0.03), followed by RLB (1.4% ± 0.15) and RBK (1.25% ± 0.11). The FTIR spectra revealed the spectral peaks characteristic of saponins in the RPM, RLB, and RBK plant powders and their aqueous and methanol extracts: O-H absorption (3265-3393 cm-1), C-H absorption (2851-2924 cm-1), C=C absorbance (1628-1655 cm-1), and oligosaccharide-linkage (C-O-C) absorption due to sapogenins (1036-1042 cm-1). The crude saponins from RPM, RLB and RBK showed peaks similar to those of their respective extracts.
The presence of the saponins in extracts of RPM, RLB and RBK may be responsible for some of the biological activities reported in the Ruellia species.
Keywords: Ruellia bignoniiflora, Ruellia lineari-bracteolata, Ruellia prostrata, saponins
Procedia PDF Downloads 181
1035 Observation of a Phase Transition in Adsorbed Hydrogen at 101 Kelvin
Authors: Raina J. Olsen, Andrew K. Gillespie, John W. Taylor, Cristian I. Contescu, Peter Pfeifer, James R. Morris
Abstract:
While adsorbent surfaces such as graphite are known to increase the melting temperature of solid H2, this effect is normally rather small, raising it to 20 Kelvin (K) relative to 14 K in the bulk. An as-yet unidentified phase transition has been observed in a system of H2 adsorbed in a porous, locally graphitic Saran carbon with sub-nanometer-sized pores, at temperatures (74-101 K) and pressures (> 76 bar) well above the critical point of bulk H2, using hydrogen adsorption and neutron scattering experiments. Adsorption data show a discontinuous pressure jump in the kinetics at 76 bar after nearly an hour of equilibration time, which is identified as an exothermic phase transition. This discontinuity is observed in the 87 K isotherm but not in the 77 K isotherm. At higher pressures, the measured isotherms show greater excess adsorption at 87 K than at 77 K. Inelastic neutron scattering measurements also show a striking phase transition, with the amount of high-angle scattering (corresponding to large momentum transfer / large effective mass) increasing by up to a factor of 5 in the novel phase. During the course of the neutron scattering experiment, three of these reversible spectral phase transitions were observed to occur in response to changes in sample temperature alone. The novel phase was observed by neutron scattering only at high H2 pressure (123 bar and 187 bar) and temperatures between 74-101 K in the sample of interest, but not at low pressure (30 bar) or in a control activated carbon at 186 bar of H2 pressure. Based on several of the more unusual observations, such as the slow equilibration and the presence of both an upper and a lower temperature bound, a reasonable hypothesis is that this phase forms only in the presence of a high concentration of ortho-H2 (nuclear spin S=1). The increase in adsorption with temperature, at temperatures which cross the lower temperature bound observed by neutron scattering, indicates that this novel phase is denser.
Structural characterization data on the adsorbent show that it may support a commensurate solid phase denser than those known to exist on graphite at much lower temperatures. Whatever this phase is eventually proven to be, these results show that surfaces can have a more striking effect on hydrogen phases than previously thought.
Keywords: adsorbed phases, hydrogen, neutron scattering, nuclear spin
Procedia PDF Downloads 466
1034 An Approximate Formula for Calculating the Fundamental Mode Period of Vibration of Practical Building
Authors: Abdul Hakim Chikho
Abstract:
Most international codes allow the use of an equivalent lateral load method for designing practical buildings to withstand earthquake actions. This method requires calculating an approximation to the fundamental mode period of vibration of these buildings. Several empirical equations have been suggested to approximate the fundamental periods of different types of structures. Most of these equations are known to provide only a crude approximation to the required fundamental periods, and repeating the calculation with a more accurate formula is usually required. In this paper, a new formula to calculate a satisfactory approximation of the fundamental period of a practical building is proposed. This formula takes into account both the mass and the stiffness of the building and is therefore more rational than the conventional empirical equations. In order to verify the accuracy of the proposed formula, several examples have been solved, in which the fundamental mode periods of several framed buildings were calculated using the proposed formula and the conventional empirical equations. Comparing the obtained results with those obtained from a dynamic computer analysis has shown that the proposed formula provides a more accurate estimation of the fundamental periods of practical buildings. Since the proposed method is still simple to use and requires only a minimum of computing effort, it is believed to be ideally suited for design purposes.
Keywords: earthquake, fundamental mode period, design, building
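The proposed formula itself is not reproduced in the abstract. A classical period estimate that likewise uses both mass and stiffness, and serves to illustrate the idea, is Rayleigh's method: T = 2*pi*sqrt(sum(m_i*d_i^2) / (g*sum(m_i*d_i))), where d_i are the lateral deflections of the floor masses when their weights are applied horizontally. A sketch (story data in the test are illustrative):

```python
import math

def rayleigh_period(masses, deflections, g=9.81):
    """Rayleigh's method for the fundamental period of a framed building:
    T = 2*pi*sqrt(sum(m*d^2) / (g*sum(m*d))).
    masses [kg] are the lumped floor masses; deflections [m] are the lateral
    displacements under the floor weights applied as horizontal loads."""
    num = sum(m * d * d for m, d in zip(masses, deflections))
    den = g * sum(m * d for m, d in zip(masses, deflections))
    return 2.0 * math.pi * math.sqrt(num / den)
```

For a single-story frame with stiffness k, the horizontal deflection under the weight m*g is d = m*g/k, and the formula collapses to the exact SDOF result T = 2*pi*sqrt(m/k).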
Procedia PDF Downloads 284
1033 Bioactive Secondary Metabolites from Culturable Unusual Actinomycetes from Solomon Islands Marine Sediments: Isolation and Characterisation of Bioactive Compounds
Authors: Ahilya Singh, Brad Carte, Ramesh Subramani, William Aalbersberg
Abstract:
A total of 37 actinomycete strains were purified from 25 Solomon Islands marine sediments using four different types of isolation media. Among them, 54% of the strains had an obligate requirement for seawater for growth. The ethyl acetate extract of a 100 ml fermentation product of each strain was screened for antimicrobial activity against multidrug-resistant human pathogens and for cytotoxic activity against brine shrimps. A total of 67% of the ethyl acetate extracts showed antimicrobial and/or cytotoxic activities. Strain F-1915 was selected for the isolation and evaluation of bioactive compound(s) based on its bioactive properties and chemical profile analysis using LC-MS. Strain F-1915 was identified as having 96% sequence similarity to Streptomyces violaceusniger on the basis of 16S rDNA sequences using BLAST analysis. The 16S rDNA revealed that strain F-1915 is a new member of the MAR4 clade of actinomycetes, an interesting clade known for the production of pharmaceutically important hybrid isoprenoid compounds. The ethyl acetate extract of the fermentation product of this strain was purified by silica gel column chromatography, affording the isolation of one bioactive pure compound. Based on its 1D and 2D NMR spectral data, compound 1 was identified as a new mono-brominated phenazinone, marinophenazimycin A, a structure which has already been studied by external collaborators at Scripps Institution of Oceanography but is yet to be published. Compound 1 displayed significant antimicrobial activity against drug-resistant human pathogens. The minimum inhibitory concentration (MIC) of compound 1 against methicillin-resistant Staphylococcus aureus (MRSA) was about 1.9 μg/ml, and the MIC recorded against amphotericin-resistant Candida albicans (ARCA) was about 0.24 μg/ml. The bioactivity of compound 1 against ARCA was found to be better than that of the standard antifungal agent amphotericin B.
Compound 1, however, did not show any cytotoxic activity against brine shrimps.
Keywords: actinomycetes, antimicrobial activity, brominated phenazine, MAR4 clade, marine natural products, multidrug resistant, 1D and 2D NMR
Procedia PDF Downloads 338
1032 Verification of Simulated Accumulated Precipitation
Authors: Nato Kutaladze, George Mikuchadze, Giorgi Sokhadze
Abstract:
Precipitation forecasts are one of the most demanding applications in numerical weather prediction (NWP). Georgia, like the whole Caucasus region, is characterized by very complex topography. The country's territory is prone to flash floods and mudflows, so quantitative precipitation estimation (QPE) and quantitative precipitation forecasting (QPF) at any lead time are very important for Georgia. In this study, the QPF skill of the Advanced Research version of the Weather Research and Forecasting (WRF) model is investigated over Georgia's territory. We have analyzed several combinations of convection parameterization and microphysics schemes for different rainy episodes and heavy rain events. We estimate the errors and biases in accumulated 6 h precipitation at different spatial resolutions during model performance verification for 12-hour and 24-hour lead times against corresponding rain gauge observations and satellite data. Various statistical parameters have been calculated for the 8-month comparison period, and some skills of the model simulation have been evaluated. Our focus is on the formation and organization of convective precipitation systems in a low-mountain region. Several problems in connection with QPF have been identified for mountain regions, including the overestimation and underestimation of precipitation on the windward and lee sides of the mountains, respectively, and a phase error in the diurnal cycle of precipitation, leading to the onset of convective precipitation in model forecasts several hours too early.
Keywords: extremal dependence index, false alarm, numerical weather prediction, quantitative precipitation forecasting
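The categorical scores referenced in the keywords (false alarm, extremal dependence index) are computed from a 2x2 contingency table of forecast versus observed precipitation-threshold exceedances. A minimal sketch (the EDI form follows the standard Ferro-Stephenson definition; the counts in the test are illustrative, not the study's data):

```python
import math

def verification_scores(hits, misses, false_alarms, correct_negatives):
    """Categorical verification scores from a 2x2 contingency table:
    hit rate (POD), false alarm rate F, frequency bias, and the
    extremal dependence index EDI = (ln F - ln H) / (ln F + ln H)."""
    H = hits / (hits + misses)                              # probability of detection
    F = false_alarms / (false_alarms + correct_negatives)   # false alarm rate
    bias = (hits + false_alarms) / (hits + misses)          # frequency bias
    edi = (math.log(F) - math.log(H)) / (math.log(F) + math.log(H))
    return {"POD": H, "F": F, "bias": bias, "EDI": edi}
```

EDI is popular for rare events such as heavy rain because, unlike many traditional scores, it does not degenerate to zero as the event becomes rarer.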
Procedia PDF Downloads 147
1031 The Effect of Manual Acupuncture-induced Injury as a Mechanism Contributing to Muscle Regeneration
Authors: Kamal Ameis
Abstract:
This study aims to further improve our understanding of the underlying mechanism of the local injury that occurs after manual acupuncture needle manipulation and that initiates the muscle regeneration process, which is essential for muscle maintenance and adaptation. Skeletal muscle is maintained by resident stem cells called muscle satellite cells. These cells are normally in a quiescent state but, following muscle injury, they re-enter the cell cycle and execute a myogenic program resulting in muscle fiber regeneration. Our previous work in young rats demonstrated that acupuncture treatment induced injury that activated resident satellite (stem) cells, leading to muscle regeneration. Skeletal muscle regeneration is an adaptive response to injury that requires a tightly orchestrated interplay between signaling pathways activated by growth factors and an intrinsic regulatory program controlled by myogenic transcription factors. We identified several gene expression patterns uniquely important for muscle regeneration in response to acupuncture treatment over different time courses, using biological techniques including immunocytochemistry, western blotting, and real-time PCR. This study uses a novel but non-invasive model of injury induced by manual acupuncture to further our current understanding of the regenerative mechanism of muscle stem cells. From a clinical perspective, this model may be easily translatable into a clinical tool that can be used as an alternative to physical exercise for patients challenged by bed rest or forced inactivity. Finally, the knowledge gained from this research could be useful for studies of the local effects of various modalities of induced injury, such as the traditional method of healing by cupping (hijamah), which may enhance muscle stem cells and muscle fiber regeneration.
Keywords: acupuncture, injury, regeneration, muscle stem cells
Procedia PDF Downloads 148
1030 Effect of Progressive Type-I Right Censoring on Bayesian Statistical Inference of Simple Step-Stress Acceleration Life Testing Plan under Weibull Life Distribution
Authors: Saleem Z. Ramadan
Abstract:
This paper discusses the effects of using progressive Type-I right censoring on the design of a simple step-stress accelerated life testing plan, using a Bayesian approach for Weibull life products under the assumption of the cumulative exposure model. The optimization criterion used in this paper is to minimize the expected pre-posterior variance of the pth percentile of the time to failure. The model variables are the stress changing time and the stress value for the first step. A comparison between conventional and progressive Type-I right censoring is provided. The results have shown that progressive Type-I right censoring reduces the cost of testing at the expense of test precision when the sample size is small. Moreover, the results have shown that using strong priors or a large sample size reduces the sensitivity of the test precision to the censoring proportion. Hence, progressive Type-I right censoring is recommended in these cases, as it reduces the cost of the test with little effect on test precision. Moreover, the results have shown that the choice between direct and indirect priors affects the precision of the test.
Keywords: reliability, accelerated life testing, cumulative exposure model, Bayesian estimation, progressive Type-I censoring, Weibull distribution
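As a small worked illustration of the quantity being optimized, the pth percentile of a Weibull life distribution has the closed form t_p = η(−ln(1−p))^(1/β), where β is the shape and η the scale parameter. A minimal sketch (the shape, scale, and percentile values below are illustrative, not the paper's):

```python
import math

def weibull_percentile(p, shape, scale):
    """pth percentile of a Weibull distribution: the time by which a
    fraction p of units is expected to have failed."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

# Illustrative values only: shape = 2, scale = 1000 h, 10th percentile (B10 life)
b10 = weibull_percentile(0.10, 2.0, 1000.0)
```

In the design problem above it is the pre-posterior variance of such a percentile, under the step-stress cumulative exposure model, that the censoring scheme trades off against test cost.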
Procedia PDF Downloads 505
1029 Compression Index Estimation by Water Content and Liquid Limit and Void Ratio Using Statistics Method
Authors: Lizhou Chen, Abdelhamid Belgaid, Assem Elsayed, Xiaoming Yang
Abstract:
The compression index is essential in foundation settlement calculations. The traditional method for determining the compression index is the consolidation test, which is expensive and time-consuming. Many researchers have used regression methods to develop empirical equations for predicting the compression index from soil properties. Based on a large number of compression index data collected from consolidation tests, the accuracy of several popular empirical equations was assessed. It was found that the primary compression index is significantly overestimated by some equations and underestimated by others. Sensitivity analyses of soil parameters, including water content, liquid limit, and void ratio, were performed. The results indicate that the compression index obtained from the void ratio is the most accurate. An ANOVA (analysis of variance) demonstrates that equations with multiple soil parameters cannot provide better predictions than equations with a single soil parameter; in other words, it is not necessary to develop relationships between the compression index and multiple soil parameters. Meanwhile, it was noted that the secondary compression index is approximately 0.7-5.0% of the primary compression index, with an average of 2.0%. Finally, prediction equations developed using the power regression technique are proposed, which give more accurate predictions than existing equations.
Keywords: compression index, clay, settlement, consolidation, secondary compression index, soil parameter
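The power regression technique mentioned above fits a relationship of the form Cc = a·e0^b by ordinary least squares on log-transformed data. A minimal sketch (the void-ratio data and the coefficients 0.30 and 1.2 are synthetic, for illustration only, not the paper's fitted values):

```python
import math

def power_fit(xs, ys):
    """Fit y = a * x**b by linear least squares on log-log data."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(xs)
    mx = sum(lx) / n
    my = sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical void-ratio / compression-index pairs (exact power law, so the
# fit recovers the generating coefficients)
e0 = [0.6, 0.8, 1.0, 1.2, 1.5]
cc = [0.30 * e ** 1.2 for e in e0]
a, b = power_fit(e0, cc)
```

With real consolidation-test data the fit would of course show scatter, and the residuals would drive the accuracy comparison against the existing equations.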
Procedia PDF Downloads 163
1028 The Cartometric-Geographical Analysis of Ivane Javakhishvili's 1922 Map of the Republic of Georgia
Authors: Manana Kvetenadze, Dali Nikolaishvili
Abstract:
The study revealed the territorial changes of Georgia before the Soviet and post-Soviet periods. This includes the estimation of the country's borders, the change in its administrative-territorial arrangement, and the establishment of its territorial losses. Georgia's old and new borders marked on the map are of great interest. The new boundary shows the situation in 1922, at the start of the Soviet period. Neither on this map nor in his other works does Ivane Javakhishvili state what he means by the old borders, though it is evident that this is the pre-Soviet boundary up to 1921, i.e., the period when historical Tao, Zaqatala, Lore, and Karaia were still parts of Georgia. In cartometric-geographical terms, the work presents a detailed analysis of Georgia's borders, and the research results were compared 1) with the boundary line on Soviet topographic maps at scales of 1:100,000, 1:50,000, and 1:25,000, and 2) with Ivane Javakhishvili's work ('The Borders of Georgia in Terms of Historical and Contemporary Issues'). In this research, we used a multidisciplinary methodology and software. We used ArcGIS for georeferencing the maps and then compared all post-Soviet maps in order to determine how the borders had changed. We also made use of many historical sources. The features of the spatial distribution of the administrative-territorial units of Georgia, as well as of the objects depicted on the map, have been established. The results obtained are presented in the form of thematic maps and diagrams.
Keywords: border, GIS, Georgia, historical cartography, old maps
Procedia PDF Downloads 242
1027 Identification Strategies for Unknown Victims from Mass Disasters and Unknown Perpetrators from Violent Crime or Terrorist Attacks
Authors: Michael Josef Schwerer
Abstract:
Background: The identification of unknown victims of mass disasters, violent crimes, or terrorist attacks is frequently facilitated by information from missing persons lists, portrait photos, old or recent pictures showing unique characteristics of a person such as scars or tattoos, or simply reference samples from blood relatives for DNA analysis. In contrast, the identification, or at least the characterization, of an unknown perpetrator of criminal or terrorist actions remains challenging, particularly in the absence of material or data for comparison, such as fingerprints previously stored in criminal records. In scenarios that result in a high level of destruction of the perpetrator's corpse, for instance, blast or fire events, the chance of a positive identification using standard techniques is further impaired. Objectives: This study shows the forensic genetic procedures used in the Legal Medicine Service of the German Air Force for the identification of unknown individuals, including cases in which reference samples are not available. Scenarios requiring such efforts predominantly involve aircraft crash investigations, which are routinely carried out by the German Air Force Centre of Aerospace Medicine as one of the institution's essential missions. Further, casework by the military police or military intelligence is supported on the basis of administrative cooperation. In the talk, data from study projects, as well as examples from real casework, will be demonstrated and discussed with the audience. Methods: Forensic genetic identification in our laboratories involves the analysis of short tandem repeats and single nucleotide polymorphisms in nuclear DNA, along with mitochondrial DNA haplotyping. Extended DNA analysis involves phenotypic markers for skin, hair, and eye color, together with the investigation of a person's biogeographic ancestry. Assessment of the biological age of an individual employs CpG-island methylation analysis using bisulfite-converted DNA. Forensic investigative genealogy allows the detection of an unknown person's blood relatives in reference databases. Technically, end-point PCR, real-time PCR, capillary electrophoresis, and pyrosequencing, as well as next-generation sequencing on flow-cell-based and chip-based systems, are used. Results and Discussion: Optimization of DNA extraction from various sources, including difficult matrices like formalin-fixed, paraffin-embedded tissues and degraded specimens from decomposed bodies or from decedents exposed to blast or fire events, provides the basis for successful PCR amplification and subsequent genetic profiling. For cases with extremely low yields of extracted DNA, whole-genome preamplification protocols are successfully used, particularly for genetic phenotyping. Improved primer design for CpG-methylation analysis, together with validated sampling strategies for the analyzed substrates from, e.g., lymphocyte-rich organs, allows successful biological age estimation even in bodies with highly degraded tissue material. Conclusions: Successful identification of unknown individuals, or at least their phenotypic characterization using pigmentation markers together with age-informative methylation profiles, possibly supplemented by a family tree search employing forensic investigative genealogy, can be provided in specialized laboratories. However, standard laboratory procedures must be adapted to work with difficult and highly degraded sample materials.
Keywords: identification, forensic genetics, phenotypic markers, CpG methylation, biological age estimation, forensic investigative genealogy
Procedia PDF Downloads 51
1026 Enhanced Calibration Map for a Four-Hole Probe for Measuring High Flow Angles
Authors: Jafar Mortadha, Imran Qureshi
Abstract:
This research compares modern techniques for measuring the flow angles of a flowing fluid with the traditional technique of using multi-hole pressure probes. In particular, the focus of the study is on four-hole probes, which offer great reliability and benefits in several applications where the use of modern measurement techniques is either inconvenient or impractical. Thanks to modern advances in manufacturing, small multi-hole pressure probes can be made with high precision, which eliminates the need to calibrate every manufactured probe. This study aims to extend the range of the calibration maps for a four-hole probe so that high flow angles can be measured accurately. The research methodology comprises a literature review of the calibration definitions that have been successfully implemented on five-hole probes. These definitions are then adapted and applied to a four-hole probe using a set of raw pressure data. A comparison of the different definitions is carried out in MATLAB, and the results are analyzed to determine the best calibration definition. Taking simplicity of implementation into account, as well as the reliability of flow angle estimation, a technique adapted from a research paper written in 2002 offered the most promising outcome. Consequently, the method is seen as a good enhancement for four-hole probes, and it can substitute for existing calibration definitions that offer less accuracy.
Keywords: calibration definitions, calibration maps, flow measurement techniques, four-hole probes, multi-hole pressure probes
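Calibration definitions of the kind compared above typically non-dimensionalize hole-pressure differences by a pseudo-dynamic pressure. The sketch below shows one common family of such definitions; the hole numbering, the averaging scheme, and the coefficient forms are assumptions for illustration, not the specific definitions evaluated in the paper:

```python
def calibration_coefficients(p1, p2, p3, p4):
    """Non-dimensional calibration coefficients for a four-hole probe.
    Assumed hole numbering: p1 is the central (most upstream-facing) hole,
    p2 and p3 the yaw-sensitive side holes, p4 the pitch-sensitive hole."""
    p_avg = (p2 + p3 + p4) / 3.0          # mean of the peripheral holes
    q = p1 - p_avg                        # pseudo-dynamic pressure
    c_yaw = (p2 - p3) / q                 # asymmetry between the side holes
    c_pitch = (p4 - (p2 + p3) / 2.0) / q  # pitch hole vs. side-hole mean
    return c_yaw, c_pitch

# Symmetric pressures (flow aligned with the probe): both coefficients vanish
cy, cp = calibration_coefficients(100.0, 90.0, 90.0, 90.0)
```

During calibration, these coefficients are tabulated against known flow angles to build the calibration map; in use, measured pressures are converted to coefficients and the map is inverted to recover the angles.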
Procedia PDF Downloads 295
1025 Well-Being Inequality Using Superimposing Satisfaction Waves: Heisenberg Uncertainty in Behavioral Economics and Econometrics
Authors: Okay Gunes
Abstract:
In this article, we propose, for the first time in the literature on this subject, a new method for measuring well-being inequality through a model composed of superimposed satisfaction waves. The displacement of a household's satisfactory state (i.e., its satisfaction) is defined on a satisfaction string. The duration of the satisfactory state over a given period of time is measured in order to determine the relationship between utility and total satisfactory time, itself dependent on the density and tension of each satisfaction string. Thus, individual cardinal total satisfaction values are computed by way of a one-dimensional scalar sinusoidal (harmonic) traveling wave function, using satisfaction waves with varying amplitudes and frequencies, which allows us to measure well-being inequality. One advantage of using satisfaction waves is the ability to show that individual utility and consumption amounts would probably not commute; hence it is impossible to measure or to know simultaneously the values of these observables from the dataset. We therefore crystallize the problem by using a Heisenberg-type uncertainty resolution for self-adjoint economic operators. We propose to eliminate any estimation bias by correlating the standard deviations of selected economic operators; this is achieved by replacing the aforementioned observed uncertainties with households' perceived uncertainties (i.e., corrected standard deviations) obtained through the logarithmic psychophysical law proposed by Weber and Fechner.
Keywords: Heisenberg uncertainty principle, superimposing satisfaction waves, Weber–Fechner law, well-being inequality
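The superposition idea can be sketched numerically: satisfaction is a sum of sinusoids with varying amplitudes and frequencies, and the "total satisfactory time" is the fraction of the period spent above a threshold. The wave parameters, the threshold, and the midpoint-sampling scheme below are illustrative assumptions, not the paper's calibrated model:

```python
import math

def satisfaction(t, waves):
    """Superimposed satisfaction displacement at time t.
    waves: list of (amplitude, frequency, phase) tuples."""
    return sum(a * math.sin(2 * math.pi * f * t + ph) for a, f, ph in waves)

def total_satisfactory_time(waves, horizon=1.0, steps=1000, threshold=0.0):
    """Crude numerical proxy for the duration of the satisfactory state:
    time within [0, horizon] that the superposed wave exceeds the threshold,
    estimated by midpoint sampling."""
    dt = horizon / steps
    above = sum(1 for k in range(steps)
                if satisfaction((k + 0.5) * dt, waves) > threshold)
    return above * dt

# A single unit-amplitude wave is above zero for exactly half its period
frac = total_satisfactory_time([(1.0, 1.0, 0.0)])
```

Comparing such total satisfactory times across households with different amplitude/frequency profiles is what yields the inequality measure in the model above.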
Procedia PDF Downloads 441
1024 Simulation of Improving the Efficiency of a Fire-Tube Steam Boiler
Authors: Roudane Mohamed
Abstract:
In this study, we are interested in improving the efficiency of a 4.5 t/h steam boiler and minimizing the flue gas discharge temperature by adding a counter-current heat exchanger at the boiler outlet. The mathematical approach to the problem is based on the convection and conduction heat transfer equations. These equations were chosen because of their extensive use in a wide range of applications. A software program was developed to solve the equations governing these phenomena and to estimate the thermal characteristics of the boiler, by studying the thermal characteristics of the heat exchanger with both the LMTD and NTU methods. Subsequently, an analysis of the thermal performance of the steam boiler was carried out by studying the influence of different operating parameters on heat flux densities, temperatures, exchanged power, and efficiency. The study showed that the behavior of the boiler is strongly influenced by these parameters. In the first regime (P = 3.5 bar), the boiler efficiency improved significantly from 93.03% to 99.43%, i.e., at rates of 6.47% and 4.5%. At the maximum flow rate, the change is smaller, on the order of 1.06%. The results obtained in this study are of great interest to industrial utilities equipped with smoke-tube boilers, where the preheated air temperature is used to calculate the actual gas temperature, so that the heat exchanged is increased and the smoke discharge temperature minimized. In addition, this work could be used as a computational model in the design process.
Keywords: numerical simulation, efficiency, fire tube, heat exchanger, convection and conduction
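The LMTD method mentioned above sizes the exchanger from Q = U·A·ΔT_lm, where ΔT_lm is the log-mean of the end temperature differences. A minimal counter-current sketch (the temperatures in the example are illustrative, not the boiler's operating data):

```python
import math

def lmtd_counterflow(t_hot_in, t_hot_out, t_cold_in, t_cold_out):
    """Log-mean temperature difference for a counter-current exchanger."""
    d1 = t_hot_in - t_cold_out   # temperature difference at one end
    d2 = t_hot_out - t_cold_in   # temperature difference at the other end
    if abs(d1 - d2) < 1e-12:     # limiting case: equal end differences
        return d1
    return (d1 - d2) / math.log(d1 / d2)

# Flue gas cooled 200 -> 120 degC while preheating air 20 -> 80 degC
dt_lm = lmtd_counterflow(200.0, 120.0, 20.0, 80.0)
```

The NTU method, by contrast, works from the effectiveness ε = Q/Q_max and is more convenient when the outlet temperatures are unknown, which is why the study applies both.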
Procedia PDF Downloads 218
1023 Sidelobe-Free Inverse Synthetic Aperture Radar Imaging of Non-Cooperative Moving Targets Using WiFi
Authors: Jiamin Huang, Shuliang Gui, Zengshan Tian, Fei Yan, Xiaodong Wu
Abstract:
In recent years, with the rapid development of radio frequency technology, the differences between radar sensing and wireless communication in terms of transmit and receive channels, signal processing, data management, and control have gradually shrunk, and there has been a trend toward integrated communication and radar sensing. However, most existing radar imaging technologies based on communication signals are combined with synthetic aperture radar (SAR) imaging, which does not match the practical application case of integrated communication and radar. Therefore, this paper proposes a high-precision imaging method for communication signals based on the imaging mechanism of inverse synthetic aperture radar (ISAR). The method makes full use of the structural characteristics of the orthogonal frequency division multiplexing (OFDM) signal, so that the sidelobe effect in range compression is removed, and combines the Radon transform with fractional Fourier transform (FrFT) parameter estimation to achieve ISAR imaging of non-cooperative targets. Simulation experiments and measured results verify the feasibility and effectiveness of the method and demonstrate its broad application prospects in the field of intelligent transportation.
Keywords: integration of communication and radar, OFDM, Radon transform, FrFT, ISAR
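The sidelobe-free property comes from the structure of OFDM: dividing the received subcarriers element-wise by the known transmitted symbols strips the data modulation, so range compression reduces to an exact inverse DFT with no matched-filter sidelobes. A toy single-scatterer sketch (subcarrier count and delay bin are illustrative):

```python
import cmath

def idft(spec):
    """Inverse discrete Fourier transform of a list of complex samples."""
    n = len(spec)
    return [sum(spec[k] * cmath.exp(2j * cmath.pi * k * m / n)
                for k in range(n)) / n
            for m in range(n)]

n, d = 8, 3                                              # 8 subcarriers, delay bin 3
x = [cmath.exp(1j * 0.7 * k) for k in range(n)]          # known transmitted symbols
h = [cmath.exp(-2j * cmath.pi * k * d / n) for k in range(n)]  # delay-d channel
y = [xk * hk for xk, hk in zip(x, h)]                    # received subcarriers
g = [yk / xk for yk, xk in zip(y, x)]                    # division removes modulation
profile = idft(g)                                        # range profile: single clean peak at bin d
```

In the full method, a sequence of such range profiles over slow time is then processed with the Radon transform and FrFT parameter estimation to focus the moving, non-cooperative target.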
Procedia PDF Downloads 126
1022 TessPy – Spatial Tessellation Made Easy
Authors: Jonas Hamann, Siavash Saki, Tobias Hagen
Abstract:
The discretization of urban areas is a crucial aspect of many spatial analyses. The process of discretizing space into subspaces without overlaps or gaps is called tessellation. It aids the understanding of space and provides a framework for analyzing geospatial data. Tessellation methods can be divided into two groups: regular tessellations and irregular tessellations. While regular tessellation methods, such as square grids or hexagon grids, are suitable for addressing purely geometric problems, they cannot take the unique characteristics of different subareas into account. Irregular tessellation methods, however, allow the borders between subareas to be defined more realistically based on urban features like the road network or Points of Interest (POI). Even though Python is one of the most widely used programming languages for spatial analysis, there is currently no library that combines different tessellation methods to enable users and researchers to compare techniques. To close this gap, we propose TessPy, an open-source Python package that combines all of the above-mentioned tessellation methods and makes them easily accessible to everyone. The core functions of TessPy implement the five tessellation methods: squares, hexagons, adaptive squares, Voronoi polygons, and city blocks. With the regular methods, users can set the resolution of the tessellation, which defines the fineness of the discretization and the desired number of tiles. The irregular tessellation methods allow users to define which spatial data to consider (e.g., amenity, building, office) and how fine the tessellation should be. The spatial data used are open-source and provided by OpenStreetMap, and can be easily extracted and used for further analyses. Besides the methodology of the different techniques, the state of the art, including examples and future work, will be discussed.
All dependencies can be installed using conda or pip; the former is recommended.
Keywords: geospatial data science, geospatial data analysis, tessellations, urban studies
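The simplest of the regular methods, the square grid, can be illustrated in a few lines: a bounding box is covered by cells of a chosen size, and the cell size plays the role of the resolution parameter described above. This is a plain illustration of the idea, not TessPy's actual API:

```python
def square_grid(min_x, min_y, max_x, max_y, cell):
    """Regular square tessellation of a bounding box.  Each tile is a
    closed polygon given as a list of (x, y) corner tuples; tiles cover
    the box without overlaps or gaps."""
    tiles = []
    y = min_y
    while y < max_y:
        x = min_x
        while x < max_x:
            tiles.append([(x, y), (x + cell, y),
                          (x + cell, y + cell), (x, y + cell), (x, y)])
            x += cell
        y += cell
    return tiles

tiles = square_grid(0.0, 0.0, 2.0, 2.0, 1.0)  # a 2x2 grid of unit squares
```

Halving the cell size quadruples the number of tiles, which is exactly the resolution/tile-count trade-off users control in the regular methods; the irregular methods replace these fixed cell borders with ones derived from OpenStreetMap features.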
Procedia PDF Downloads 128
1021 The Characteristics of Quantity Operation for 2nd and 3rd Grade Mathematics Slow Learners
Authors: Pi-Hsia Hung
Abstract:
The development of mathematical competency has individual benefits as well as benefits to wider society. Children who begin school behind their peers in their understanding of number, counting, and simple arithmetic are at high risk of staying behind throughout their schooling. The development of effective strategies for improving the educational trajectory of these individuals will be contingent on identifying the areas of early quantitative knowledge that influence later mathematics achievement. A computer-based quantity assessment was developed in this study to investigate the characteristics of 2nd and 3rd grade slow learners in quantity. The concept of quantification involves understanding measurements, counts, magnitudes, units, indicators, relative size, and numerical trends and patterns. Fifty-five quantitative reasoning tasks, covering number sense, mental calculation, estimation, and assessment of the reasonableness of results, are included as quantity problem solving. Thus, quantity is defined in this study as applying knowledge of number and number operations in a wide variety of authentic settings. Around 1,000 students were tested and categorized into four performance levels. Students' quantity ability correlated more highly with their school mathematics grades than with their grades in other subjects. Around 20% of the students were below the basic level. The implications of the preliminary item map for intervention design are discussed.
Keywords: mathematics assessment, mathematical cognition, quantity, number sense, validity
Procedia PDF Downloads 247
1020 Real-Time Radiological Monitoring of the Atmosphere Using an Autonomous Aerosol Sampler
Authors: Miroslav Hyza, Petr Rulik, Vojtech Bednar, Jan Sury
Abstract:
Early and reliable detection of an increased radioactivity level in the atmosphere is one of the key aspects of atmospheric radiological monitoring. Although standard laboratory procedures provide detection limits as low as a few µBq/m³, their major drawback is delayed result reporting: typically a few days. This issue is the main objective of the HAMRAD project, which gave rise to a prototype of an autonomous monitoring device. It is based on the idea of sequential aerosol sampling using a carousel sample changer combined with a gamma-ray spectrometer. In our hardware configuration, the air is drawn through a filter positioned on the carousel so that it can be rotated into the measuring position after a preset sampling interval. Filter analysis is performed with a 50% relative efficiency HPGe detector inside 8.5 cm lead shielding. The spectrometer output signal is then analyzed using DSP electronics and the Gamwin software with preset nuclide libraries and other analysis parameters. After the counting, the filter is placed into a storage bin with a capacity of 250 filters, so that the device can run autonomously for several months depending on the preset sampling frequency. The device is connected to a central server via GPRS/GSM, through which the user can view monitoring data, including raw spectra and technological data describing the state of the device. All operating parameters can be adjusted remotely through a simple GUI. The flow rate is continuously adjustable up to 10 m³/h. The main challenge in spectrum analysis is the subtraction of the natural background. As the detection limits are heavily influenced by the deposited activity of radon decay products and the measurement time is fixed, there must exist an optimal sample decay time (delayed spectrum acquisition). To solve this problem, we adopted a simple procedure based on sequential spectrum acquisition and an optimal partial spectral sum with respect to the detection limits for a particular radionuclide.
The prototype device proved able to detect atmospheric contamination at the mBq/m³ level for an 8 h sampling interval.
Keywords: aerosols, atmosphere, atmospheric radioactivity monitoring, autonomous sampler
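For context, gamma-spectrometric detection limits of the kind optimized above are conventionally derived from Currie's formulas for the decision threshold and detection limit at 5% false-positive/false-negative risks. The abstract does not state which convention the authors use, so the sketch below is an illustration, not their procedure:

```python
import math

def currie_limits(background_counts):
    """Currie decision threshold L_C and detection limit L_D (in counts)
    for a paired measurement with well-known background, at the usual
    5% false-positive and false-negative risks."""
    b = background_counts
    l_c = 2.33 * math.sqrt(b)         # reject "no activity" above this
    l_d = 2.71 + 4.65 * math.sqrt(b)  # smallest reliably detectable signal
    return l_c, l_d

lc, ld = currie_limits(100.0)  # e.g., 100 background counts in the peak region
```

Dividing L_D by detection efficiency, gamma emission probability, counting time, and sampled air volume converts counts to a concentration limit in Bq/m³, which is why the radon-product background and the chosen decay delay dominate the achievable mBq/m³ figure.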
Procedia PDF Downloads 150
1019 Image Processing Techniques for Surveillance in Outdoor Environment
Authors: Jayanth C., Anirudh Sai Yetikuri, Kavitha S. N.
Abstract:
This paper explores the development and application of computer vision and machine learning techniques for real-time pose detection, facial recognition, and number plate extraction. Utilizing MediaPipe for pose estimation, the research presents methods for detecting hand raises and ducking postures through real-time video analysis. Complementarily, facial recognition is employed to compare and verify individual identities using the face_recognition library. Additionally, the paper demonstrates a robust approach for extracting and storing vehicle number plates from images, integrating optical character recognition (OCR) with a database management system. The study highlights the effectiveness and versatility of these technologies in practical scenarios, including security and surveillance applications. The findings underscore the potential of combining computer vision techniques to address diverse challenges and to enhance automated systems for both individual and vehicular identification. This research contributes to the fields of computer vision and machine learning by providing scalable solutions and demonstrating their applicability in real-world contexts.
Keywords: computer vision, pose detection, facial recognition, number plate extraction, machine learning, real-time analysis, OCR, database management
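The step of integrating OCR output with a database typically involves normalizing the raw text and validating it against a plate format before storage. A minimal sketch (the plate pattern below is the Indian format, chosen for illustration; the schema and helper names are assumptions, not the paper's implementation):

```python
import re
import sqlite3

def store_plate(conn, raw_text):
    """Normalize OCR output and insert it if it matches the assumed plate
    pattern (two letters, two digits, one or two letters, four digits).
    Returns the stored plate string, or None if validation fails."""
    plate = re.sub(r"[^A-Z0-9]", "", raw_text.upper())  # strip noise/spacing
    if not re.fullmatch(r"[A-Z]{2}\d{2}[A-Z]{1,2}\d{4}", plate):
        return None
    conn.execute("INSERT INTO plates(number) VALUES (?)", (plate,))
    conn.commit()
    return plate

conn = sqlite3.connect(":memory:")          # in-memory DB for the example
conn.execute("CREATE TABLE plates (number TEXT)")
result = store_plate(conn, " ka 01 ab 1234 ")  # noisy OCR output
```

Rejecting malformed reads at this stage keeps OCR noise out of the database, so downstream vehicle lookups stay reliable.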
Procedia PDF Downloads 26
1018 MAGNI Dynamics: A Vision-Based Kinematic and Dynamic Upper-Limb Model for Intelligent Robotic Rehabilitation
Authors: Alexandros Lioulemes, Michail Theofanidis, Varun Kanal, Konstantinos Tsiakas, Maher Abujelala, Chris Collander, William B. Townsend, Angie Boisselle, Fillia Makedon
Abstract:
This paper presents a home-based robot-rehabilitation instrument, called "MAGNI Dynamics", that utilizes a vision-based kinematic/dynamic module and an adaptive haptic feedback controller. The system is expected to provide personalized rehabilitation by adjusting its resistive and supportive behavior according to a fuzzy intelligence controller that acts as an inference system, correlating the user's performance with different stiffness factors. The vision module uses the Kinect's skeletal tracking to monitor the user's effort in an unobtrusive and safe way by estimating the torque acting on the user's arm. The system's torque estimates are validated against electromyographic data captured from primitive arm motions (shoulder abduction and shoulder forward flexion). Moreover, we present and analyze how the Barrett WAM generates a force field with a haptic controller to support or challenge the users. Experiments show that shifting the proportional value, which corresponds to different stiffness factors of the haptic path, can potentially help users improve their motor skills. Finally, potential areas for future research are discussed, addressing how a rehabilitation robotic framework may incorporate multi-sensing data to improve the user's recovery process.
Keywords: human-robot interaction, Kinect, kinematics, dynamics, haptic control, rehabilitation robotics, artificial intelligence
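A fuzzy inference system that maps performance to a stiffness factor can be sketched with a tiny Mamdani-style rule base: fuzzify the performance score, weight singleton stiffness outputs by rule strength, and defuzzify by centroid. The membership shapes and output levels below are illustrative assumptions, not the paper's tuned controller:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def stiffness_from_performance(score):
    """Map a normalized performance score in [0, 1] to a haptic stiffness
    factor: poor performance yields a low (supportive) stiffness, good
    performance a high (resistive, challenging) one."""
    low = tri(score, -0.5, 0.0, 0.5)    # rule: performance low  -> support
    mid = tri(score, 0.0, 0.5, 1.0)     # rule: performance mid  -> neutral
    high = tri(score, 0.5, 1.0, 1.5)    # rule: performance high -> resist
    weights = [(low, 0.2), (mid, 0.5), (high, 0.9)]  # stiffness singletons
    num = sum(w * s for w, s in weights)
    den = sum(w for w, _ in weights)
    return num / den if den else 0.5
```

Feeding the vision module's per-session performance estimate through such a map would realize the adaptive resistive/supportive behavior the abstract describes.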
Procedia PDF Downloads 329