Search results for: action based method
37815 Operative Tips of Strattice Based Breast Reconstruction
Authors: Cho Ee Ng, Hazem Khout, Tarannum Fasih
Abstract:
Acellular dermal matrices are increasingly used to reinforce the lower pole of the breast during implant breast reconstruction. No standard technique for the use of this product has been described in the literature. In this article, we share our operative method of fixation.
Keywords: strattice, acellular dermal matrix, breast reconstruction, implant
Procedia PDF Downloads 396
37814 Improving Performance of K₂CO₃ Sorbent Using Core/Shell Alumina-Based Supports in a Multicycle CO₂ Capture Process
Authors: S. Toufigh Bararpour, Amir H. Soleimanisalim, Davood Karami, Nader Mahinpey
Abstract:
The continued increase in the atmospheric concentration of CO₂ is expected to have a great impact on the climate. Reducing CO₂ emissions to the atmosphere therefore requires an efficient and cost-effective technique. Using regenerable solid sorbents, especially K₂CO₃, is a promising method for low-temperature CO₂ capture. Pure K₂CO₃ is a deliquescent substance that requires modification before it can be used in cyclic operations. For this purpose, various types of additives and supports have been used to improve the structure of K₂CO₃. However, the hydrophilicity and reactivity of the support materials with K₂CO₃ have a negative effect on the CO₂ capture capacity of the sorbents. In this research, two kinds of alumina supports (γ-alumina and boehmite) were used. In order to decrease the supports' hydrophilicity and reactivity with K₂CO₃, nonreactive additives such as titania, zirconia, and silica were incorporated into their structures. These materials provide a shell around the alumina that protects it from undesirable reactions and improves its properties. K₂CO₃-based core/shell-supported sorbents were fabricated in two preparation steps: the sol-gel method was applied to shell the supports, and the shelled supports were then impregnated with K₂CO₃. The physicochemical properties of the sorbents were determined using SEM and BET analyses, and their CO₂ capture capacity was quantified using a thermogravimetric analyzer. The type of shell material was shown to have an important effect on the water adsorption capacity of the sorbents. Supported K₂CO₃ modified by a titania shell showed the lowest hydrophilicity among the prepared samples. Based on the obtained results, incorporating nonreactive additives into boehmite had an outstanding impact on the CO₂ capture performance of the sorbent, and incorporating titania into the boehmite-supported K₂CO₃ enhanced its CO₂ capture capacity significantly.
Therefore, further study of this novel fabrication technique is highly recommended. In the second phase of this research project, the CO₂ capture performance of the sorbents in fixed and fluidized bed reactors will be investigated.
Keywords: CO₂ capture, core/shell support, K₂CO₃, post-combustion
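The capture capacity reported from a thermogravimetric analyzer is conventionally derived from the sorbent's mass gain during carbonation. A minimal sketch of that conversion; the function name and sample masses are illustrative, not data from the study:

```python
# Convert a TGA mass gain into a CO2 capture capacity (mmol CO2 per g sorbent).
# Illustrative values only; not measurements from the study.

M_CO2 = 44.01  # molar mass of CO2, g/mol

def capture_capacity(m_initial_g, m_final_g):
    """Capacity in mmol of CO2 captured per gram of fresh sorbent."""
    mass_gain_g = m_final_g - m_initial_g          # mass of adsorbed CO2
    return mass_gain_g / M_CO2 / m_initial_g * 1000.0

# Example: a 10 mg sample gaining 1.1 mg during carbonation (~2.5 mmol/g)
capacity = capture_capacity(0.010, 0.0111)
```

Comparing this number across carbonation-regeneration cycles is what quantifies the multicycle stability the abstract discusses.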
Procedia PDF Downloads 150
37813 Testing Method of Soil Failure Pattern of Sand Type as an Effort to Minimize the Impact of the Earthquake
Authors: Luthfi Assholam Solamat
Abstract:
Nowadays, many people do not know the soil failure pattern, even though it is an important consideration when planning the substructure that carries the loading that occurs. This is because the soil lies under the foundation and cannot be observed directly. From this arose the idea of a study for testing the soil failure pattern, especially of sandy soil under the foundation. This is necessary for the design of building structures, since the substructure is the part of the foundation first met by waves/vibrations during an earthquake; if the underground structure is not strong, the building above it is feared to be more vulnerable to damage. This research focuses on finding the soil failure pattern that is most applicable in the field, using periodic repeated loading over a given time with the help of integrated visual video observation. The results could be useful for planning the substructure in an effort to minimize the earthquake risk to the superstructure.
Keywords: soil failure pattern, earthquake, under structure, sand soil testing method
Procedia PDF Downloads 358
37812 A Spatial Hypergraph Based Semi-Supervised Band Selection Method for Hyperspectral Imagery Semantic Interpretation
Authors: Akrem Sellami, Imed Riadh Farah
Abstract:
Hyperspectral imagery (HSI) typically provides a wealth of information captured over a wide range of the electromagnetic spectrum for each pixel in the image. Hence, a pixel in HSI is a high-dimensional vector of intensities with a large spectral range and a high spectral resolution, which makes semantic interpretation a challenging task of HSI analysis. In this paper, we focus on object classification as HSI semantic interpretation. HSI classification still faces several issues, among them the spatial variability of spectral signatures, the high number of spectral bands, and the high cost of true sample labeling. The high number of spectral bands combined with the low number of training samples poses the problem of the curse of dimensionality. To resolve this problem, we propose to introduce a dimensionality reduction process to improve the classification of HSI. The presented approach is a semi-supervised band selection method based on a spatial hypergraph embedding model that represents higher-order relationships, with different weights for the spatial neighbors of each pixel's centroid. This semi-supervised band selection is developed to select useful bands for object classification. The approach is evaluated on AVIRIS and ROSIS HSIs and compared to other dimensionality reduction methods. The experimental results demonstrate the efficacy of our approach compared to many existing dimensionality reduction methods for HSI classification.
Keywords: dimensionality reduction, hyperspectral image, semantic interpretation, spatial hypergraph
Procedia PDF Downloads 306
37811 Nursing System Development in Patients Undergoing Operation in 3C Ward: Early Ambulation in Patients with Head and Neck Cancer
Authors: Artitaya Sabangbal, Darawan Augsornwan, Palakorn Surakunprapha, Lalida Petphai
Abstract:
Background: Srinagarind Hospital Ward 3C treats about 180 patients with head and neck cancer per year. Almost all of these patients suffer from pain, fatigue, low self-image, and swallowing problems, and when the tumor grows larger they develop breathing problems. Many of them have complications after the operation, such as pressure sores, pneumonia, and deep vein thrombosis. Nursing activity is very important to prevent these complications, especially by promoting early ambulation. The objective of this study was to develop an early ambulation protocol for patients with head and neck cancer undergoing operation. Method: This study is one part of the nursing system development for patients undergoing operation in Ward 3C. It is participatory action research divided into 3 phases. Phase 1, situation review: we reviewed the clinical outcomes and the process of care from documents such as nurses' notes, and interviewed nurses, patients, and families about early ambulation. Phase 2: we searched previous studies for nursing interventions on early ambulation and then established the protocol, including a picture package illustrating early ambulation. Phase 3: implementation and evaluation. Result: 100% of patients with head and neck cancer followed the early ambulation protocol after the operation; 85% of patients followed the protocol within 2 days after the operation, and 100% within 3 days. No complications occurred. Patient satisfaction was at the very good level for 58% and at the good level for 42%. Length of hospital stay was 6 days for patients with wide excision and 16 days for patients with flap coverage. Conclusion: The early ambulation protocol is appropriate for patients with head and neck cancer who undergo operation, as it can restore physical health, reduce complications, and increase patient satisfaction.
Keywords: nursing system, early ambulation, head and neck cancer, operation
Procedia PDF Downloads 229
37810 Velma-ARC’s Rehabilitation of Repentant Cybercriminals in Nigeria
Authors: Umukoro Omonigho Simon, Ashaolu David ‘Diya, Aroyewun-Olaleye Temitope Folashade
Abstract:
The VELMA Action to Reduce Cybercrime (ARC) is an initiative, the first of its kind in Nigeria, designed to identify, rehabilitate and empower repentant cybercrime offenders, popularly known as 'yahoo boys' in Nigerian parlance. Velma ARC provides social inclusion boot camps with the goal of rehabilitating cybercriminals via psychotherapeutic interventions, improving their IT skills, and empowering them to make constructive contributions to society. This report highlights the psychological interventions provided for participants of the maiden edition of the Velma ARC boot camp and presents the outcomes of these interventions. The boot camp was set up in hotel premises booked solely for the one-month event. The participants were selected and invited via the Velma online recruitment portal, based on an objective double-blind selection process applied to the pool of potential participants who had signified interest through the registration portal. The participants first underwent psychological profiling (personality, symptomology and psychopathology) before the individual and group sessions began. They were profiled using the Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF), the latest version of its series. Individual psychotherapy sessions were conducted for all participants based on the interpretation of their profiles. A focus group discussion was later held on the movie 'Catch Me If You Can', directed by Steven Spielberg and featuring Leonardo DiCaprio and Tom Hanks. The movie is based on the real-life story of Frank Abagnale, a notorious scammer and con artist in his youthful years. Emergent themes from the movie were discussed as psycho-educative parameters for the participants.
The overall evaluation of outcomes from the VELMA ARC rehabilitation boot camp stemmed from a disaggregated assessment of observed changes, summarized in the final report of the clinical psychologist, which was detailed enough to infer genuine repentance and a positive change in attitude towards cybercrime among the participants. Follow-up services were incorporated to validate the initial observations. This gives credence to the potency of the psycho-educative intervention provided during the Velma ARC boot camp. It is recommended that support and collaboration from the government and other agencies and individuals assist the VELMA foundation in expanding the scope and quality of the Velma ARC initiative as an additional requirement for cybercrime offenders following incarceration.
Keywords: Velma-ARC, cybercrime offenders, rehabilitation, Nigeria
Procedia PDF Downloads 153
37809 Off-Line Text-Independent Arabic Writer Identification Using Optimum Codebooks
Authors: Ahmed Abdullah Ahmed
Abstract:
The task of recognizing the writer of a handwritten text has been an attractive research problem in the document analysis and recognition community, with applications in handwriting forensics, paleography, document examination and handwriting recognition. This research presents an automatic method for writer recognition from digitized images of unconstrained writings. Although a great effort has been made by previous studies to come up with various methods, their performance, especially in terms of accuracy, falls short, and room for improvement is still wide open. The proposed technique employs optimal-codebook-based writer characterization, where each writing sample is represented by a set of features computed from two codebooks, beginning and ending. Unlike most classical codebook-based approaches, which segment the writing into graphemes, this study fragments particular areas of the writing: the beginning and ending strokes. The proposed method starts with contour detection to extract significant information from the handwriting; curve fragmentation is then employed to divide the beginning and ending zones of the handwriting into small fragments. Similar fragments of beginning strokes are grouped together to create a beginning cluster and, similarly, the ending strokes are grouped to create an ending cluster. These two clusters lead to the development of two codebooks (beginning and ending), built by choosing the center of each group of similar fragments. Writings under study are then represented by computing the probability of occurrence of codebook patterns, and this probability distribution is used to characterize each writer. Two writings are compared by computing the distance between their respective probability distributions. The evaluations were carried out on the standard ICFHR dataset of 206 writers, using the beginning and ending codebooks separately.
Finally, the ending codebook achieved the highest identification rate, 98.23%, which is the best result so far on the ICFHR dataset.
Keywords: off-line text-independent writer identification, feature extraction, codebook, fragments
Procedia PDF Downloads 512
37808 Improving Fake News Detection Using K-means and Support Vector Machine Approaches
Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy
Abstract:
Fake news and false information are big challenges for all types of media, especially social media. There is a lot of false information, fake likes, views and duplicated accounts, as big social networks such as Facebook and Twitter have admitted. Much of the information appearing on social media is doubtful and in some cases misleading. It needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to detect false information with less computation time and complexity, the dimensions need to be reduced. One of the best techniques for reducing data size is the feature selection method, whose aim is to choose a feature subset from the original set that improves classification performance. In this paper, a feature selection method is proposed that integrates K-means clustering and Support Vector Machine (SVM) approaches and works in four steps. First, the similarities between all features are calculated. Then, features are divided into several clusters. Next, the final feature set is selected from all clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several specific benchmark datasets, and the outcome showed better classification of false information for our work. The detection performance was improved in two aspects: on the one hand, the detection runtime decreased, and on the other hand, the classification accuracy increased because of the elimination of redundant features and the reduction of dataset dimensions.
Keywords: clustering, fake news detection, feature selection, machine learning, social media, support vector machine
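The first three steps of the pipeline (feature similarity, clustering, representative selection) can be sketched with a plain k-means over feature-similarity vectors. This is a simplified NumPy illustration under assumed details (correlation as the similarity measure, a random toy matrix); the final SVM classification step would then be trained on the selected columns and is not shown here:

```python
import numpy as np

rng = np.random.default_rng(0)

def select_features(X, n_clusters=3, n_iter=20):
    """Cluster correlated features with a plain k-means and keep one
    representative feature per cluster (steps 1-3 of the pipeline)."""
    # Step 1: describe each feature by its correlation with all features.
    F = np.corrcoef(X.T)                      # (n_features, n_features)
    # Step 2: k-means over the feature-similarity vectors.
    centroids = F[rng.choice(len(F), n_clusters, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((F[:, None] - centroids) ** 2).sum(-1), axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centroids[k] = F[labels == k].mean(axis=0)
    # Step 3: keep the feature closest to each cluster centroid.
    selected = [int(np.argmin(((F - centroids[k]) ** 2).sum(-1)))
                for k in range(n_clusters)]
    return sorted(set(selected))

X = rng.normal(size=(200, 10))                # toy "news feature" matrix
features = select_features(X)
```

Training the SVM only on `X[:, features]` is what reduces both the runtime and the redundancy the abstract describes.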
Procedia PDF Downloads 176
37807 Incremental Learning of Independent Topic Analysis
Authors: Takahiro Nishigaki, Katsumi Nitta, Takashi Onoda
Abstract:
In this paper, we present a method for applying Independent Topic Analysis (ITA) to a growing number of documents. The number of documents has been increasing since the spread of the Internet, and ITA was presented as one method to analyze such data. ITA extracts independent topics from document data using Independent Component Analysis (ICA), a technique from signal processing. However, it is difficult to apply ITA to a growing document collection, because ITA must use all of the document data, so its temporal and spatial cost is very high. Therefore, we present Incremental ITA, which extracts independent topics from a growing number of documents by updating the topics whenever new documents are added after topics have been extracted from the previous data. We show the results of applying Incremental ITA to benchmark datasets.
Keywords: text mining, topic extraction, independent, incremental, independent component analysis
Procedia PDF Downloads 309
37806 Best Combination of Design Parameters for Buildings with Buckling-Restrained Braces
Authors: Ángel de J. López-Pérez, Sonia E. Ruiz, Vanessa A. Segovia
Abstract:
The vulnerability of buildings to seismic activity has been widely studied since the middle of the last century. As a solution to the structural and non-structural damage caused by intense ground motions, several seismic energy dissipating devices, such as buckling-restrained braces (BRB), have been proposed. BRB have been shown to be effective in concentrating a large portion of the energy transmitted to the structure by the seismic ground motion. A design approach for buildings with BRB elements, based on a seismic displacement-based formulation, has recently been proposed by the coauthors of this paper. It is a practical and easy design method that simplifies the work of structural engineers, and it is used here for the design of the structure-BRB damper system. The objective of the present study is to extend and apply a methodology to find the best combination of design parameters for multiple-degree-of-freedom (MDOF) structural frame-BRB systems, taking into account simultaneously: 1) initial costs and 2) an adequate engineering demand parameter. The design parameters considered here are the stiffness ratio (α = Kframe/Ktotal) and the strength ratio (γ = Vdamper/Vtotal), where K represents structural stiffness, V structural strength, and the subscripts "frame", "damper" and "total" represent the structure without dampers, the BRB dampers, and the total frame-damper system, respectively. The selection of the best combination of design parameters α and γ is based on an initial cost analysis and on the structural dynamic response of the structural frame-damper system. The methodology is applied to a 12-story, 5-bay steel building with BRB located on the intermediate soil of Mexico City, and the best combination of design parameters α and γ is found for this building.
Keywords: best combination of design parameters, BRB, buildings with energy dissipating devices, buckling-restrained braces, initial costs
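The two design parameters follow directly from the frame and damper properties defined above. A minimal sketch; the stiffness and strength figures are illustrative placeholders, not values from the study:

```python
def design_ratios(k_frame, k_damper, v_frame, v_damper):
    """Stiffness ratio alpha = K_frame / K_total and
    strength ratio gamma = V_damper / V_total."""
    alpha = k_frame / (k_frame + k_damper)
    gamma = v_damper / (v_frame + v_damper)
    return alpha, gamma

# Illustrative frame/BRB stiffness (kN/m) and strength (kN) values.
alpha, gamma = design_ratios(k_frame=40000.0, k_damper=60000.0,
                             v_frame=3000.0, v_damper=2000.0)
```

Sweeping candidate (α, γ) pairs through such a function and evaluating initial cost plus dynamic response for each is the shape of the selection problem the abstract poses.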
Procedia PDF Downloads 258
37805 Analysis of the Level of Production Failures by Implementing New Assembly Line
Authors: Joanna Kochanska, Dagmara Gornicka, Anna Burduk
Abstract:
The article examines the process of implementing a new assembly line in a manufacturing enterprise in the household appliances industry. At the initial stages of the project, a decision was made that one of its foundations should be the concept of lean management; because of that, eliminating as many errors as possible in the first phases of its functioning was emphasized. During the start-up of the line, all production losses were identified and documented (from serious machine failures, through any unplanned downtime, to micro-stops and quality defects). During the 6-week line start-up period, all errors resulting from problems in various areas were analyzed. These areas were, among others, production, logistics, quality, and organization. The aim of the work was to analyze the occurrence of production failures during the initial phase of starting up the line and to propose a method for determining their critical level during its full functionality. The repeatability of production losses in various areas and at different levels at this early stage of implementation was examined using the methods of statistical process control. Based on a Pareto analysis, the weakest points were identified in order to focus improvement actions on them. The next step was to examine the effectiveness of the actions undertaken to reduce the level of recorded losses. Based on the obtained results, a method for determining the critical failure level in the studied areas was proposed. The developed coefficient can be used as an alarm in case of an imbalance of production caused by an increased failure level in production and production support processes during the standardized functioning of the line.
Keywords: production failures, level of production losses, new production line implementation, assembly line, statistical process control
Procedia PDF Downloads 128
37804 Analysis of Labor Effectiveness at Green Tea Dry Sorting Workstation for Increasing Tea Factory Competitiveness
Authors: Bayu Anggara, Arita Dewi Nugrahini, Didik Purwadi
Abstract:
The dry sorting workstation needs labor to produce green tea in the Gambung Tea Factory. Observation shows that some workers are idle at certain moments yet do overtime jobs to meet production targets, and the level of labor effectiveness had never been measured before. The purpose of this study is to determine the level of labor effectiveness and provide recommendations for improvement based on the results of Pareto and Ishikawa diagrams. The method used to measure the level of labor effectiveness is Overall Labor Effectiveness (OLE), which has three indicators: availability, performance, and quality. Recommendations are made, based on the Pareto and Ishikawa diagrams, for the indicators that do not meet world-class standards. Based on the results of the study, the OLE value was 68.19%. The recommendations given to improve labor performance are adding mechanics, rescheduling rest periods, providing special training for labor, and giving rewards to labor. Furthermore, the recommendations for improving labor quality are procuring water content measuring devices, creating material standard policies, and rescheduling rest periods.
Keywords: Ishikawa diagram, labor effectiveness, OLE, Pareto diagram
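OLE is conventionally computed as the product of its three indicators. A minimal sketch; the indicator values below are illustrative, chosen only so the product lands near the reported 68.19%, and are not the factory's measurements:

```python
def ole(availability, performance, quality):
    """Overall Labor Effectiveness as the product of its three indicators
    (each expressed as a fraction between 0 and 1)."""
    return availability * performance * quality

# Illustrative indicator values; not measurements from the study.
score = ole(availability=0.92, performance=0.85, quality=0.872)
```

Because the three factors multiply, lifting whichever indicator is furthest below world-class standards gives the largest gain in the overall score, which is why the recommendations target performance and quality separately.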
Procedia PDF Downloads 229
37803 Efficient DNN Training on Heterogeneous Clusters with Pipeline Parallelism
Abstract:
Pipeline parallelism has been widely used to accelerate distributed deep learning, to alleviate GPU memory bottlenecks, and to ensure that models can be trained and deployed smoothly under limited graphics memory conditions. However, in highly heterogeneous distributed clusters, traditional model partitioning methods are not able to achieve load balancing, and overlapping communication with computation is also a big challenge. In this paper, HePipe is proposed, an efficient pipeline parallel training method for highly heterogeneous clusters. According to the characteristics of the neural network pipeline training task, and oriented to a 2-level heterogeneous cluster computing topology, a training method based on 2-level stage division of neural network modeling and partitioning is designed to improve parallelism. Additionally, a multi-forward 1F1B scheduling strategy is designed to accelerate the training time of each stage by executing computation units in advance, maximizing the overlap between forward propagation communication and backward propagation computation. Finally, a dynamic recomputation strategy based on task memory requirement prediction is proposed to improve the fit between tasks and available memory, which improves the throughput of the cluster and solves the memory shortfall caused by memory differences in heterogeneous clusters. The empirical results show that HePipe improves the training speed by 1.6x-2.2x over existing asynchronous pipeline baselines.
Keywords: pipeline parallelism, heterogeneous cluster, model training, 2-level stage partitioning
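The load-balancing idea behind stage division can be illustrated with a greedy partition that assigns contiguous layers to devices in proportion to their relative speeds. This is a simplified sketch of the general principle, not HePipe's 2-level partitioning algorithm; the layer costs and device speeds are made-up numbers:

```python
def partition_stages(layer_costs, device_speeds):
    """Assign contiguous layers to devices so each stage's cost is roughly
    proportional to its device's relative speed (greedy sketch)."""
    total = sum(layer_costs)
    speed_sum = sum(device_speeds)
    stages, i = [], 0
    for d, speed in enumerate(device_speeds):
        target = total * speed / speed_sum      # ideal work for this device
        stage, acc = [], 0.0
        # The last device always absorbs whatever layers remain.
        while i < len(layer_costs) and (acc < target or d == len(device_speeds) - 1):
            stage.append(i)
            acc += layer_costs[i]
            i += 1
        stages.append(stage)
    return stages

# Illustrative layer costs and a 2-device cluster where device 0 is 2x faster.
stages = partition_stages([4, 4, 2, 2, 1, 1], device_speeds=[2.0, 1.0])
```

The faster device receives the heavier front of the network, so in steady state both pipeline stages finish their micro-batches at roughly the same rate.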
Procedia PDF Downloads 18
37802 Comparison of DPC and FOC Vector Control Strategies on Reducing Harmonics Caused by Nonlinear Load in the DFIG Wind Turbine
Authors: Hamid Havasi, Mohamad Reza Gholami Dehbalaei, Hamed Khorami, Shahram Karimi, Hamdi Abdi
Abstract:
A doubly-fed induction generator (DFIG) equipped with a power converter is an efficient tool for converting the mechanical energy of a variable speed system to a fixed-frequency electrical grid. Since electrical energy sources face power quality problems such as harmonics caused by nonlinear loads, in this paper the compensation performance of the DPC and FOC methods for harmonic reduction in a DFIG wind turbine connected to a nonlinear load has been simulated in a MATLAB Simulink model, and the effect of each method on nonlinear load harmonic elimination has been compared. The results of the two control methods show the advantage of the FOC method over the DPC method for harmonic compensation. Also, the fifth and seventh harmonic components of the network, as well as the THD, were greatly reduced.
Keywords: DFIG machine, energy conversion, nonlinear load, THD, DPC, FOC
Procedia PDF Downloads 589
37801 A Prediction Method for Large-Size Event Occurrences in the Sandpile Model
Authors: S. Channgam, A. Sae-Tang, T. Termsaithong
Abstract:
In this research, the occurrences of large-size events in various system sizes of the Bak-Tang-Wiesenfeld sandpile model are considered. The system sizes (square lattices) of the model considered here are 25×25, 50×50, 75×75 and 100×100. The cross-correlation between the time series of the ratio of sites containing 3 grains and the time series of large-size events is also analyzed for these 4 system sizes. Moreover, a prediction method for large-size events in the 50×50 system is introduced. Lastly, it is shown that this prediction method provides a slightly higher efficiency than random predictions.
Keywords: Bak-Tang-Wiesenfeld sandpile model, cross-correlation, avalanches, prediction method
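The Bak-Tang-Wiesenfeld dynamics behind these event time series can be sketched in a few lines: grains are dropped one at a time, any site reaching 4 grains topples to its neighbors, and grains falling off the boundary are lost. This is a generic illustration of the model (toy parameters, open boundaries), not the authors' analysis code:

```python
import random

def relax(grid):
    """Topple every site holding 4 or more grains; grains crossing the
    boundary are lost. Returns the avalanche size (number of topplings)."""
    n, topplings = len(grid), 0
    unstable = [(i, j) for i in range(n) for j in range(n) if grid[i][j] >= 4]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:          # may have been toppled already
            continue
        grid[i][j] -= 4
        topplings += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n:
                grid[ni][nj] += 1
                if grid[ni][nj] >= 4:
                    unstable.append((ni, nj))
    return topplings

random.seed(0)
n = 25                               # smallest lattice studied in the paper
grid = [[0] * n for _ in range(n)]
sizes = []
for _ in range(2000):                # drop grains one at a time
    i, j = random.randrange(n), random.randrange(n)
    grid[i][j] += 1
    sizes.append(relax(grid))
```

The avalanche-size series `sizes`, together with the fraction of sites holding 3 grains before each drop, are the two quantities whose cross-correlation the abstract analyzes.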
Procedia PDF Downloads 381
37800 Numerical Modelling of Skin Tumor Diagnostics through Dynamic Thermography
Authors: Luiz Carlos Wrobel, Matjaz Hribersek, Jure Marn, Jurij Iljaz
Abstract:
Dynamic thermography has been clinically proven to be a valuable diagnostic technique for skin tumor detection as well as for other medical applications such as breast cancer diagnostics, diagnostics of vascular diseases, fever screening, and dermatological applications. Thermography for medical screening can be done in two different ways: observing the temperature response under steady-state conditions (passive or static thermography), or inducing thermal stresses by cooling or heating the observed tissue and measuring the thermal response during the recovery phase (active or dynamic thermography). Numerical modelling of heat transfer phenomena in biological tissue during dynamic thermography can aid the technique by improving process parameters or by estimating unknown tissue parameters from measured data. This paper presents a nonlinear numerical model of multilayer skin tissue containing a skin tumor, together with the thermoregulation response of the tissue during the cooling-rewarming processes of dynamic thermography. The model is based on the Pennes bioheat equation and solved numerically using a subdomain boundary element method that treats the problem as axisymmetric. The paper includes computational tests and numerical results for Clark II and Clark IV tumors, comparing models using constant and temperature-dependent thermophysical properties, which showed noticeable differences and highlighted the importance of using a local thermoregulation model.
Keywords: boundary element method, dynamic thermography, static thermography, skin tumor diagnostic
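The Pennes bioheat equation underlying the model balances conduction, blood perfusion, and metabolic heat generation; in standard notation it reads:

```latex
\rho c \frac{\partial T}{\partial t}
  = \nabla \cdot (k \nabla T)
  + \rho_b c_b \omega_b \,(T_a - T)
  + q_m
```

Here ρ, c and k are the tissue density, specific heat and thermal conductivity, ρ_b, c_b and ω_b are the blood density, specific heat and perfusion rate, T_a is the arterial blood temperature, and q_m is the metabolic heat source; the perfusion term ω_b(T_a − T) is where the temperature-dependent thermoregulation model discussed in the abstract enters.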
Procedia PDF Downloads 107
37799 Computer Simulations of Stress Corrosion Studies of Quartz Particulate Reinforced ZA-27 Metal Matrix Composites
Authors: K. Vinutha
Abstract:
The stress corrosion resistance of ZA-27/TiO₂ metal matrix composites (MMCs) in high-temperature acidic media has been evaluated using an autoclave. The liquid melt metallurgy technique with the vortex method was used to fabricate the MMCs. TiO₂ particulates of 50-80 µm in size were added to the matrix, and composites containing 2, 4 and 6 weight percent TiO₂ were prepared. Stress corrosion tests were conducted by the weight loss method for different exposure times, normalities and temperatures of the acidic medium. The corrosion rates of the composites were lower than that of the matrix ZA-27 alloy under all conditions.
Keywords: autoclave, MMCs, stress corrosion, vortex method
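In the weight loss method, the corrosion rate is conventionally computed from the mass lost per exposed area and time. A sketch using the standard conversion constant (8.76×10⁴ yields mm/year from grams, cm², hours and g/cm³); the coupon numbers are illustrative, not measurements from the study:

```python
def corrosion_rate_mm_per_year(weight_loss_g, area_cm2, hours, density_g_cm3):
    """Weight-loss corrosion rate with the standard conversion to mm/year."""
    K = 8.76e4  # units constant for g, cm^2, h, g/cm^3 -> mm/year
    return K * weight_loss_g / (area_cm2 * hours * density_g_cm3)

# Illustrative coupon data; not measurements from the study.
rate = corrosion_rate_mm_per_year(weight_loss_g=0.05, area_cm2=10.0,
                                  hours=48.0, density_g_cm3=5.0)
```

Computing this rate for the matrix alloy and for each composite under matched exposure conditions is what allows the comparison the abstract reports.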
Procedia PDF Downloads 476
37798 A Cognitive Approach to the Optimization of Power Distribution across an Educational Campus
Authors: Mrinmoy Majumder, Apu Kumar Saha
Abstract:
The ever-increasing human population and its demand for energy are placing stress upon conventional energy sources; as demand for power continues to outstrip supply, the need to optimize energy distribution and utilization is emerging as an important focus for various stakeholders. The distribution of available energy must be achieved in such a way that the needs of the consumer are satisfied. However, if the available resources are not sufficient to satisfy consumer demand, it is necessary to find a method to prioritize consumers based on factors such as their socio-economic or environmental impacts. Weighting consumer types in this way can separate them by relative importance, and cognitive optimization of the allocation process can then be carried out so that, even on days of particularly scarce supply, the socio-economic impact of not satisfying consumer needs is minimized. In this context, the present study utilized fuzzy logic to assign weightage to different types of consumers at an educational campus in India, and then established the optimal allocation by applying the non-linear mapping capability of neuro-genetic algorithms. The outputs of the algorithms were compared with similar outputs from particle swarm optimization and differential evolution algorithms. The results of the study demonstrate an option for the optimal utilization of available energy based on the socio-economic importance of consumers.
Keywords: power allocation, optimization problem, neural networks, environmental and ecological engineering
Procedia PDF Downloads 479
37797 Main Chaos-Based Image Encryption Algorithm
Authors: Ibtissem Talbi
Abstract:
During the last decade, a variety of chaos-based cryptosystems have been investigated. Most of them are based on the structure of Fridrich, which follows the traditional confusion-diffusion architecture proposed by Shannon. Compared with traditional cryptosystems (DES, 3DES, AES, etc.), chaos-based cryptosystems are more flexible, more modular and easier to implement, which makes them suitable for large-scale data encryption, such as images and videos. The heart of any chaos-based cryptosystem is the chaotic generator, so a part of the efficiency (robustness, speed) of the system depends greatly on it. In this talk, we give an overview of the state of the art of chaos-based block ciphers and describe some of the schemes we have already proposed. We also focus on the essential characteristics of the digital chaotic generator. The required performance of a chaos-based block cipher, in terms of security level and speed of calculation, depends on the considered application; there is a compromise between security and the speed of calculation. The security of these block ciphers will be analyzed.
Keywords: chaos-based cryptosystems, chaotic generator, security analysis, structure of Fridrich
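The role of the chaotic generator can be illustrated with a deliberately toy construction: a logistic-map keystream XORed with the plaintext. This is a minimal sketch of the idea only; it is not one of the schemes described above and has no cryptographic strength:

```python
def logistic_keystream(n_bytes, x0=0.37, r=3.99):
    """Generate n_bytes of keystream from the logistic map
    x_{k+1} = r * x_k * (1 - x_k). Toy illustration only -- NOT secure."""
    x, out = x0, bytearray()
    for _ in range(100):              # discard the transient iterations
        x = r * x * (1 - x)
    for _ in range(n_bytes):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_cipher(data, key_x0=0.37):
    """Encrypt/decrypt by XOR with the chaotic keystream (symmetric)."""
    ks = logistic_keystream(len(data), x0=key_x0)
    return bytes(a ^ b for a, b in zip(data, ks))

msg = b"chaos-based toy cipher"
ct = xor_cipher(msg)                  # encrypt
pt = xor_cipher(ct)                   # XOR again with same keystream decrypts
```

The initial condition and control parameter act as the key; real chaos-based block ciphers add the confusion-diffusion rounds of the Fridrich structure on top of such a generator.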
Procedia PDF Downloads 684
37796 Robust Pattern Recognition via Correntropy Generalized Orthogonal Matching Pursuit
Authors: Yulong Wang, Yuan Yan Tang, Cuiming Zou, Lina Yang
Abstract:
This paper presents a novel sparse representation method for robust pattern classification. Generalized orthogonal matching pursuit (GOMP) is a recently proposed, efficient sparse representation technique. However, GOMP adopts the mean square error (MSE) criterion and assigns the same weight to all measurements, including both severely and slightly corrupted ones. To address this limitation, we propose an information-theoretic GOMP (ITGOMP) method that exploits the correntropy induced metric. The results show that ITGOMP adaptively assigns small weights to severely contaminated measurements and large weights to clean ones. An ITGOMP-based classifier is further developed for robust pattern classification. Experiments on public real-world datasets demonstrate the efficacy of the proposed approach.
Keywords: correntropy induced metric, matching pursuit, pattern classification, sparse representation
Procedia PDF Downloads 355
37795 Large Eddy Simulation of Hydrogen Deflagration in Open Space and Vented Enclosure
Authors: T. Nozu, K. Hibi, T. Nishiie
Abstract:
This paper discusses the applicability of a numerical model for predicting damage from accidental hydrogen explosions occurring in a hydrogen facility. The numerical model was based on the unstructured finite volume method (FVM) code “NuFD/FrontFlowRed”. For simulating the unsteady turbulent combustion of leaked hydrogen gas, a combination of Large Eddy Simulation (LES) and a combustion model was used. The combustion model was based on a two-scalar flamelet approach, in which a G-equation model and a conserved scalar model expressed the propagation of the premixed flame surface and the diffusion combustion process, respectively. To validate this numerical model, we simulated two previous hydrogen explosion tests. The first was an open-space explosion test, in which the source was a prismatic 5.27 m3 volume containing a 30% hydrogen-air mixture. A reinforced concrete wall was set 4 m away from the front surface of the source, which was ignited at the bottom center by a spark. The second was a vented enclosure explosion test, in which the chamber measured 4.6 m × 4.6 m × 3.0 m with a vent opening of 5.4 m2 on one side. The test was performed with ignition at the center of the wall opposite the vent, using hydrogen-air mixtures with hydrogen concentrations close to 18% by volume. The results of the numerical simulations were compared with the experimental data to assess the accuracy of the numerical model, and we verified that the simulated overpressures and flame time-of-arrival data were in good agreement with the results of the two explosion tests.
Keywords: deflagration, large eddy simulation, turbulent combustion, vented enclosure
Procedia PDF Downloads 244
37794 Pricing European Continuous-Installment Options under Regime-Switching Models
Authors: Saghar Heidari
Abstract:
In this paper, we study the valuation of European continuous-installment options under Markov-modulated models with a partial differential equation approach. Because of the holder's option to continue or stop paying installments, the valuation problem under regime-switching models can be formulated as coupled partial differential equations (CPDE) with free boundary features. To value the installment options, we express the truncated CPDE as a linear complementarity problem (LCP); a finite element method is then proposed to solve the resulting variational inequality. Under appropriate assumptions, we establish the stability of the method and present numerical results to examine the rate of convergence and accuracy of the proposed method for the pricing problem under the regime-switching model.
Keywords: continuous-installment option, European option, regime-switching model, finite element method
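The LCP solve step is commonly handled with a projected SOR iteration; the sketch below illustrates that generic step only (the finite element assembly and the regime-coupling terms are omitted, and PSOR is a common choice rather than necessarily the authors' solver):

```python
import numpy as np

def psor(A, b, psi, omega=1.2, tol=1e-10, max_iter=5000):
    """Projected SOR for the LCP:  A x >= b,  x >= psi,
    (x - psi)^T (A x - b) = 0,  with A having positive diagonal.
    In the installment-option context, psi plays the role of the
    early-stopping (abandonment) payoff at each grid node."""
    n = len(b)
    x = np.maximum(b / np.diag(A), psi)   # rough feasible start
    for _ in range(max_iter):
        err = 0.0
        for i in range(n):
            # Gauss-Seidel value for component i, then over-relax
            # and project onto the obstacle constraint x_i >= psi_i.
            gs = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
            xi = max(psi[i], x[i] + omega * (gs - x[i]))
            err = max(err, abs(xi - x[i]))
            x[i] = xi
        if err < tol:
            break
    return x
```

For a regime-switching problem, one such LCP per regime would be solved at each time step, with the coupling terms entering the right-hand side b.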
Procedia PDF Downloads 137
37793 Reconceptualizing “Best Practices” in Public Sector
Authors: Eftychia Kessopoulou, Styliani Xanthopoulou, Ypatia Theodorakioglou, George Tsiotras, Katerina Gotzamani
Abstract:
Public sector managers frequently herald that implementing best practices as a set of standards may lead to superior organizational performance. However, recent research questions the objectification of best practices, highlighting a) the inability of public sector organizations to develop innovative administrative practices, as well as b) the adoption of stereotypical, renowned practices inculcated in the public sector by international governance bodies. The process through which organizations construe what a best practice is still remains a black box that is yet to be investigated, given the trend of continuous changes in public sector performance as well as the burgeoning interest in sharing popular administrative practices put forward by international bodies. This study aims to describe and understand how organizational best practices are constructed by public sector performance management teams, such as benchmarkers, during the benchmarking-mediated performance improvement process, and what mechanisms enable this construction. A critical realist action research methodology is employed, starting from a description of various approaches to the nature of best practice when a benchmarking-mediated performance improvement initiative, such as the Common Assessment Framework, is applied. First, we observed the benchmarkers' management of best practices in a public organization so as to map their theories-in-use. As a second step, we contextualized best administrative practices by reflecting the different perspectives that emerged from the previous stage in the design and implementation of an interview protocol. We used this protocol to conduct 30 semi-structured interviews with “best practice” process owners in order to examine their experiences and performance needs. Previous research on best practices has shown that the needs and intentions of benchmarkers cannot be detached from the causal mechanisms of the various contexts in which they work.
Such causal mechanisms can be found in a) process owner capabilities, b) the structural context of the organization, and c) state regulations. Therefore, we developed an interview protocol theoretically informed in its first part to spot causal mechanisms suggested by previous research studies, and supplemented it with questions regarding the provision of best practice support from the government. The findings of this work include a) a causal account of the nature of best administrative practices in the Greek public sector that sheds light on their management, b) a description of the various contexts affecting best practice conceptualization, and c) a description of how their interplay changed the organization's best practice management.
Keywords: benchmarking, action research, critical realism, best practices, public sector
Procedia PDF Downloads 127
37792 Water-Repellent Coating Based on Thermoplastic Polyurethane, Silica Nanoparticles and Graphene Nanoplatelets
Authors: S. Naderizadeh, A. Athanassiou, I. S. Bayer
Abstract:
This work describes a layer-by-layer spraying method to produce a non-wetting coating based on thermoplastic polyurethane (TPU) and silica nanoparticles (Si-NPs). The main purpose of this work was to transform a hydrophilic polymer into a superhydrophobic coating. The contact angle of pure TPU was measured at about 77˚ ± 2, and water droplets did not roll away upon tilting, even at 90°. After applying a layer of Si-NPs on top of the TPU, not only did the contact angle increase to 165˚ ± 2, but water droplets also rolled away at tilt angles below 5˚. The most important restriction in this study was the weak interfacial adhesion between the polymer and the nanoparticles, which degraded the durability of the coatings. To overcome this problem, we used a very thin layer of graphene nanoplatelets (GNPs) as an interlayer between the TPU and Si-NP layers, followed by thermal treatment at 150˚C. The samples' morphology and topography were characterized by scanning electron microscopy (SEM), EDX analysis and atomic force microscopy (AFM). It was observed that the Si-NPs embedded into the polymer phase in the presence of the GNP layer, probably because of the high surface area and considerable thermal conductivity of the graphene platelets. The contact angle of the sample containing graphene decreased slightly with respect to the coating without graphene, reaching 156.4˚ ± 2, due to the reduction in surface roughness. The durability of the coatings against abrasion was evaluated by the Taber® abrasion test, and it was observed that the superhydrophobicity of the coatings lasted longer in the presence of the GNP layer. Owing to the simple fabrication method and good durability, this coating can be used as a durable superhydrophobic coating for metals and can be produced at large scale.
Keywords: graphene, silica nanoparticles, superhydrophobicity, thermoplastic polyurethane
Procedia PDF Downloads 186
37791 An Examination of Earnings Management by Publicly Listed Targets Ahead of Mergers and Acquisitions
Authors: T. Elrazaz
Abstract:
This paper examines accrual and real earnings management by publicly listed targets around mergers and acquisitions. Prior literature shows that earnings management around mergers and acquisitions can have a significant economic impact because of the associated wealth transfers among stakeholders. More importantly, whether acting on behalf of their shareholders or pursuing their self-interests, managers of both targets and acquirers may be equally motivated to manipulate earnings prior to an acquisition to generate higher gains for their shareholders or themselves. Building on the grounds of information asymmetry, agency conflicts, stewardship theory, and the revelation principle, this study addresses the question of whether takeover targets employ accrual and real earnings management in the periods prior to the announcement of Mergers and Acquisitions (M&A). Additionally, this study examines whether acquirers are able to detect targets' earnings management and, in response, adjust the acquisition premium paid so as not to face the risk of overpayment. This study uses an aggregate accruals approach to estimate accrual earnings management, proxied by estimated abnormal accruals; real earnings management is proxied for by widely used models from the accounting and finance literature. The results indicate that takeover targets manipulate their earnings using accruals in the second year with an earnings release prior to the announcement of the M&A. Moreover, when the sample of targets is partitioned according to the method of payment used in the deal, the results are restricted to targets of stock-financed deals. These results are consistent with the argument that targets of cash-only or mixed-payment deals do not have the same strong motivations to manage their earnings as their stock-financed counterparts do, additionally supporting the findings of prior studies that the method of payment in takeovers is value relevant.
The findings of this study also indicate that takeover targets manipulate earnings upwards by cutting discretionary expenses in the year prior to the acquisition, while they do not do so by manipulating sales or production costs. Again, when the sample of targets is partitioned according to the method of payment used in the deal, the results are restricted to targets of stock-financed deals, providing further robustness to the results derived under the accrual-based models. Finally, this study finds evidence suggesting that acquirers are fully aware of the accrual-based techniques employed by takeover targets and can unveil such manipulation practices. These results are robust to alternative accrual and real earnings management proxies, as well as to controlling for the method of payment in the deal.
Keywords: accrual earnings management, acquisition premium, real earnings management, takeover targets
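For reference, the aggregate-accruals step can be sketched with the widely used modified Jones specification (an assumption here; the paper does not spell out its exact model). Total accruals, scaled by lagged assets, are regressed cross-sectionally on the inverse of lagged assets, the receivables-adjusted revenue change, and gross PPE; the residual is the estimated abnormal (discretionary) accrual:

```python
import numpy as np

def abnormal_accruals(TA, A_lag, dREV, dREC, PPE):
    """Modified Jones model sketch.
    TA: total accruals; A_lag: lagged total assets; dREV/dREC: changes
    in revenues and receivables; PPE: gross property, plant & equipment.
    Returns the regression residuals = estimated abnormal accruals."""
    y = TA / A_lag
    X = np.column_stack([1.0 / A_lag,
                         (dREV - dREC) / A_lag,
                         PPE / A_lag])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ coef
```

In an empirical design, the regression would be estimated per industry-year peer group, with the target firm's residual serving as its abnormal accrual.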
Procedia PDF Downloads 115
37790 Scheduling Method for Electric Heater in HEMS considering User’s Comfort
Authors: Yong-Sung Kim, Je-Seok Shin, Ho-Jun Jo, Jin-O Kim
Abstract:
Home Energy Management Systems (HEMS), which enable residential consumers to contribute to demand response, have been attracting attention in recent years. The aim of a HEMS is to minimize consumers' electricity cost by controlling the use of their appliances according to the electricity price. Appliance use in a HEMS may be affected by conditions such as the external temperature and the electricity price. Therefore, the user's usage pattern for each appliance should be modeled according to the external conditions, and the resulting usage pattern is related to the user's comfort in using each appliance. This paper proposes a methodology to model the usage pattern from historical data with a copula function. Through the copula function, the usage range of each appliance can be obtained so as to satisfy the appropriate level of user comfort under the external conditions forecast for the next day. Within this usage range, optimal scheduling of the appliances is conducted to minimize the electricity cost while considering user comfort. Among home appliances, the electric heater (EH) is representative of those affected by the external temperature. In this paper, an optimal scheduling algorithm for an electric heater is presented based on the branch and bound method. As a result, scenarios for EH usage are obtained according to the user's comfort level, from which the residential consumer selects the best scenario. The case study shows the effects of the proposed algorithm compared with traditional operation of the EH, and it also shows the impact of the comfort level on the scheduling result.
Keywords: load scheduling, usage pattern, user’s comfort, copula function, branch and bound, electric heater
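The branch and bound step can be sketched as follows (a minimal illustration: the prices, the comfort-derived usage range and the minimum-heating constraint are invented placeholders, and the copula-based modeling of that range is not shown):

```python
def schedule_eh(prices, min_on, allowed):
    """Branch-and-bound sketch for scheduling an electric heater.
    prices: electricity price per time slot; min_on: minimum number of
    heating slots required for comfort; allowed: boolean mask of slots
    lying inside the user's comfort-derived usage range."""
    n = len(prices)
    best = {"cost": float("inf"), "plan": None}

    def bound(i, on_count, cost):
        # Optimistic completion: cheapest allowed remaining slots.
        need = min_on - on_count
        rem = sorted(prices[j] for j in range(i, n) if allowed[j])
        if need > len(rem):
            return float("inf")          # infeasible branch
        return cost + sum(rem[:max(need, 0)])

    def branch(i, on_count, cost, plan):
        if bound(i, on_count, cost) >= best["cost"]:
            return                        # prune dominated subtree
        if i == n:
            if on_count >= min_on:
                best["cost"], best["plan"] = cost, plan[:]
            return
        if allowed[i]:                    # branch: heater ON in slot i
            plan.append(i)
            branch(i + 1, on_count + 1, cost + prices[i], plan)
            plan.pop()
        branch(i + 1, on_count, cost, plan)   # branch: heater OFF

    branch(0, 0, 0.0, [])
    return best["plan"], best["cost"]
```

Varying `min_on` with the comfort level yields the usage scenarios from which the consumer would pick, as described in the abstract.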
Procedia PDF Downloads 585
37789 The 5-HT1A Receptor Biased Agonists, NLX-101 and NLX-204, Elicit Rapid-Acting Antidepressant Activity in Rat Similar to Ketamine and via GABAergic Mechanisms
Authors: A. Newman-Tancredi, R. Depoortère, P. Gruca, E. Litwa, M. Lason, M. Papp
Abstract:
The N-methyl-D-aspartic acid (NMDA) receptor antagonist, ketamine, can elicit rapid-acting antidepressant (RAAD) effects in treatment-resistant patients, but it requires parenteral co-administration with a classical antidepressant under medical supervision. In addition, ketamine can also produce serious side effects that limit its long-term use, and there is much interest in identifying RAADs based on ketamine’s mechanism of action but with safer profiles. Ketamine elicits GABAergic interneuron inhibition, glutamatergic neuron stimulation, and, notably, activation of serotonin 5-HT1A receptors in the prefrontal cortex (PFC). Direct activation of the latter receptor subpopulation with selective ‘biased agonists’ may therefore be a promising strategy to identify novel RAADs and, consistent with this hypothesis, the prototypical cortical biased agonist, NLX-101, exhibited robust RAAD-like activity in the chronic mild stress model of depression (CMS). The present study compared the effects of a novel, selective 5-HT1A receptor-biased agonist, NLX-204, with those of ketamine and NLX-101. Materials and methods: CMS procedure was conducted on Wistar rats; drugs were administered either intraperitoneally (i.p.) or by bilateral intracortical microinjection. Ketamine: 10 mg/kg i.p. or 10 µg/side in PFC; NLX-204 and NLX-101: 0.08 and 0.16 mg/kg i.p. or 16 µg/side in PFC. In addition, interaction studies were carried out with systemic NLX-204 or NLX-101 (each at 0.16 mg/kg i.p.) in combination with intracortical WAY-100635 (selective 5-HT1A receptor antagonist; 2 µg/side) or muscimol (GABA-A receptor agonist, 12.5 ng/side). Anhedonia was assessed by CMS-induced decrease in sucrose solution consumption; anxiety-like behavior was assessed using the Elevated Plus Maze (EPM), and cognitive impairment was assessed by the Novel Object Recognition (NOR) test. 
Results: A single administration of NLX-204 was sufficient to reverse the CMS-induced deficit in sucrose consumption, similarly to ketamine and NLX-101. NLX-204 also reduced CMS-induced anxiety in the EPM and abolished CMS-induced NOR deficits. These effects were maintained (EPM and NOR) or enhanced (sucrose consumption) over a subsequent 2-week period of treatment. The anti-anhedonic response to the drugs was also maintained for several weeks following treatment discontinuation, suggesting that they had sustained effects on neuronal networks. A single PFC administration of NLX-204 reversed deficient sucrose consumption, similarly to ketamine and NLX-101. Moreover, the anti-anhedonic activities of systemic NLX-204 and NLX-101 were abolished by coadministration with intracortical WAY-100635 or muscimol. Conclusions: (i) The antidepressant-like activity of NLX-204 in the rat CMS model was as rapid as that of ketamine or NLX-101, supporting the targeting of cortical 5-HT1A receptors with selective biased agonists to achieve RAAD effects. (ii) The anti-anhedonic activity of systemic NLX-204 was mimicked by local administration of the compound in the PFC, confirming the involvement of cortical circuits in its RAAD-like effects. (iii) Notably, the effects of systemic NLX-204 and NLX-101 were abolished by PFC administration of muscimol, indicating that they act by (indirectly) eliciting a reduction in cortical GABAergic neurotransmission. This is consistent with ketamine's mechanism of action and suggests that converging NMDA and 5-HT1A receptor signaling cascades in the PFC underlie the RAAD-like activities of ketamine and NLX-204. Acknowledgements: The study was financially supported by NCN grant no. 2019/35/B/NZ7/00787.
Keywords: depression, ketamine, serotonin, 5-HT1A receptor, chronic mild stress
Procedia PDF Downloads 112
37788 The Study on Mechanical Properties of Graphene Using Molecular Mechanics
Authors: I-Ling Chang, Jer-An Chen
Abstract:
The elastic properties and fracture of two-dimensional graphene were calculated purely from the atomic bonding (stretching and bending) using the molecular mechanics method. Considering a representative unit cell of graphene under various loading conditions, the deformations of the carbon bonds and the variations of the interlayer distance were determined numerically under geometry constraints and the minimum-energy assumption. In the elastic region, graphene was found to be in-plane isotropic. Meanwhile, the in-plane deformation of the representative unit cell was not uniform along the armchair direction due to the discrete and non-uniform distribution of the atoms. The fracture of graphene could be predicted using a fracture criterion based on the critical bond length, beyond which a bond breaks. It was noticed that the fracture behavior was direction dependent, which is consistent with molecular dynamics simulation results.
Keywords: energy minimization, fracture, graphene, molecular mechanics
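The energy-minimization idea can be sketched for a toy three-atom fragment (the harmonic force-field constants below are illustrative placeholders, not the potential actually used in the study):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative harmonic constants (placeholders, not graphene's actual
# force field): equilibrium C-C length 1.42 and bond angle 120 degrees.
K_R, R0 = 469.0, 1.42
K_TH, TH0 = 63.0, np.deg2rad(120.0)

def energy(flat_xy):
    """Stretching + bending energy of a 3-atom chain 0-1-2 in 2-D."""
    p = flat_xy.reshape(3, 2)
    e = 0.0
    for i, j in [(0, 1), (1, 2)]:                 # bond stretching terms
        r = np.linalg.norm(p[j] - p[i])
        e += K_R * (r - R0) ** 2
    a, b = p[0] - p[1], p[2] - p[1]               # angle bending term
    cos_th = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    th = np.arccos(np.clip(cos_th, -1.0, 1.0))
    e += K_TH * (th - TH0) ** 2
    return e

# Relax a distorted geometry by unconstrained energy minimization.
x0 = np.array([[0.0, 0.0], [1.2, 0.3], [2.6, 0.0]]).ravel()
res = minimize(energy, x0, method="BFGS")
p = res.x.reshape(3, 2)
```

In the study, the same minimization would be carried out for the full representative unit cell with the applied loading entering through the geometry constraints.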
Procedia PDF Downloads 402
37787 Divergence Regularization Method for Solving Ill-Posed Cauchy Problem for the Helmholtz Equation
Authors: Benedict Barnes, Anthony Y. Aidoo
Abstract:
A Divergence Regularization Method (DRM) is used to regularize the ill-posed Helmholtz equation with an inhomogeneous boundary deflection in a Hilbert space H. The DRM incorporates a positive integer scalar that homogenizes the inhomogeneous boundary deflection in the Cauchy problem for the Helmholtz equation. This ensures the existence, as well as the uniqueness, of a solution for the equation. The DRM restores all three conditions of well-posedness in the sense of Hadamard.
Keywords: divergence regularization method, Helmholtz equation, ill-posed inhomogeneous Cauchy boundary conditions
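For context, the Cauchy problem in question can be stated in a standard model setting (the strip domain and side conditions below are a common textbook formulation, not necessarily the authors' exact one):

```latex
\begin{aligned}
\Delta u + k^2 u &= 0, && (x,y) \in (0,\pi) \times (0,1),\\
u(0,y) = u(\pi,y) &= 0, && 0 \le y \le 1,\\
u(x,0) &= f(x), && 0 \le x \le \pi,\\
u_y(x,0) &= g(x) \not\equiv 0, && 0 \le x \le \pi.
\end{aligned}
```

The inhomogeneous deflection $g$ is what makes the problem ill-posed: separation of variables yields solution components growing like $\cosh\!\big(\sqrt{n^2 - k^2}\, y\big)$, so high-frequency perturbations of the Cauchy data are amplified without bound, violating Hadamard's stability condition.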
Procedia PDF Downloads 189
37786 Development of the Food Market of the Republic of Kazakhstan in the Field of Milk Processing
Authors: Gulmira Zhakupova, Tamara Tultabayeva, Aknur Muldasheva, Assem Sagandyk
Abstract:
The development of technologies and products with increased biological value based on natural food raw materials is an important task in the food market policy of the Republic of Kazakhstan. For Kazakhstan, livestock farming, in particular sheep farming, is an ancient and well-developed industry and way of life. The history of the Kazakh people is closely connected with this type of agricultural production, with established traditions of using dairy products made from sheep's milk. Therefore, the development of new technologies for sheep's milk products remains relevant. In addition, one of the most promising areas for the development of food technology for therapeutic and prophylactic purposes is sheep milk products, as a source of protein, immunoglobulins, minerals, vitamins, and other biologically active compounds. This article presents the results of research on milk processing technology. The objective of the study is to examine the possibilities of processing sheep milk and its role in human nutrition, and to present the results of research to improve the technology of sheep milk products. The studies were carried out on the basis of sanitary and hygienic requirements for dairy products, using the following test methods. For the microbiological analysis, we used the horizontal method for identifying, counting, and serotyping Salmonella bacteria in a given mass or volume of product. Nutritional value is the complex of properties of food products that meet human physiological needs for energy and basic nutrients. The protein mass fraction was determined by the Kjeldahl method, which is based on the mineralization of a milk sample with concentrated sulfuric acid in the presence of an oxidizing agent, an inert salt (potassium sulfate) and a catalyst (copper sulfate); the amino groups of the protein are thereby converted into ammonium sulfate dissolved in sulfuric acid.
The vitamin composition was determined by HPLC, and the mineral content of the studied samples was determined by atomic absorption spectrophotometry. The study identified the technological parameters of sheep milk products and determined the prospects for further research on them. Microbiological testing was used to determine the safety of the studied product; no deviations from the norm were identified, indicating the high safety of the products under study. In terms of nutritional value, the resulting products are high in protein, and data on a favorable amino acid content were also obtained. The results will be used in the food industry and will serve as recommendations for manufacturers.
Keywords: dairy, milk processing, nutrition, colostrum
Procedia PDF Downloads 57