Search results for: double side approach method
30017 Infrastructure Change Monitoring Using Multitemporal Multispectral Satellite Images
Authors: U. Datta
Abstract:
The main objective of this study is to find a suitable approach to monitoring land-infrastructure growth over time using multispectral satellite images. Bi-temporal change detection cannot capture the continuous change occurring over a long period of time. To achieve this objective, the approach used here estimates a statistical model from a series of multispectral images acquired over a long period, assuming there is no considerable change during that period, and then compares it with multispectral image data obtained at a later time. Change is estimated pixel-wise: a statistical composite hypothesis technique is used for pixel-based change detection in a defined region, and the generalized likelihood ratio test (GLRT) detects changed pixels against the probabilistic model estimated for each pixel. The images are assumed to be co-registered prior to estimation; to minimize residual co-registration error, the 8-neighborhood pixels around the pixel under test are also considered. Multispectral images from Sentinel-2 and Landsat-8 acquired between 2015 and 2018 are used for this purpose. The method faces several challenges. The foremost is obtaining a sufficiently large dataset for multivariate distribution modelling, since a large number of images must always be discarded due to cloud cover; moreover, imperfect modelling leads to a high probability of false alarm. Overall, the probabilistic method described in this paper has given some promising results, which merit further pursuit.
Keywords: co-registration, GLRT, infrastructure growth, multispectral, multitemporal, pixel-based change detection
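The per-pixel GLRT described above can be sketched in a few lines. This is a minimal, single-band illustration under an assumed Gaussian pixel model (the paper works with multivariate multispectral data); the threshold value and the min-over-neighborhood rule for absorbing co-registration error are illustrative assumptions, not the paper's exact formulation:

```python
def glrt_change(history, new_value, threshold):
    """GLRT-style test of a new pixel value against a Gaussian model
    estimated from a cloud-free time series (single band, illustrative)."""
    n = len(history)
    mu = sum(history) / n
    var = max(sum((x - mu) ** 2 for x in history) / n, 1e-12)  # guard zero variance
    # For a Gaussian with estimated mean/variance, the GLRT statistic
    # reduces to the squared standardized deviation of the new observation.
    stat = (new_value - mu) ** 2 / var
    return stat > threshold, stat

def changed_with_neighbors(history_grid, new_grid, r, c, threshold):
    """Declare change only if the statistic exceeds the threshold for the
    pixel AND all of its 8-neighborhood, mitigating small registration shifts."""
    stats = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(new_grid) and 0 <= cc < len(new_grid[0]):
                _, s = glrt_change(history_grid[rr][cc], new_grid[rr][cc], threshold)
                stats.append(s)
    # Even the best-matching neighbor must exceed the threshold, so a
    # one-pixel co-registration shift does not trigger a false alarm.
    return min(stats) > threshold
```

Using the minimum over the neighborhood is one plausible reading of how the 8-neighborhood reduces co-registration false alarms.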
Procedia PDF Downloads 137
30016 A Two-Stage Bayesian Variable Selection Method with the Extension of Lasso for Geo-Referenced Data
Authors: Georgiana Onicescu, Yuqian Shen
Abstract:
Due to the complex nature of geo-referenced data, multicollinearity among risk factors in public health spatial studies is a commonly encountered issue, which leads to low parameter estimation accuracy because it inflates the variance in the regression analysis. To address this issue, we proposed a two-stage variable selection method that extends the least absolute shrinkage and selection operator (Lasso) to the Bayesian spatial setting, investigating the impact of risk factors on health outcomes. Specifically, in stage I, we performed variable selection using Bayesian Lasso and several other variable selection approaches. Then, in stage II, we performed model selection with only the variables selected in stage I and compared the methods again. To evaluate the performance of the two-stage variable selection methods, we conducted a simulation study with different distributions for the risk factors, using geo-referenced count data as the outcome and Michigan as the research region. We considered cases in which all candidate risk factors are independently normally distributed or follow a multivariate normal distribution with different correlation levels. Two other Bayesian variable selection methods, a binary indicator and the combination of a binary indicator and Lasso, were considered and compared as alternatives. The simulation results indicated that the proposed two-stage Bayesian Lasso variable selection method has the best performance for both the independent and dependent cases considered. When compared with the one-stage approach and the two alternative methods, the two-stage Bayesian Lasso approach provides the highest estimation accuracy in all scenarios considered.
Keywords: Lasso, Bayesian analysis, spatial analysis, variable selection
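The two-stage idea (screen variables first, then refit with only the survivors) can be sketched in plain Python. Note this is an ordinary coordinate-descent Lasso followed by an unpenalized refit, not the paper's Bayesian spatial Lasso; the data, the penalty value, and the convergence settings are illustrative assumptions:

```python
def soft_threshold(rho, lam):
    """Soft-thresholding operator used by coordinate-descent Lasso."""
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent Lasso: minimize (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = len(X), len(X[0]) if X[0] else 0
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # Correlation of feature j with the partial residual (excluding j).
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * beta[k]
                      for k in range(p) if k != j)) for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft_threshold(rho, lam) / z
    return beta

def two_stage(X, y, lam):
    """Stage I: Lasso screening. Stage II: unpenalized refit (lam=0)
    using only the selected variables."""
    beta1 = lasso_cd(X, y, lam)
    selected = [j for j, b in enumerate(beta1) if abs(b) > 1e-8]
    Xs = [[row[j] for j in selected] for row in X]
    beta2 = lasso_cd(Xs, y, 0.0)
    return selected, beta2
```

With `y = 2*x0` and an irrelevant second feature, stage I drops the noise feature and stage II recovers the coefficient without Lasso's shrinkage bias.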
Procedia PDF Downloads 147
30015 A Two-Step Framework for Unsupervised Speaker Segmentation Using BIC and Artificial Neural Network
Authors: Ahmad Alwosheel, Ahmed Alqaraawi
Abstract:
This work proposes a new speaker segmentation approach for two speakers. It is an online approach that does not require prior information about speaker models. It has two phases: a conventional unsupervised BIC-based approach is utilized in the first phase to detect speaker changes and train a neural network, while in the second phase, the trained parameters output by the neural network are used to predict the next incoming audio stream. Using this approach, accuracy comparable to similar BIC-based approaches is achieved with a significant improvement in computation time.
Keywords: artificial neural network, diarization, speaker indexing, speaker segmentation
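The BIC-based change detection used in phase one can be illustrated on a 1-D feature stream. This is a minimal single-Gaussian sketch (real systems use multivariate MFCC features and a full-covariance penalty); the penalty weight `lam` and the search margin are illustrative assumptions:

```python
import math

def delta_bic(segment, t, lam=1.0):
    """Delta-BIC for hypothesizing a speaker change at index t of a 1-D
    feature sequence (one Gaussian per segment; positive value -> change)."""
    def var(xs):
        m = sum(xs) / len(xs)
        return max(sum((x - m) ** 2 for x in xs) / len(xs), 1e-12)
    n = len(segment)
    left, right = segment[:t], segment[t:]
    # Likelihood gain of modelling the two halves separately.
    gain = 0.5 * (n * math.log(var(segment))
                  - len(left) * math.log(var(left))
                  - len(right) * math.log(var(right)))
    penalty = lam * math.log(n)  # model-complexity penalty for dimension 1
    return gain - penalty

def best_change_point(segment, lam=1.0, margin=2):
    """Return (index, delta_bic) of the strongest change point,
    or (None, delta_bic) if no split is worth the penalty."""
    cands = [(delta_bic(segment, t, lam), t)
             for t in range(margin, len(segment) - margin)]
    d, t = max(cands)
    return (t, d) if d > 0 else (None, d)
```

On a stream whose level jumps halfway through, the maximum delta-BIC lands at the jump; on a homogeneous stream, the penalty keeps every split negative.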
Procedia PDF Downloads 506
30014 Thermo-Aeraulic Studies of a Multizone Building: Influence of the Compactness Index
Authors: S. M. A. Bekkouche, T. Benouaz, M. K. Cherier, M. Hamdani, M. R. Yaiche, N. Benamrane
Abstract:
Most building energy simulation codes neglect humidity or represent it with a very simplified method. It is for this reason that we have developed a new approach to the description and modeling of multizone buildings in a Saharan climate. The thermal nodal method was used to capture the thermo-aeraulic behavior of air subjected to varied solicitations. In this contribution, analysis of the building geometry introduced the concept of a compactness index, defined as the quotient of external wall area and building volume. The physical phenomena described in this paper allow us to build a model of the coupled thermo-aeraulic behavior. The comparison shows that the results obtained are, to some extent, satisfactory. The results prove that temperature and specific humidity depend on compactness and geometric shape. Proper use of the compactness index and building geometry parameters will noticeably reduce building energy consumption.
Keywords: multizone model, nodal method, compactness index, specific humidity, temperature
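The compactness index defined above is a one-line computation. The sketch below compares two buildings of equal volume; counting the whole envelope (including the roof) in the area is an assumption for illustration, since the paper's definition speaks of external walls:

```python
def box_envelope_area(length, width, height):
    """Total envelope (external surface) area of a rectangular building, m^2."""
    return 2 * (length * width + length * height + width * height)

def compactness_index(external_area, volume):
    """Compactness index per the paper's definition: quotient of external
    wall area and building volume (units: 1/m)."""
    return external_area / volume

# Two buildings of equal volume (1000 m^3): a compact cube and an
# elongated slab. The slab exposes more envelope per unit volume,
# hence a higher compactness index and larger thermal exchange surface.
cube = compactness_index(box_envelope_area(10, 10, 10), 1000)   # 600/1000
slab = compactness_index(box_envelope_area(40, 5, 5), 1000)     # 850/1000
```

A lower index (the cube) means less exchange surface per unit of conditioned volume, which is the lever the paper uses to reduce energy demand.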
Procedia PDF Downloads 411
30013 Finding Related Scientific Documents Using Formal Concept Analysis
Authors: Nadeem Akhtar, Hira Javed
Abstract:
An important aspect of research is the literature survey. The availability of a large amount of literature across different domains triggers the need for optimized systems that provide relevant literature to researchers. We propose a keyword-based search system for text documents. This experimental approach provides a hierarchical structure for the document corpus. The documents are labelled with keywords using KEA (Keyword Extraction Algorithm) and are automatically organized into a lattice structure using Formal Concept Analysis (FCA), which groups semantically related documents together. The keyword-based hierarchical structure returns only those documents that precisely contain the keywords. This approach opens doors for multi-domain research: documents across multiple domains that are indexed by similar keywords are grouped together, and a hierarchical relationship between keywords is obtained. To demonstrate the effectiveness of the approach, we carried out an experiment and evaluation on the SemEval-2010 dataset. The results show that the presented method is considerably successful in indexing scientific papers.
Keywords: formal concept analysis, keyword extraction algorithm, scientific documents, lattice
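The grouping step can be sketched with a small piece of FCA machinery. The function below derives only the attribute concepts (one per keyword), a subset of the full concept lattice, which is enough to show how documents sharing keywords collapse into (extent, intent) pairs; the toy documents and keywords are made up for illustration:

```python
def attribute_concepts(doc_kw):
    """For each keyword, build the formal concept it generates:
    extent = documents carrying the keyword,
    intent = keywords common to ALL documents in that extent."""
    concepts = {}
    all_keywords = {k for kws in doc_kw.values() for k in kws}
    for kw in all_keywords:
        extent = frozenset(d for d, kws in doc_kw.items() if kw in kws)
        intent = frozenset.intersection(*(frozenset(doc_kw[d]) for d in extent))
        concepts[extent] = intent
    return concepts
```

In a full FCA implementation these concepts are ordered by extent inclusion to form the lattice; here the dictionary alone already shows which document groups are semantically related.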
Procedia PDF Downloads 334
30012 A Robust Optimization Method for Service Quality Improvement in Health Care Systems under Budget Uncertainty
Authors: H. Ashrafi, S. Ebrahimi, H. Kamalzadeh
Abstract:
With the growth of business competition, it is important for healthcare providers to improve their service quality. To improve the service quality of a clinic, four important dimensions are defined: tangibles, responsiveness, empathy, and reliability. Moreover, there are several service stages in hospitals, such as financial screening and examination. One of the most challenging limitations on improving service quality is the budget, which strongly affects service quality. In this paper, we present an approach to address budget uncertainty and provide guidelines for service resource allocation: a service quality improvement approach that can be applied to multistage service processes to improve service quality while controlling costs. A multi-objective function based on the importance of each area and dimension is defined to link operational variables to the service quality dimensions. The results demonstrate that our approach is not ultra-conservative and reflects actual conditions very well. Moreover, it is shown that different strategies can affect the number of employees at different stages.
Keywords: allocation, budget uncertainty, healthcare resource, service quality assessment, robust optimization
Procedia PDF Downloads 186
30011 Deficiency Risk in Islamic and Conventional Banks
Authors: Korbi Fakhri
Abstract:
The management of assets and liabilities is a vital task for every bank: sound management ensures its stability, whereas poor management forewarns its disappearance. The equity of a bank is among the most important items on the liability side because these funds ensure three essential functions for the survival of the bank. On the one hand, equity is used to fund investments and cover unexpected losses; on the other hand, it attracts fund providers because it inspires trust. We therefore tackle several points, including whether the equity of Islamic banks is oversized. In spite of the efforts made on the subject, the relationship between capital and deficiency probability has not been established with certainty. In this article, we study the nature of financial intermediation in Islamic banks in comparison with conventional ones, and we find a striking difference between the two kinds of intermediation. We also study the relationship between capital level and deficiency risk using an econometric model, and we obtain a positive and significant relation between capital and deficiency risk for conventional banks: when the capital of these banks increases, the deficiency risk increases as well. By contrast, Islamic banks are constrained to respect the Sharia committee as well as the demands of customers, who may, in certain contracts, choose to invest their capital in projects they are interested in. These constraints have the effect of reducing the deficiency risk even when capital increases.
Keywords: Islamic bank, conventional bank, deficiency risk, financial intermediation
Procedia PDF Downloads 393
30010 Untapped Market of Islamic Pension Fund: Muslim Attitude and Expectation
Authors: Yunice Karina Tumewang
Abstract:
As we have seen, the number of Muslims and their awareness of financial products and services that conform to Islamic principles are growing rapidly today. This makes the market environment potentially beneficial for Sharia-compliant funds, with an expanding prospective client base. However, over the last decade, only a small portion of this huge potential market has been covered by the established Islamic asset management firms. This study aims to examine the factors behind this untapped market, particularly on the demand side. It uses a qualitative method with primary data from a questionnaire distributed to a sample of 500 Muslims. It sheds light on Muslim attitudes and expectations toward Sharia-compliant retirement planning and pensions, and it will also help raise the awareness of market players that the Islamic pension fund is a promising industry for the foreseeable future.
Keywords: Islamic marketing, Islamic finance, Islamic asset management, Islamic pension fund
Procedia PDF Downloads 338
30009 New Methods to Acquire Grammatical Skills in a Foreign Language
Authors: Indu Ray
Abstract:
In today’s digital world, the internet is already flooded with information on how to master grammar in a foreign language. It is well known that one cannot master a language without grammar; grammar is the backbone of any language. Without grammar there would be no structure to help you speak and write, or listen and read. Successful communication is only possible if the form and function of linguistic utterances are firmly related to one another. Grammar has its own rules of use that make a language easier to understand. Like a tool, grammar formulates our thoughts and knowledge in a meaningful way. Every language has its own grammar, and with grammar we can quickly analyze the tense of an action in a text (present, past, future). Knowledge of grammar is an important prerequisite for mastering a foreign language. What matters most is how teachers can make grammar lessons more interesting for students and thus promote grammar skills more successfully. In this paper, we discuss a few important methods: interactive grammar exercises between students, interactive grammar exercises between student and teacher, the grammar-translation method, the audio-visual method, the deductive method, and the inductive method. The paper is divided into two sections. In the first part, brief definitions and principles of these approaches are provided, and the possibility of combining them is analyzed. In the last section, we present the results of a survey conducted at our university on methods to quickly learn grammar in a foreign language. We divided grammatical skills into six parts: 1. grammatical competence, 2. speaking skills, 3. phonology, 4. syntax and semantics, 5. rules, 6. cognitive function, and conducted a survey among students.
From our survey results, we observe that phonology, speaking ability, and syntax and semantics can be improved by the inductive method, the audio-visual method, and the grammar-translation method, while for grammar rules and cognitive functions the IGE (teacher-student) and IGE (pupil-pupil) methods should be chosen. The study’s findings revealed that teachers' delivery methods should be blended, or fused, based on the content of the grammar.
Keywords: innovative method, grammatical skills, audio-visual, translation
Procedia PDF Downloads 78
30008 Design Flood Estimation in Satluj Basin: Challenges for Sunni Dam Hydro Electric Project, Himachal Pradesh, India
Authors: Navneet Kalia, Lalit Mohan Verma, Vinay Guleria
Abstract:
Introduction: Design flood studies are essential for the effective planning and functioning of water resource projects. Design flood estimation for the Sunni Dam Hydro Electric Project, located in the state of Himachal Pradesh, India, on the river Satluj, was a big challenge, given that the river flows through the Himalayan region from Tibet to India and has a large catchment area of varying topography, climate, and vegetation. No discharge data were available for the part of the river in Tibet, whereas for India they were available only at Khab, Rampur, and Luhri. Estimation of the design flood using standard methods was therefore not possible. This challenge was met using two different approaches for the upper (snow-fed) and lower (rain-fed) catchments: a flood frequency approach and a hydro-meteorological approach. i) For the catchment up to the Khab gauging site (sub-catchment C1), the flood frequency approach was used. Around 90% of the catchment area up to Khab (46,300 sq km) is snow-fed, lying above 4,200 m. Since the catchment is predominantly snow-fed, the 1-in-10,000-year return period flood estimated at Khab using flood frequency analysis was considered the probable maximum flood (PMF). The flood peaks were taken from daily observed discharges at Khab, increased by 10% to make them instantaneous. The design flood of 4,184 cumec thus obtained was considered the PMF at Khab. ii) For the catchment between Khab and the Sunni Dam (sub-catchment C2), the hydro-meteorological approach was used. This method is based on the catchment's response to the rainfall pattern (probable maximum precipitation, PMP) observed in a particular catchment area. The design flood computation mainly involves the estimation of a design storm hyetograph and the derivation of the catchment response function; a unit hydrograph is assumed to represent the response of the entire catchment area to a unit rainfall.
The main advantage of the hydro-meteorological approach is that it gives a complete flood hydrograph, which allows a realistic determination of its moderation effect while passing through a reservoir or a river reach. These studies were carried out to derive the PMF for the catchment area between Khab and the Sunni Dam site using 1-day and 2-day PMP values of 232 and 416 cm, respectively. The PMF so obtained was 12,920.60 cumec. Final result: As the catchment area up to the Sunni Dam was divided into two sub-catchments, the flood hydrograph for catchment C1 was routed through the connecting channel reach (the river Satluj) using the Muskingum method, and the design flood was then computed by adding the routed flood ordinates to the flood ordinates of catchment C2. A total design flood (the 2-day PMF) with a peak of 15,473 cumec was obtained. Conclusion: Even though several factors are relevant when deciding the method to be used for design flood estimation, data availability and the purpose of the study are the most important. Since, generally, one cannot wait for hydrological data of adequate quality and quantity to become available, flood estimation has to be done using whatever data exist; the method to be used is selected depending on the type of data available for a particular catchment.
Keywords: design flood, design storm, flood frequency, PMF, PMP, unit hydrograph
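The Muskingum routing step used to carry the C1 hydrograph down the Satluj reach is a standard recurrence and can be sketched directly. The storage constant K, weighting factor x, and the inflow hydrograph below are illustrative values, not the project's calibrated parameters:

```python
def muskingum_route(inflow, K, x, dt, initial_outflow=None):
    """Route a flood hydrograph through a river reach with the Muskingum
    method. K: storage time constant, x: weighting factor (0..0.5),
    dt: time step (same units as K). O2 = C0*I2 + C1*I1 + C2*O1."""
    denom = K * (1 - x) + dt / 2
    c0 = (dt / 2 - K * x) / denom
    c1 = (dt / 2 + K * x) / denom
    c2 = (K * (1 - x) - dt / 2) / denom   # note: c0 + c1 + c2 = 1
    out = [inflow[0] if initial_outflow is None else initial_outflow]
    for i in range(1, len(inflow)):
        out.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * out[-1])
    return out
```

Because the coefficients sum to one, a steady inflow passes through unchanged, while a flood peak is attenuated and delayed; the routed ordinates would then be added to the C2 hydrograph as described above.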
Procedia PDF Downloads 330
30007 Lines for a Different Approach in Music Education: A Review of the Concept of Musicality
Authors: Emmanuel Carlos De Mata Castrejón
Abstract:
Music education has been shown to be connected to many areas of the sciences and arts, and it has also been associated with several facets of human life. The many aspects surrounding the study of music and education make it very difficult for the music educator to find a way through: even though there are many methods of teaching music to young children, the methods differ from one another, and so do the students. For music to help improve children's development, children need to explore their musicality as they explore their creativity; it must be a challenging, playful, and enjoyable activity. The purpose of this investigation is to focus music education not on the music, nor on the teaching, but on the children, who should be guided through their own musicality. The first approach to this kind of music education comes from the active learning methods of the nineteenth century, most of which are still used around the world, sometimes with modifications to fit a certain place or type of student. This approach to children's musicality requires some knowledge of music, pedagogy, and developmental psychology at least; but more important than the theory or the method used for music education, the focus should be on developing the student's musicality, considering the complexity of this concept. Achieving this requires far more research on the topic, so this is a call for collaborative research and for interdisciplinary teams to emerge. This is a review of authors and methods in music education that tries to trace a line pointing toward transdisciplinary work in pursuit of the development of children's musicality.
Keywords: children, methods, music education, musicality
Procedia PDF Downloads 335
30006 An Unbiased Profiling of Immune Repertoire via Sequencing and Analyzing T-Cell Receptor Genes
Authors: Yi-Lin Chen, Sheng-Jou Hung, Tsunglin Liu
Abstract:
The adaptive immune system recognizes a wide range of antigens by expressing a large number of structurally distinct T cell and B cell receptor genes. The distinct receptor genes arise from complex rearrangements called V(D)J recombination and constitute the immune repertoire. A common method of profiling the immune repertoire is to amplify recombined receptor genes using multiple primers and high-throughput sequencing. This multiplex-PCR approach is efficient; however, the resulting repertoire can be distorted by primer bias. To eliminate primer bias, 5’ RACE is an alternative amplification approach. However, the application of the RACE approach is limited by its low efficiency (i.e., the majority of the data are non-regular receptor sequences, e.g., containing intronic segments) and the lack of a convenient analysis tool. We propose a computational tool that can correctly identify non-regular receptor sequences in RACE data by aligning receptor sequences against the whole gene instead of only the exon regions, as done in all other tools. Using our tool, the remaining regular data allow for an accurate profiling of the immune repertoire. In addition, the RACE approach is improved to yield a higher fraction of regular T-cell receptor sequences. Finally, we quantify the degree of primer bias of a multiplex-PCR approach by comparing it to the RACE approach. The results reveal significant differences in the frequencies of VJ combinations between the two approaches. Together, we provide a new experimental and computational pipeline for an unbiased profiling of the immune repertoire. As immune repertoire profiling has many applications, e.g., tracing bacterial and viral infection, detection of T cell lymphoma and minimal residual disease, and monitoring cancer immunotherapy, our work should benefit scientists interested in these applications.
Keywords: immune repertoire, T-cell receptor, 5' RACE, high-throughput sequencing, sequence alignment
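The final comparison step, quantifying how VJ usage differs between the two amplification methods, can be sketched simply. The gene names and the "maximum frequency shift" summary below are illustrative assumptions; the paper's statistical comparison of VJ combinations is more involved:

```python
from collections import Counter

def vj_frequencies(clones):
    """Normalized usage frequency of each (V, J) combination in a repertoire,
    given a list of (V-gene, J-gene) calls, one per sequenced read."""
    counts = Counter(clones)
    total = sum(counts.values())
    return {vj: c / total for vj, c in counts.items()}

def max_frequency_shift(rep_a, rep_b):
    """Largest absolute difference in (V, J) usage between two profiling
    methods: a crude indicator of primer bias."""
    fa, fb = vj_frequencies(rep_a), vj_frequencies(rep_b)
    return max(abs(fa.get(vj, 0.0) - fb.get(vj, 0.0))
               for vj in set(fa) | set(fb))
```

If multiplex-PCR over-amplifies one V gene relative to RACE, the shift is large; identical repertoires give a shift of zero.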
Procedia PDF Downloads 196
30005 Applying the Crystal Model Approach on Light Nuclei for Calculating Radii and Density Distribution
Authors: A. Amar
Abstract:
A new model, namely the crystal model, has been developed to calculate the radius and density distribution of light nuclei up to ⁸Be. The crystal model is adapted from solid-state physics, using the analogy between the distribution of nucleons and the distribution of atoms in a crystal. The model provides an analytical expression for the radius, while the density distribution of light nuclei is obtained from the analogy to the crystal lattice. The distribution of nucleons over the crystal is discussed in a general form. The equation used to calculate the binding energy was taken from the solid-state model of repulsive and attractive forces: the number of protons controls the repulsive force, while the atomic number is responsible for the attractive force. The parameter calculated from the crystal model was found to be proportional to the radius of the nucleus. The density distribution of a light nucleus was taken as a summation of two cluster distributions, as in the ⁶Li = alpha + deuteron configuration. The radius and density distribution obtained were tested using double folding for d+⁶,⁷Li with the M3Y nucleon-nucleon interaction, and good agreement was obtained for both. The model failed to calculate the radius of ⁹Be, so modifications are needed to overcome this discrepancy.
Keywords: nuclear physics, nuclear lattice, the nucleus as a crystal, light nuclei up to ⁸Be
Procedia PDF Downloads 178
30004 Reliability Levels of Reinforced Concrete Bridges Obtained by Mixing Approaches
Authors: Adrián D. García-Soto, Alejandro Hernández-Martínez, Jesús G. Valdés-Vázquez, Reyna A. Vizguerra-Alvarez
Abstract:
Reinforced concrete bridges designed by code are intended to achieve target reliability levels adequate for the geographical environment where the code is applicable. Several methods can be used to estimate such reliability levels. Many of them require the establishment of an explicit limit state function (LSF). When such an LSF is not available as a closed-form expression, simulation techniques are often employed, but simulation methods are computationally intensive and time-consuming. Note that if the reliability of real bridges designed by code is of interest, numerical schemes, the finite element method (FEM), or computational mechanics may be required. In these cases, it can be quite difficult (or impossible) to establish a closed form of the LSF, and simulation techniques may be necessary to compute reliability levels. To overcome the need for a large number of simulations when no explicit LSF is available, the point estimate method (PEM) can be considered as an alternative. It has the advantage that only the probabilistic moments of the random variables are required. However, in the PEM, the resulting moments of the LSF must be fitted to a probability density function (PDF). In the present study, a very simple alternative is employed that allows the assessment of reliability levels when no explicit LSF is available and without the need for extensive simulations. The alternative includes the use of the PEM, and its applicability is shown by assessing reliability levels of reinforced concrete bridges in Mexico when a numerical scheme is required. Comparisons with results obtained using the Monte Carlo simulation (MCS) technique are included. To overcome the problem of fitting the probabilistic moments from the PEM to a PDF, a well-known distribution is employed. The approach mixes the PEM with another classic reliability method (the first-order reliability method, FORM).
The results in the present study are in good agreement with those computed with the MCS. Therefore, mixing the reliability methods is a very valuable option for determining reliability levels when no closed form of the LSF is available, or when numerical schemes, the FEM, or computational mechanics are employed.
Keywords: structural reliability, reinforced concrete bridges, combined approach, point estimate method, Monte Carlo simulation
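The PEM-plus-FORM combination can be illustrated with Rosenblueth's two-point estimate: the LSF (here a black-box function, standing in for an FEM response) is evaluated at the 2^n corner points mu_i +/- sigma_i, and the resulting moments feed a first-order reliability index. The independence and symmetric-distribution assumptions, and the choice of beta = mu_g/sigma_g, are simplifications for illustration:

```python
import math
from itertools import product

def pem_two_point(g, means, stds):
    """Rosenblueth's two-point estimate of the mean and standard deviation
    of g(X) for independent variables with symmetric distributions:
    evaluate g at all 2^n combinations of mu_i +/- sigma_i, equal weights."""
    vals = [g([m + s * e for m, s, e in zip(means, stds, eps)])
            for eps in product((-1.0, 1.0), repeat=len(means))]
    mu_g = sum(vals) / len(vals)
    var_g = sum((v - mu_g) ** 2 for v in vals) / len(vals)
    return mu_g, math.sqrt(var_g)

def reliability_index(g, means, stds):
    """First-order reliability index beta = mu_g / sigma_g computed from
    PEM moments (a normal assumption stands in for the fitted PDF)."""
    mu_g, sd_g = pem_two_point(g, means, stds)
    return mu_g / sd_g
```

For a linear LSF such as g = R - S the two-point estimate reproduces the exact moments with only four evaluations, which is the appeal over thousands of Monte Carlo samples when each evaluation is an FEM run.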
Procedia PDF Downloads 347
30003 Pyramidal Lucas-Kanade Optical Flow Based Moving Object Detection in Dynamic Scenes
Authors: Hyojin Lim, Cuong Nguyen Khac, Yeongyu Choi, Ho-Youl Jung
Abstract:
In this paper, we propose a simple moving object detection method based on motion vectors obtained from pyramidal Lucas-Kanade optical flow. The proposed method detects moving objects such as pedestrians, other vehicles, and obstacles in front of the host vehicle, and it can provide a warning to the driver. Motion vectors are obtained using pyramidal Lucas-Kanade optical flow, and outliers are eliminated by comparing the amplitude of each vector with a pre-defined threshold value. The background model is obtained by calculating the mean and variance of the amplitudes of recent motion vectors in rectangular local regions called cells. The model is applied as the reference to classify motion vectors into those of moving objects and those of the background. Motion vectors are clustered into rectangular regions using the unsupervised K-means clustering algorithm, and a labeling method is applied to merge groups that are close to each other, using the distance between the center points of the rectangles. Through simulations of four kinds of scenarios, such as a motorbike, a vehicle, or pedestrians approaching the host vehicle, we show that the proposed method is simple but efficient for moving object detection in parking lots.
Keywords: moving object detection, dynamic scene, optical flow, pyramidal optical flow
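The cell-based classification step can be sketched independently of the optical-flow computation itself (which would come from a library such as OpenCV's pyramidal Lucas-Kanade). Several details are illustrative assumptions: the paper builds its cell model from recent frames over time, whereas this sketch uses the current frame's vectors, and the direction of the outlier threshold and the k-sigma rule are chosen for illustration:

```python
import math
from collections import defaultdict

def classify_motion_vectors(vectors, cell_size, mag_thresh, k=2.0):
    """Classify motion vectors (x, y, dx, dy) as foreground candidates.
    A vector is foreground if its magnitude exceeds its cell's background
    model (mean/std of magnitudes in the cell) by more than k sigmas."""
    cells = defaultdict(list)
    for x, y, dx, dy in vectors:
        mag = math.hypot(dx, dy)
        if mag >= mag_thresh:  # drop tiny vectors as noise (an assumption)
            cells[(x // cell_size, y // cell_size)].append((x, y, mag))
    foreground = []
    for entries in cells.values():
        mags = [m for _, _, m in entries]
        mean = sum(mags) / len(mags)
        std = math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))
        for x, y, m in entries:
            if m - mean > k * std:
                foreground.append((x, y))
    return foreground
```

The surviving foreground positions would then be clustered with K-means and merged by the center-distance labeling described above.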
Procedia PDF Downloads 351
30002 An In-Depth Analysis of the Implementation of 'I SMILE Happy Classroom' to Achieve Ideological and Political Integration
Authors: Jinhuang Zhang
Abstract:
This study focuses on traditional English courses in the context of globalization. The basic methodology involves the application of the 'I SMILE Happy Classroom' teaching approach. The major findings reveal that, compared to traditional courses, which suffer from a lack of ideological and political integration, this method successfully incorporates ideological and political elements into the teaching content. It transforms the classroom into a student-centered, interactive, engaging, and responsive environment with a high degree of ideological and political integration. In conclusion, the 'I SMILE Happy Classroom' teaching method shows great potential for addressing the pain points of traditional English courses and enhancing the quality and effectiveness of English teaching in terms of ideological and political integration.
Keywords: English course, ideological and political elements, 'I SMILE Happy Classroom' teaching method, teaching pain points
Procedia PDF Downloads 5
30001 Value Engineering and Its Impact on Drainage Design Optimization for Penang International Airport Expansion
Authors: R.M. Asyraf, A. Norazah, S.M. Khairuddin, B. Noraziah
Abstract:
Designing a system today is a vital, challenging task: the design philosophy must be maintained in an economical way. This paper examines the value engineering (VE) approach applied to infrastructure works, namely stormwater drainage. The method was adopted after the consultants had completed the detailed design. A Function Analysis System Technique (FAST) diagram and the VE job plan (information, function analysis, creative judgement, development, and recommendation phases) were used to scrutinize the initial design of the stormwater drainage. An estimated cost reduction of 2% over the initial proposal was obtained using the VE approach. This cost reduction comes from design optimization of the drainage foundation and structural system, in which the pile design and drainage base structure were optimized. Likewise, the design of the on-site detention (OSD) tank pump was revised, contributing to the cost reduction obtained. This case study shows that the VE approach can be an important tool for optimizing a design to reduce costs.
Keywords: value engineering, function analysis system technique, stormwater drainage, cost reduction
Procedia PDF Downloads 148
30000 Polymeric Microspheres for Bone Tissue Engineering
Authors: Yamina Boukari, Nashiru Billa, Andrew Morris, Stephen Doughty, Kevin Shakesheff
Abstract:
Poly(lactic-co-glycolic acid) (PLGA) is a synthetic polymer that can be used in bone tissue engineering to create a scaffold that supports the growth of cells. The formation of microspheres from this polymer is an attractive strategy that would allow the development of an injectable system, hence avoiding invasive surgical procedures. The aim of this study was to develop a microsphere delivery system for use as an injectable scaffold in bone tissue engineering and to evaluate the effect of various formulation parameters on its properties. Porous and lysozyme-containing PLGA microspheres were prepared from various molecular weights (MW) using the double emulsion solvent evaporation method. Scaffolds were formed by sintering to contain 1-3 mg of lysozyme per gram of scaffold. The mechanical and physical properties of the scaffolds were assessed, along with the release of lysozyme, which was used as a model protein. The MW of the PLGA was found to influence microsphere size during fabrication, with increased MW leading to an increased microsphere diameter. The mechanical strength of the formed scaffolds was inversely proportional to PLGA MW across loadings for low, intermediate, and high MW, respectively. Lysozyme release from both microspheres and formed scaffolds showed an initial burst release phase, with the microspheres and scaffolds fabricated from high-MW PLGA showing the lowest protein release. Following the initial burst phase, the profiles for each MW followed a similar slow release over 30 days. Overall, the results of this study demonstrate that lysozyme can be successfully incorporated into porous PLGA scaffolds and released over 30 days in vitro, and that varying the MW of the PLGA can be used as a method of altering the physical properties of the resulting scaffolds.
Keywords: bone, microspheres, PLGA, tissue engineering
Procedia PDF Downloads 425
29999 Subarray Based Multiuser Massive MIMO Design Adopting Large Transmit and Receive Arrays
Authors: Tetsiki Taniguchi, Yoshio Karasawa
Abstract:
This paper describes a subarray-based, computationally efficient design method for multiuser massive multiple-input multiple-output (MIMO) systems. In our previous works, the use of a large array was assumed only at the transmitter, but this study considers the case in which both the transmitter and receiver sides are equipped with large array antennas. To this end, the receive arrays are also divided into several subarrays, and the previously proposed method is modified to synthesize a large array from subarrays at both ends. Computer simulations verify that, although the performance of the proposed method is somewhat degraded compared with the original approach, it achieves a significant reduction of the computational load to a practical level. Keywords: large array, massive multiple input multiple output (MIMO), multiuser, singular value decomposition, subarray, zero forcing
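The zero-forcing baseline named in the keywords can be illustrated with a minimal sketch. This is a generic full-array zero-forcing precoder for single-antenna users (the array sizes and the random Rayleigh channel are assumptions for illustration), not the authors' subarray synthesis:

```python
import numpy as np

rng = np.random.default_rng(0)
n_tx, n_users = 8, 4  # transmit antennas, single-antenna users (illustrative sizes)

# Random flat-fading channel: rows are users, columns are transmit antennas.
H = rng.standard_normal((n_users, n_tx)) + 1j * rng.standard_normal((n_users, n_tx))

# Zero-forcing precoder: the Moore-Penrose pseudo-inverse of H, chosen so that
# the effective channel H @ W is the identity (no inter-user interference).
W = np.linalg.pinv(H)

effective = H @ W  # should be (numerically) the identity matrix
```

The subarray idea in the paper reduces cost by applying such computations to smaller per-subarray channel blocks instead of the full H.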
Procedia PDF Downloads 403
29998 Systematic Taxonomy and Phylogenetics of Commercial Fish Species of Family Nemipteridae from Malaysian Waters and Neighboring Seas
Authors: Ayesha Imtiaz, Darlina Md. Naim
Abstract:
The family Nemipteridae is among the most abundant families in Malaysian fish markets owing to its high contribution to Malaysian landing sites. We used an advanced molecular approach based on two mitochondrial genes (cytochrome c oxidase I and cytochrome b) and one nuclear gene (recombination activating gene, RAG1) to expose cryptic diversity and phylogenetic relationships among commercially important species of the family Nemipteridae. Our research covered all genera of the family distributed in Malaysia (31 of the total 45 species). We also found geographical barriers in the South China Sea that reduce dispersal and prevent some species from intermixing: the northern side of the South China Sea (near Vietnam) does not exchange genetic diversity with the southern side (Sarawak), and the Strait of Malacca reduces the intermixing of genetic diversity between the South China Sea and the Indian Ocean. Keywords: Nemipteridae, RAG I, Southeast Asia, Malaysia
Procedia PDF Downloads 145
29997 Planning for Brownfield Regeneration in Malaysia: An Integrated Approach in Creating Sustainable Ex-Landfill Redevelopment
Authors: Mazifah Simis, Azahan Awang, Kadir Arifin
Abstract:
Brownfield regeneration is being implemented in developed countries. As a Group 1 developing country in Southeast Asia, Malaysia faces rapid development and a growing urban population that have urged the need to incorporate brownfield regeneration into its physical planning. The increasing number of urban ex-landfills is seen as a new resource that could overcome inadequate urban green space provision. With regard to this new development approach in urban planning, this perception study aims to identify a sustainable planning approach based on what the stakeholders have in mind. Respondents consist of 375 members of local communities within four urban ex-landfill areas and 61 landscape architect and town planner officers in Malaysian local authorities. Three main objectives are set: (i) to identify ex-landfill issues that need to be overcome prior to redevelopment, (ii) to identify the most suitable types of ex-landfill redevelopment, and (iii) to identify the priority functions for ex-landfill redevelopment as public parks. From the data gathered through the survey method, an order of priorities based on stakeholders' perception was produced. The results show different perceptions among the stakeholders, but they agreed on the development of a public park as the main use. Hence, this study attempts to produce an integrated approach as a model for sustainable ex-landfill redevelopment that stakeholders could accept as a beneficial future development, one that could change the image of 296 ex-landfills in Malaysia into urban public parks by the year 2020. Keywords: brownfield regeneration, ex-landfill redevelopment, integrated approach, stakeholders' perception
Procedia PDF Downloads 355
29996 Evaluation of the Golden Proportion and Golden Standard of Maxillary Anterior Teeth in Relation to Smile Attractiveness
Authors: Marwan Ahmed Swileh, Amal Hussein Abuaffan
Abstract:
Objective: This study aimed to explore the existence of the golden proportion (GP) between the widths of the maxillary anterior teeth, and of the golden standard (GS) for the width-to-height ratio of the maxillary central incisor, in individuals with attractive and non-attractive smiles. Materials and methods: A total of 82 females were recruited and divided into two groups: attractive smile (n = 41) and non-attractive smile (n = 41). Frontal photographs were taken, scanned, and saved on a personal computer. The apparent mesiodistal width of each anterior tooth was measured. The data were analyzed using the appropriate statistical tests at p < 0.05. Results: The frequency of the GP was very low in the total sample, and most proportions were higher than the GP. No significant differences were found between the groups in the central-to-lateral ratio, while significant differences were found in the canine-to-lateral ratio. Similarly, most width-to-height ratios were higher than the GS. The difference between groups was significant for the left side and for both sides combined (p < 0.05) but not for the right side (p > 0.05). Conclusion: The frequency of the golden proportion was very low in the study population. Smile attractiveness is not strongly related to the proportions between the teeth. Keywords: golden proportion, golden standard, attractive smile, esthetic, anterior teeth
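The golden-proportion check described above reduces to comparing adjacent apparent tooth widths against the golden ratio's reciprocal. A minimal sketch, where the tooth widths are hypothetical values for illustration, not the study's measurements:

```python
# Hypothetical apparent mesiodistal widths (mm) of maxillary teeth, viewed frontally.
central, lateral, canine = 8.5, 6.4, 4.2

GOLDEN = 0.618  # golden proportion: each tooth appears ~0.618x the width of its mesial neighbor

lateral_to_central = lateral / central  # compared against GOLDEN
canine_to_lateral = canine / lateral    # compared against GOLDEN

print(f"lateral/central = {lateral_to_central:.3f} (golden proportion: {GOLDEN})")
print(f"canine/lateral  = {canine_to_lateral:.3f} (golden proportion: {GOLDEN})")
```

Ratios above 0.618, as here, correspond to the "most proportions were higher than GP" finding reported in the results.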
Procedia PDF Downloads 146
29995 Financial Portfolio Optimization in Electricity Markets: Evaluation via Sharpe Ratio
Authors: F. Gökgöz, M. E. Atmaca
Abstract:
Electricity plays an indispensable role in human life and the economy. It is a unique product that must be balanced instantaneously: since electricity cannot be stored, generation and consumption must match at all times. Effective and efficient use of electricity is very important not only for society but also for the environment, and a competitive electricity market is one of the best ways to provide a suitable platform for it. On the other hand, such a market carries risks that should be carefully managed by the market players, and risk management is an essential part of their decision making. In this paper, risk management through diversification is applied in a case study with the help of Markowitz's mean-variance, downside-risk, and semi-variance methods. The performance of the optimal electricity sale solutions is measured and evaluated via the Sharpe ratio, and the optimal portfolio solutions are improved. Two years of historical weekday price data from the Turkish Day-Ahead Market are used to demonstrate the approach. Keywords: electricity market, portfolio optimization, risk management in electricity market, sharpe ratio
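The Sharpe ratio used for evaluation divides mean excess return by the standard deviation of returns, so the portfolio with the best risk-adjusted performance can be picked directly. A minimal sketch; the weekly returns below are hypothetical, not the Turkish Day-Ahead Market data:

```python
import statistics

def sharpe_ratio(returns, risk_free_rate=0.0):
    """Sharpe ratio: mean excess return divided by the standard deviation of excess returns."""
    excess = [r - risk_free_rate for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess)

# Hypothetical weekly returns (as fractions) for two candidate sale portfolios:
# A is modest but steady, B has higher mean return but far higher volatility.
portfolio_a = [0.02, 0.01, -0.005, 0.015, 0.01]
portfolio_b = [0.05, -0.04, 0.06, -0.03, 0.04]

# The steadier portfolio wins on a risk-adjusted basis.
best = max((portfolio_a, portfolio_b), key=sharpe_ratio)
```

This illustrates why Sharpe-ratio evaluation can prefer a lower-return, lower-variance sale strategy over a higher-return, volatile one.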
Procedia PDF Downloads 367
29994 Task Based Language Learning: A Paradigm Shift in ESL/EFL Teaching and Learning: A Case Study Based Approach
Authors: Zehra Sultan
Abstract:
The study is based on the task-based language teaching (TBLT) approach, which is found to be very effective in the EFL/ESL classroom. This approach engages learners in acquiring authentic language skills by interacting with the real world through a sequence of pedagogical tasks, and the use of technology enhances its effectiveness. This study throws light on the historical background of TBLT and its efficacy in the EFL/ESL classroom. In addition, it describes in detail the implementation of this approach in the General Foundation Programme (GFP) of Muscat College, Oman, and furnishes the list of pedagogical tasks embedded in the GFP language curriculum, which are aligned with the College Graduate Attributes. Moreover, the study discusses the challenges pertaining to this approach from the point of view of teachers, students, and classroom application. Additionally, the operational success of this methodology is gauged through the formative assessments of the GFP, as is apparent in the students' progress. Keywords: task-based language teaching, authentic language, communicative approach, real world activities, ESL/EFL activities
Procedia PDF Downloads 127
29993 Behind Fuzzy Regression Approach: An Exploration Study
Authors: Lavinia B. Dulla
Abstract:
This exploration study of the fuzzy regression approach seeks to show that fuzzy regression can be used as an alternative to classical regression. It likewise assesses the differences and characteristics of simple linear regression and fuzzy regression using the width of the prediction interval, the mean absolute deviation, and the variance of the residuals. Based on the simple linear regression model, the fuzzy regression approach is worth considering as an alternative to simple linear regression when the sample size is between 10 and 20. As the sample size increases, the fuzzy regression approach ceases to be applicable, since the large-sample assumptions are already operating within the framework of simple linear regression. Nonetheless, it can be suggested as a practical alternative when decisions often have to be made on the basis of small data. Keywords: fuzzy regression approach, minimum fuzziness criterion, interval regression, prediction interval
Procedia PDF Downloads 303
29992 An Essay on Origamic and Isomorphic Approach as Interface of Form in Architectural Basic Design Education
Authors: Gamze Atay, Altay Colak
Abstract:
It is a fact that today's technology shapes the change and development of architectural forms by creating different perspectives. This experimental study explores how these evolving architectural forms can be integrated into design education through traditional design tools. Observation of studio practice shows that students who have just started their architectural education have difficulty arriving at form. The main objective of this study has been to enable students to use and interpret different disciplines in the design process in order to improve their perception of form. In this sense, origami, defined as "the art of paper folding", and isomorphous (equally formed) approaches were used as methods with beginning design studio students in the process of three-dimensional thinking and form creation. These two methods were examined with students in three stages: analysis, creation, and outcome. As a result of the study, it was seen that using different disciplines as a method during form creation gave the students' designs originality, freedom, and dynamism. Keywords: architectural form, design education, isomorphic approach, origamic approach
Procedia PDF Downloads 153
29991 Fuzzy Approach for Fault Tree Analysis of Water Tube Boiler
Authors: Syed Ahzam Tariq, Atharva Modi
Abstract:
This paper presents a probabilistic analysis of the safety of water tube boilers using fault tree analysis (FTA). A fault tree has been constructed by considering all possible areas where a malfunction could lead to a boiler accident. Boiler accidents are relatively rare, causing a scarcity of data, so a fuzzy approach is employed to perform the quantitative analysis, wherein fuzzy logic is combined with expert elicitation to calculate failure probabilities. The resulting fuzzy fault tree analysis (FFTA) provides a systematic method to forecast and prevent accidents. Keywords: fault tree analysis, water tube boiler, fuzzy probability score, failure probability
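Once expert elicitation yields fuzzy failure probabilities for the basic events, propagating them through the tree's AND/OR gates is straightforward. A minimal sketch using triangular fuzzy numbers; the events, gate structure, and values below are hypothetical, not the paper's elicited boiler data:

```python
# Triangular fuzzy numbers (low, mode, high) represent uncertain failure probabilities.

def fuzzy_and(p, q):
    """AND gate: component-wise product of independent fuzzy probabilities."""
    return tuple(a * b for a, b in zip(p, q))

def fuzzy_or(p, q):
    """OR gate: 1 - (1-p)(1-q), component-wise."""
    return tuple(1 - (1 - a) * (1 - b) for a, b in zip(p, q))

# Hypothetical basic events for illustration.
low_water = (0.01, 0.02, 0.04)
pump_fail = (0.005, 0.01, 0.02)
alarm_fail = (0.02, 0.05, 0.10)

# Hypothetical top event: (low water OR pump failure) AND alarm failure.
top = fuzzy_and(fuzzy_or(low_water, pump_fail), alarm_fail)
```

The top event's triple can then be defuzzified (e.g. by taking the centroid) to obtain a single failure probability score.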
Procedia PDF Downloads 128
29990 DNA Double-Strand Break–Capturing Nuclear Envelope Tubules Drive DNA Repair
Authors: Mitra Shokrollahi, Mia Stanic, Anisha Hundal, Janet N. Y. Chan, Defne Urman, Chris A. Jordan, Anne Hakem, Roderic Espin, Jun Hao, Rehna Krishnan, Philipp G. Maass, Brendan C. Dickson, Manoor P. Hande, Miquel A. Pujana, Razqallah Hakem, Karim Mekhail
Abstract:
Current models suggest that DNA double-strand breaks (DSBs) can move to the nuclear periphery for repair, but it is unclear to what extent human DSBs display such repositioning. Here we show that the human nuclear envelope localizes to DSBs in a manner dependent on DNA damage response (DDR) kinases and on cytoplasmic microtubules acetylated by α-tubulin acetyltransferase-1 (ATAT1). These factors collaborate with the linker of nucleoskeleton and cytoskeleton (LINC) complex, the nuclear pore complex (NPC) protein NUP153, the nuclear lamina and the kinesins KIF5B and KIF13B to generate DSB-capturing nuclear envelope tubules (dsbNETs). dsbNETs are partly supported by nuclear actin filaments and the circadian factor PER1 and are reversed by the kinesin KIFC3. Although dsbNETs promote repair and survival, they are also co-opted during poly(ADP-ribose) polymerase (PARP) inhibition to restrain BRCA1-deficient breast cancer cells and are hyper-induced in cells expressing the aging-linked lamin A mutant progerin. In summary, our results advance understanding of nuclear structure-function relationships, uncover a nuclear-cytoplasmic DDR and identify dsbNETs as critical factors in genome organization and stability. Keywords: DNA damage response, genome stability, nuclear envelope, cancer, age-related disorders
Procedia PDF Downloads 20
29989 Sewer Culvert Installation Method to Accommodate Underground Construction in an Urban Area with Narrow Streets
Authors: Osamu Igawa, Hiroshi Kouchiwa, Yuji Ito
Abstract:
In recent years, a reconstruction project for sewer pipelines has been progressing in Japan with the aim of renewing old sewer culverts. However, it is difficult to secure a sufficient base area for shafts in urban areas, because many streets are narrow and have a complex layout; as a result, construction in such areas is generally very demanding. In urban areas, there is a strong requirement for a safe, reliable and economical construction method that does not disturb the public's daily life and urban activities. With this in mind, we developed a new construction method called the 'shield switching type micro-tunneling method', which integrates the micro-tunneling method and the shield method. In this method, the pipeline is first constructed for sections that are straight or gently curved using the economical micro-tunneling method; the process then switches to the shield method for sections with a sharp curve or a series of curves, without establishing an intermediate shaft. This paper provides the details, features and construction examples of this newly developed method. Keywords: micro-tunneling method, secondary lining applied RC segment, sharp curve, shield method, switching type
Procedia PDF Downloads 406
29988 Delay in Induction of Labour at Two Hospitals in Southeast Scotland: Outcomes
Authors: Bernard Ewuoso
Abstract:
Introduction: Induction of labor (IOL) usually involves the patient moving between the antenatal, labor, and postnatal wards. Delay in IOL has been defined as the time a woman waits for induction after her cervix is assessed to be favorable. Opinions vary on the acceptable time a patient may wait once the cervix is adjudged ripe for induction; a delay of up to 12 hours has been considered a benchmark, and there is evidence that delay in IOL is associated with adverse outcomes. Aim: To determine the number of women experiencing delay in induction of labor and their outcomes. Method: This audit was retrospective and observational. It included women who had induction of labor in October 2023 at two hospitals. Clinical data were collected from electronic medical records into an Excel sheet for analysis. Women had cervical ripening as inpatients or outpatients. The primary objective was to determine the number of women experiencing delay in induction of labor; the secondary objective was to determine the outcomes for these women. Result: 136 women had IOL. The least percentage of data retrieved for any parameter was 80%. The mean gestational age at IOL was 278.26 days, and the mean waiting time was 905.34 minutes. Seventy-five women had their IOL at the Royal Infirmary of Edinburgh (RIE), fifty-seven at St. John's Hospital (SJH), and three women were transferred from RIE to SJH. The preferred method of cervical ripening was balloon, closely followed by prostaglandin; twenty-seven women did not require cervical ripening and had their process started with amniotomy. Prostaglandin was the method of choice for cervical ripening at RIE, while balloon was preferred at SJH. Of the thirty-five women found suitable for outpatient cervical ripening, thirteen had outpatient ripening. There was a significant increase in the number of women undergoing outpatient cervical ripening at RIE, from 10.5% in April 2022 to 42.9%.
The preferred method for outpatient cervical ripening at RIE was balloon, while it was prostaglandin at SJH; these were the opposite of the preferred inpatient methods at the two centers. The average waiting time for IOL at RIE, 1166.92 minutes, was more than double that at SJH, 442.93 minutes, and far exceeded the proposed 12-hour benchmark. The waiting time tended to be shorter with prostaglandin. Of the women who had outpatient cervical ripening, 63.6% waited more than 12 hours before being induced, compared with 36.1% of the women who had inpatient cervical ripening. Overall, 38.5% of women waited more than 12 hours before their induction. A smaller proportion of the women who waited more than 12 hours had caesarean section, assisted vaginal delivery, or postpartum hemorrhage, whereas a greater proportion had spontaneous vaginal delivery and intrapartum or postpartum infection. Conclusion: A significant number of the women included in the study experienced delay in their induction process, and this was associated with an increased occurrence of intrapartum or postpartum infection. Outpatient cervical ripening contributed to the delay. Keywords: delay in induction of labor, inpatient, outpatient, intrapartum, postpartum, infection
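The delay metric reported above (waiting longer than the 12-hour benchmark) reduces to a simple threshold count over per-patient waiting times. A minimal sketch with hypothetical waiting times, not the audit's records:

```python
# Hypothetical waiting times (minutes) from favorable cervix to start of induction.
waiting_times = [300, 450, 720, 900, 1100, 1300, 250, 800, 1500, 600]

BENCHMARK = 12 * 60  # the 12-hour benchmark, in minutes

mean_wait = sum(waiting_times) / len(waiting_times)
delayed = [t for t in waiting_times if t > BENCHMARK]  # women counted as delayed
pct_delayed = 100 * len(delayed) / len(waiting_times)
```

Splitting `waiting_times` by ripening setting (inpatient vs outpatient) and by hospital gives the subgroup percentages the audit reports.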
Procedia PDF Downloads 24