Abstract:
Landmark recognition of urban structures is a challenging task. This paper presents a deep learning based approach for recognizing urban structures, by which we mean ordinary buildings such as offices, houses, and apartments. We identified this classification problem while developing a visual positioning system: visual landmark recognition of urban structures remained largely unexplored with deep learning approaches. We also found that, unlike famous landmarks such as the Eiffel Tower or the Statue of Liberty, urban structures are difficult for a neural network to learn, and their data requirements differ markedly from those of well-known landmarks around the globe. The entire landmark database was therefore created, augmented, labelled, and preprocessed in our lab. Transfer learning based approaches proved unsuitable, since the resulting models are so large that integrating one into a deployed web app, or even a local mobile app, would be impractical. By collecting and engineering our own data with a dedicated data collection app, we propose a neural architecture that can be configured to learn a variable number of urban landmarks while keeping its overall size under 60 MB. This was achieved with a Convolutional Neural Network based binary classification model trained on landmarks on our university campus. The model has been deployed as a web application and is currently being adapted into a mobile app.
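For orientation, below is a minimal sketch of what such a compact, trained-from-scratch CNN binary classifier could look like in Keras. The layer widths, input resolution, and training settings are our assumptions for illustration; the abstract does not specify the authors' exact configuration.

```python
# Hypothetical sketch of a compact CNN binary classifier for urban-landmark
# recognition, trained from scratch (no transfer learning) to keep the saved
# model small. All hyperparameters here are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_landmark_classifier(input_shape=(224, 224, 3)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.MaxPooling2D(),
        # Global pooling instead of Flatten keeps the dense head, and hence
        # the on-disk model size, small.
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        # Single sigmoid unit: landmark vs. not-landmark (binary classification).
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_landmark_classifier()
model.summary()  # parameter count stays far below a 60 MB saved-model budget
```

In this sketch, one such binary model per landmark (or a small multi-output head) would let the system be configured for a variable number of landmarks while remaining deployable in a web or mobile setting.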
Published in: 2020 3rd International Conference on Computing, Mathematics and Engineering Technologies (iCoMET)
Date of Conference: 29-30 January 2020
Date Added to IEEE Xplore: 23 April 2020